Semester page for FYS5429 - Spring 2025


Dear all, welcome back and we hope the week has started the best possible way.

First a practical note. Due to a meeting at the MN-faculty at UiO during our lecture time, the lecture this week will be posted as a recording instead (most likely before the actual lecture time). This means that there is no in-person or direct zoom session on April 3. We will email you all when the recording is ready. I am sorry for this. Next week we will have a regular lecture (in-person and direct zoom sessions, and the lecture will be recorded as usual).

The topic this week is a direct continuation from last week, that is generative models and energy models in particular. We will go through the math (and codes) for making a Boltzmann machine. We will also discuss Gibbs sampling and repeat some of the central topics from last week. The jupyter-notebook this week is somewhat long, but a good part of the initial material is material fro...
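As a small sketch of the Gibbs sampling step discussed above, here is block Gibbs sampling in a binary restricted Boltzmann machine written in NumPy. All sizes, weights and biases below are illustrative placeholders, not values from the course notebook.

```python
# Sketch: block Gibbs sampling in a binary restricted Boltzmann machine (RBM).
# Weights, biases and layer sizes are illustrative, not from the lectures.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # visible-hidden couplings
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block Gibbs sweep: sample h given v, then v given h."""
    p_h = sigmoid(b_h + v @ W)            # P(h_j = 1 | v)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(b_v + h @ W.T)          # P(v_i = 1 | h)
    return (rng.random(n_visible) < p_v).astype(float)

v = rng.integers(0, 2, size=n_visible).astype(float)
for _ in range(100):                      # run the chain for a while
    v = gibbs_step(v)
print(v)  # an (approximate) sample from the RBM's marginal over v
```

The alternating conditional sampling works because, in an RBM, the hidden units are conditionally independent given the visible units and vice versa.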

Apr. 1, 2025 9:49 AM

Dear all, welcome back to FYS5429/9429. We hope you've had a great weekend.

This week we start with generative models, after a short reminder on autoencoders and a discussion of how to implement these using Tensorflow/Keras or Pytorch. We will spend the rest of the semester on these methods (and if we get time, we will end with a discussion of Transformers/Reinforcement learning).

Our plan is to cover energy-based models, GANs, variational autoencoders and diffusion models. We will start with the basic math of these models, including Monte Carlo sampling. The plan for this week is thus as follows.

Plans for the week March 24-28

  1. Finalizing discussion on autoencoders and implementing Autoencoders with TensorFlow/Keras and PyTorch

  2. Overview of generative models

  3. Probability distributions and Markov Chain Monte Carlo simulati...
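To illustrate the Markov Chain Monte Carlo idea in point 3, here is a minimal Metropolis sampler for a one-dimensional standard normal distribution; the proposal width, chain length and burn-in are arbitrary choices for the sketch, not recommendations from the lectures.

```python
# Sketch: Metropolis sampling from an unnormalized density, here N(0, 1).
import numpy as np

rng = np.random.default_rng(42)

def log_prob(x):
    return -0.5 * x**2  # unnormalized log-density of the standard normal

x, samples = 0.0, []
for _ in range(20000):
    proposal = x + rng.normal(scale=1.0)     # symmetric random-walk proposal
    # accept with probability min(1, p(proposal)/p(x))
    if np.log(rng.random()) < log_prob(proposal) - log_prob(x):
        x = proposal
    samples.append(x)

samples = np.array(samples[5000:])           # discard burn-in
print(samples.mean(), samples.std())         # should be close to 0 and 1
```

Only ratios of the target density enter the acceptance test, which is why MCMC works with unnormalized distributions such as Boltzmann weights.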

Mar. 24, 2025 7:59 AM

Dear all, welcome back to FYS5429/9429.

We obviously hope that the week has started in the best possible way!

This week the plan is to continue our discussion of Autoencoders (AEs) and link them with the principal component analysis (PCA) method, a very useful and widely used method for reducing the dimensionality of a given data set (finding the relevant features).

We will show that a linear transformation in an AE corresponds to performing a PCA analysis. The lecture will be devoted to a discussion of the PCA and its link with AEs. We will also show code examples in Pytorch and TensorFlow on how to use AEs.
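As a small sketch of the PCA side of this link, here is PCA via the singular value decomposition on toy data; the data set and the choice k = 2 are illustrative. A linear AE with a k-dimensional bottleneck trained with squared error recovers the same k-dimensional principal subspace.

```python
# Sketch: PCA via SVD, giving the optimal rank-k reconstruction of the data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
X -= X.mean(axis=0)                    # PCA requires centered data

U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
components = Vt[:k]                    # top-k principal directions
Z = X @ components.T                   # "encode": project onto the subspace
X_hat = Z @ components                 # "decode": rank-k reconstruction

print(np.linalg.norm(X - X_hat))       # PCA minimizes this over rank-k maps
```

The encode/decode pair above is exactly the structure of a linear autoencoder with tied weights, which is what makes the correspondence plausible before we prove it.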

The jupyter-notebook this week, with codes and more is at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week9/ipyn...

Mar. 18, 2025 9:20 PM

Plans for the week of March 10-14

Dear all, we hope you've had an excellent start of the week, and a pleasant weekend.

This week, the first part of the lecture is an attempt at wrapping up our discussion of RNNs and introducing LSTMs (long short-term memory networks).

After that, the second part introduces a new method, the so-called Autoencoders. In this connection we also need to discuss a famous method for reducing the dimensionality of the problems at hand, the Principal Component Analysis, or just PCA. The reason for this is that we can show that a linear autoencoder corresponds to performing a PCA reduction.
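The correspondence can be checked numerically: a linear autoencoder (no nonlinearity, squared-error loss) trained by gradient descent approaches the PCA reconstruction for the same bottleneck size. Everything below (data, learning rate, iteration count, initialization) is an illustrative choice for the demo, not a prescription from the lectures.

```python
# Sketch: a linear autoencoder trained by gradient descent approaches
# the optimal rank-k (PCA) reconstruction of the data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4)) @ np.diag([3.0, 2.0, 0.5, 0.1])
X -= X.mean(axis=0)

k = 2
W_enc = rng.normal(scale=0.5, size=(4, k))   # encoder weights
W_dec = rng.normal(scale=0.5, size=(k, 4))   # decoder weights
lr = 0.01

for _ in range(5000):
    Z = X @ W_enc                    # encode
    X_hat = Z @ W_dec                # decode
    G = 2 * (X_hat - X) / len(X)     # gradient of the mean squared error
    W_dec -= lr * (Z.T @ G)
    W_enc -= lr * (X.T @ (G @ W_dec.T))

# compare with the PCA reconstruction error for the same k
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pca_err = np.linalg.norm(X - X @ Vt[:k].T @ Vt[:k])
ae_err = np.linalg.norm(X - X @ W_enc @ W_dec)
print(ae_err, pca_err)               # ae_err should approach pca_err
```

The trained decoder need not equal the principal directions themselves; it only spans the same subspace, since any invertible mixing inside the bottleneck leaves the reconstruction unchanged.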

The slides this week are somewhat dense, and most likely a good fraction of these will be used next week as well. I may add corrections during the coming days as well.

Plans for the week March 10-14

  1. RNNs and discussion of Long-Shor...

Mar. 10, 2025 12:07 PM

Dear all, welcome back to FYS5429/9429. Our plans this week are

  1. Reminder on basics of recurrent neural networks (RNNs) from last week

  2. In-depth discussion of the mathematics of RNNs, as this serves as input for developing our own code

  3. Writing our own codes for RNNs
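As a starting point for writing our own code, here is the forward pass of a vanilla RNN in NumPy; all dimensions and weight names are illustrative.

```python
# Sketch: forward pass of a vanilla RNN,
#   h_t = tanh(x_t W_xh + h_{t-1} W_hh + b_h)
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, seq_len = 3, 5, 7

W_xh = rng.normal(scale=0.1, size=(n_in, n_hidden))     # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden)) # hidden -> hidden
b_h = np.zeros(n_hidden)

def rnn_forward(xs):
    """Apply the recurrence to a sequence; returns all hidden states."""
    h = np.zeros(n_hidden)                # initial hidden state
    hs = []
    for x_t in xs:
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
        hs.append(h)
    return np.stack(hs)

xs = rng.normal(size=(seq_len, n_in))
hs = rnn_forward(xs)
print(hs.shape)  # (7, 5): one hidden state per time step
```

Note that the same weight matrices are reused at every time step; this weight sharing is what distinguishes an RNN from a deep feedforward network unrolled in time.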

Reading recommendations:

a. Goodfellow, Bengio and Courville's chapter 10 from Deep Learning, see https://www.deeplearningbook.org/

b. Sebastian Raschka et al., chapter 15, Machine Learning with PyTorch and Scikit-Learn, see https://sebastianraschka.com/blog/2022/ml-pytorch-book...

Mar. 4, 2025 11:10 PM

Dear all, welcome back to FYS5429/9429. We hope the week has started the best possible way for you all.

This week we start with recurrent neural networks, with a basic overview and their math. Note that the jupyter-notebook for this week will be updated with more material (basic math of RNNs) over the next two days.

We will continue with RNNs next week as well, and discuss how we can encode them, plus more on their math.

For those of you interested in, say, large language models, RNNs served as prototypes and could represent a full semester project, like the CNN coding project.

Plans for the week February 24-28, 2025

Intro to and mathematics of Recurrent Neural Networks (RNNs)

More material will be added (Tuesday and Wednesday this week).

Reading recommendations

For RNNs, see Goodfellow et al chapter 10, see https://www.deeplearningbook.org/contents/rnn.html. This book g...

Feb. 25, 2025 9:18 AM

Dear all, welcome back to FYS5429/9429. Here are our suggestions for the lecture on Thursday. In addition, we will send later an update on all projects defined by you. This should give you all an overview of the different groups and for those of you who have not yet made up your mind, feel free to reach out to the groups.

Otherwise, this week we plan to discuss how to implement CNNs using PyTorch, TensorFlow and our own codes. After that we will start scratching the surface of Recurrent Neural Networks (RNNs). Next week we discuss in more detail the math of RNNs, how to build our own RNN code and how to implement it using either TensorFlow or PyTorch. The lecture notes as a jupyter-notebook are at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week5/ipynb/week5.ipynb

So, this week the plans are as follows:

  1. Implementing Convolutional Neural Networks (CNNs), own codes, TensorFlow+Keras and PyTorch examples...
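As a sketch of the framework route mentioned in point 1, here is a minimal CNN for 28x28 grayscale (MNIST-sized) images in PyTorch; the layer sizes and channel counts are illustrative choices, not the ones used in the course notebook.

```python
# Sketch: a tiny CNN for MNIST-sized input in PyTorch (layer sizes illustrative).
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                            # -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1), # -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                            # -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # flatten all but the batch dim

model = TinyCNN()
x = torch.randn(4, 1, 28, 28)       # a batch of 4 fake images
logits = model(x)
print(logits.shape)                 # one score per class per image
```

Training then only needs a loss (e.g. `nn.CrossEntropyLoss`) and an optimizer over `model.parameters()`; the Keras version follows the same layer structure.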

Feb. 18, 2025 10:43 AM

Dear all and welcome back to FYS5429/9429.  

Last week we started scratching the surface of CNNs. This week we will go into more detail on the math of CNNs and hopefully also start discussing implementations, either writing our own code or using Pytorch/Tensorflow. You will find examples of codes applied to the MNIST data set and other sets in the jupyter-notebook for this week.
We will finalize this topic next week and then start with RNNs.
We recommend using the jupyter-notebook since there are several code examples there. The file is at 

https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week4/ipynb/week4.ipynb

The plan this week is to 

-Discuss the basics and mathematics of Convolutional Neural Networks with code examples.

- We will also (during the lecture and the exercise session) present a summary of the various project topics people have defined.
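The basic mathematical operation behind CNNs can be written out by hand in a few lines: a 2D cross-correlation (what deep-learning libraries call "convolution"), here single-channel, no padding, stride 1, with an illustrative toy image and filter.

```python
# Sketch: the "valid" 2D cross-correlation at the heart of a CNN layer.
import numpy as np

def conv2d(image, kernel):
    """out[i, j] = sum of the k x k window of image times the kernel."""
    H, W = image.shape
    k = kernel.shape[0]
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, 0.0],
                   [0.0, -1.0]])     # each pixel minus its lower-right neighbour
out = conv2d(image, kernel)
print(out)                           # a 3x3 map, every entry -5.0 for this image
```

A real convolutional layer repeats this over input and output channels and adds a bias; the nested loops are usually replaced by vectorized or im2col-style implementations for speed.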

Feb. 12, 2025 10:20 PM

Dear all, some of you will give a short presentation of possible projects. If more of you want to present, there is no problem adding you to the list. The program here is thus very preliminary. We start at 2:15pm, after the break which follows the lecture. Here's the preliminary list (some of the titles are very preliminary). If I have forgotten somebody, please let me know. The zoom link for those who cannot be here is https://uio.zoom.us/my/mortenhj

Ellen Beate Tysvær, Build an LLM from scratch

Bjørn Egil Ludvigsen, Physics-Informed Neural Operators

Simon Elias Schrader, Learning to optimize: Solving the Schrödinger equation faster

Kristian Holme, Deep Reinforcement Learning for active flow control of flu...

Feb. 6, 2025 9:27 AM

Dear all and welcome back to FYS5429/FYS9429. We hope the week has started the best possible way.

The aim this week is to wrap up the review of neural networks we started last week, with a discussion of codes. 

Thereafter we start with CNNs and discuss their mathematics and how to implement CNNs ourselves, or using Tensorflow/PyTorch. We will continue with CNNs next week as well before we move over to RNNs. 

Also, this week we will have presentations of possible project ideas from you (approx. 5 min, 1-3 slides) towards the end of the second lecture and during the first hour of the lab session. Please send me a final confirmation by the end of Wednesday (Feb 5) if you wish to present and I will set up a schedule for tomorrow. And if you have a short title (and perhaps some slides), even better.

You will also find examples of several projects from last year, see ...

Feb. 5, 2025 8:05 AM

Dear all, we hope this week has started the best possible way. Here are our plans for the lecture on January 30.

The aim this week is to give a review of the basics of neural networks. Many of you have seen similar material before, but we think it is useful to repeat some of the basics, as neural networks are essential parts of most algorithms we describe later, whether these are CNNs, RNNs, autoencoders or other methods we will discuss.
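As a reminder of those basics, here is the forward pass of a two-layer feedforward network in NumPy; the layer sizes and the choice of sigmoid activation are illustrative, not prescriptions from the lectures.

```python
# Sketch: forward pass of a small dense network, a = sigmoid(x W + b) per layer.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# weights and biases for a 3 -> 4 -> 2 network
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)

def forward(x):
    a1 = sigmoid(x @ W1 + b1)       # hidden-layer activations
    return sigmoid(a1 @ W2 + b2)    # output-layer activations

x = rng.normal(size=(5, 3))         # batch of 5 inputs
y = forward(x)
print(y.shape)                      # (5, 2), each entry in (0, 1)
```

Training adds a loss function and backpropagation of its gradient through these same matrix products; that machinery carries over essentially unchanged to the CNNs and RNNs discussed later.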

We will also present and discuss different project variants. Next week we will also try to have presentations from those of you who have defined your own specific projects.

Note also that I have changed the zoom link to my UiO account. Our permanent zoom link for the rest of the semester is

https://uio.zoom.us/my/mortenhj

 

The jupyter-notebook  with the material for this week (with code examples) is at ...

Jan. 29, 2025 11:14 PM

Our zoom link for the rest of the semester is https://uio.zoom.us/my/mortenhj

Jan. 28, 2025 9:41 PM

Dear all, welcome to FYS5429/9429. Our first lecture is Thursday January 23 from 12:15pm to 2pm. We have also set aside an optional lab session from 2pm to 4pm on Thursdays. Our lecture room is FØ434 at the Department of Physics. For those who cannot attend in person, the zoom link is https://msu.zoom.us/j/99649445421 (Meeting ID: 996 4944 5421), and all lectures will be recorded.

The emphasis is on deep learning algorithms, starting with the mathematics of neural networks (NNs), moving on to convolutional NNs (CNNs), recurrent NNs (RNNs), autoencoders, graph neural networks and other dimensionality reduction methods, to finally discuss generative methods. These will include Boltzmann machines, variational autoencoders, generative adversarial networks, diffusion methods and others. See the course GitHub link for more information, weekly plans and more http...

Jan. 22, 2025 12:16 PM

Dear all, welcome to FYS5429/9429. Our first lecture is Thursday January 23 from 12:15pm to 2pm. We have also set aside an optional lab session from 2pm to 3pm on Thursdays. Our lecture room is FØ434 at the Department of Physics.

The emphasis is on deep learning algorithms, starting with the mathematics of neural networks (NNs), moving on to convolutional NNs (CNNs), recurrent NNs (RNNs), autoencoders, graph neural networks and other dimensionality reduction methods, to finally discuss generative methods. These will include Boltzmann machines, variational autoencoders, generative adversarial networks, diffusion methods and others. See the course GitHub link for more information, weekly plans and more https://github.com/CompPhysics/AdvancedMachineLearning/tree/main. There you will also find a tentative weekly plan with lecture notes and reading suggestions. You will also find a link to the textbooks (with codes and more) that we will follow.

...
Dec. 5, 2024 10:11 AM