Plans for the week of January 26-30
Dear all,
We hope your week has started well. Below we outline the plans for the lecture on January 29.
The main objective of this week’s lecture is to review the fundamentals of neural networks and to connect these concepts to the numerical solution of differential equations, with a particular focus on Physics-Informed Neural Networks (PINNs). While many of you may already be familiar with parts of this material, we believe it is valuable to revisit the core ideas, as neural networks form the backbone of many of the methods we will cover later in the course, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), autoencoders, and related architectures.
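Since PINNs are the focus of this week's lecture, here is a minimal, self-contained sketch of the core idea for the toy ODE u'(t) = -u(t) with u(0) = 1 (exact solution exp(-t)): a tiny tanh network is trained to minimize the physics residual plus an initial-condition penalty. The network size, collocation points, and the finite-difference gradient descent are illustrative choices only, not the notebook's implementation; a real PINN code would use automatic differentiation (e.g. autograd or PyTorch).

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer network u(t; theta) with tanh activation.
# theta packs [W1 (H,), b1 (H,), W2 (H,), b2 (scalar)].
H = 8
theta = rng.normal(scale=0.5, size=3 * H + 1)

def unpack(theta):
    return theta[:H], theta[H:2 * H], theta[2 * H:3 * H], theta[-1]

def u(t, theta):
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(np.outer(t, W1) + b1) @ W2 + b2      # shape (N,)

def du_dt(t, theta):
    # Analytic derivative of the network output with respect to its input t:
    # d/dt tanh(W1 t + b1) = (1 - tanh^2) * W1.
    W1, b1, W2, b2 = unpack(theta)
    s = np.tanh(np.outer(t, W1) + b1)
    return (1.0 - s**2) @ (W1 * W2)

def pinn_loss(theta, t):
    residual = du_dt(t, theta) + u(t, theta)            # enforce u' = -u
    ic = u(np.array([0.0]), theta)[0] - 1.0             # enforce u(0) = 1
    return np.mean(residual**2) + ic**2

# Train with plain finite-difference gradient descent (for illustration only;
# automatic differentiation is the standard choice here).
t_col = np.linspace(0.0, 1.0, 20)                       # collocation points
eps, lr = 1e-6, 0.1
for _ in range(2000):
    grad = np.empty_like(theta)
    for i in range(theta.size):
        tp = theta.copy(); tp[i] += eps
        tm = theta.copy(); tm[i] -= eps
        grad[i] = (pinn_loss(tp, t_col) - pinn_loss(tm, t_col)) / (2 * eps)
    theta -= lr * grad

print(pinn_loss(theta, t_col))                          # should be small
print(u(np.array([1.0]), theta)[0], np.exp(-1.0))       # network vs exact
```

The key PINN ingredient is that the differential equation itself supplies the loss function at the collocation points; labeled data is only needed for the initial or boundary conditions.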
During the lab session, we will also introduce and discuss different project variants. If you already have a preliminary idea for a project, you are encouraged to prepare 3–5 slides outlining your proposed topic. These can be uploaded to the course GitHub repository, where we have created a dedicated folder for project proposals at https://github.com/CompPhysics/AdvancedMachineLearning/tree/main/doc/Projects/ProjectProposals/2026
We will hold similar project presentation sessions next week as well. Presenting a project proposal is an excellent opportunity to receive feedback and to connect with potential project partners.
The Jupyter notebook with the material for this week (with code examples) is at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week2/ipynb/week2.ipynb
Note that additions may be made.
The plan for tomorrow is as follows:
Mathematics of neural networks with emphasis on PINNs
Writing your own code (bring your neural network code back to life if you have one)
Discussion of project alternatives at the lab session
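For those reviving their own neural network code, a minimal starting point could look like the sketch below: a two-layer network trained on XOR with hand-written backpropagation. The layer sizes, learning rate, and loss are illustrative choices, not a prescribed solution.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR inputs and targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# One hidden layer with tanh, sigmoid output.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass
    a1 = np.tanh(X @ W1 + b1)          # hidden activations, (4, 8)
    out = sigmoid(a1 @ W2 + b2)        # predictions, (4, 1)

    # Backward pass; with a sigmoid output and cross-entropy loss,
    # the output-layer error simplifies to (out - y).
    d_out = out - y
    dW2 = a1.T @ d_out
    db2 = d_out.sum(axis=0)
    d_a1 = (d_out @ W2.T) * (1.0 - a1**2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_a1
    db1 = d_a1.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(out.round().ravel())   # learned XOR truth table
```

Once the forward and backward passes match (e.g. checked against finite-difference gradients), the same skeleton extends naturally to the PINN losses discussed in the lecture.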
Videos on Neural Networks
Video on Neural Networks at
https://www.youtube.com/watch?v=CqOfi41LfDw
Video on the back propagation algorithm at
https://www.youtube.com/watch?v=Ilg3gGewQ5U
Mathematics of deep learning
Two recent books online.
The Modern Mathematics of Deep Learning, by Julius Berner, Philipp Grohs, Gitta Kutyniok, and Philipp Petersen, published in Mathematical Aspects of Deep Learning, pp. 1-111, Cambridge University Press, 2022
Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory, by Arnulf Jentzen, Benno Kuckuck, and Philippe von Wurstemberger
Reminder on books with hands-on material and codes
Sebastian Raschka et al., Machine Learning with PyTorch and Scikit-Learn
David Foster, Generative Deep Learning with TensorFlow
Bali and Gavras, Generative AI with Python and TensorFlow 2
All three books have GitHub repositories from which one can download all the codes. We will borrow most of the material from these three texts, as well as from Goodfellow, Bengio, and Courville's text Deep Learning.
Reading recommendations
Raschka et al., chapter 11; the Jupyter notebook is sent separately, from GitHub.
Goodfellow et al., chapters 6 and 7, contain most of the neural network background.
Best wishes,
Morten, Oda and Ruben