Plans for week 43, October 21-25

Dear all, note that this coming Monday there is no in-person lecture (or live Zoom session) since Morten is away at a conference. A video of the lecture will, however, be uploaded this coming Sunday. The material is covered by the lecture notes, see for example https://github.com/CompPhysics/MachineLearning/blob/master/doc/pub/week43/ipynb/week43.ipynb

The material for this week deals with how we can set up a final code for neural networks, either by writing our own code or by using popular software libraries like TensorFlow and/or PyTorch. Hopefully it can serve as a source of inspiration for project 2.

The plan for this week is as follows (with literature suggestions):

Material for the lecture on Monday, October 21, 2024.

  • Building our own feed-forward neural network, with an introduction to TensorFlow (see the first sketch after this list)

  • Solving differential equations with neural networks (see the second sketch after this list)
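
As a taste of the first item, here is a minimal sketch of a feed-forward network written with the Keras API that ships with TensorFlow 2; the synthetic data, layer sizes, and training settings are illustrative assumptions, not prescriptions from the lecture notes.

    import numpy as np
    from tensorflow import keras

    # Synthetic binary-classification data: 200 samples with 4 features each
    # (illustrative toy data, not from the course material)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4)).astype("float32")
    y = (X.sum(axis=1) > 0).astype("float32")

    # Feed-forward network: two hidden ReLU layers and a sigmoid output
    model = keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(8, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=20, batch_size=32, verbose=0)
    print(model.evaluate(X, y, verbose=0))   # [loss, accuracy] on the training data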
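For the second item, one common approach is to build a trial solution that satisfies the initial condition by construction and train the network to minimize the squared residual of the differential equation. The sketch below solves u'(x) = -u(x) with u(0) = 1, whose exact solution is exp(-x); the network size, collocation points, and learning rate are illustrative assumptions.

    import autograd.numpy as np            # numpy wrapped for automatic differentiation
    from autograd import grad, elementwise_grad

    def neural_net(params, x):
        # One hidden tanh layer; x has shape (n_points,)
        W1, b1, W2, b2 = params
        h = np.tanh(x[:, None] * W1 + b1)  # hidden layer, shape (n_points, n_hidden)
        return np.dot(h, W2) + b2          # one scalar output per input point

    def trial(params, x):
        # u(x) = 1 + x*N(x) satisfies u(0) = 1 for any network parameters
        return 1.0 + x * neural_net(params, x)

    du_dx = elementwise_grad(trial, 1)     # du/dx at each collocation point

    def cost(params, x):
        # Mean squared residual of the ODE u'(x) + u(x) = 0
        return np.mean((du_dx(params, x) + trial(params, x))**2)

    cost_grad = grad(cost, 0)              # gradient of the cost w.r.t. the parameters

    rng = np.random.RandomState(0)
    n_hidden = 10
    params = [0.1 * rng.randn(n_hidden), 0.1 * rng.randn(n_hidden),
              0.1 * rng.randn(n_hidden), 0.1 * rng.randn(1)]
    x = np.linspace(0.0, 1.0, 20)          # collocation points

    lr = 0.05
    for _ in range(2000):                  # plain gradient descent
        params = [p - lr * g for p, g in zip(params, cost_grad(params, x))]

    print(np.max(np.abs(trial(params, x) - np.exp(-x))))   # error vs exact solution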

Exercises and lab session week 43

Lab sessions on Tuesday and Wednesday.

  • Exercise on writing your own neural network code (see the sketch after this list)

  • The exercises this week will be continued next week as well

  • Discussion of project 2
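
For the exercise, here is a minimal sketch of the kind of from-scratch network you could start from: one hidden sigmoid layer trained with plain batch gradient descent. The toy problem, layer sizes, and learning rate are illustrative assumptions only.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy problem: predict the sign of the product of two features (XOR-like)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

    n_hidden = 8
    W1 = 0.1 * rng.normal(size=(2, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = 0.1 * rng.normal(size=(n_hidden, 1)); b2 = np.zeros(1)

    lr = 0.5
    for _ in range(5000):
        # Forward pass
        a1 = sigmoid(X @ W1 + b1)                 # hidden activations
        a2 = sigmoid(a1 @ W2 + b2)                # output probabilities

        # Backward pass: cross-entropy loss with a sigmoid output
        # gives the simple output error delta2 = a2 - y
        delta2 = (a2 - y) / len(X)
        delta1 = (delta2 @ W2.T) * a1 * (1 - a1)  # chain rule through the hidden layer

        # Gradient-descent updates
        W2 -= lr * (a1.T @ delta2); b2 -= lr * delta2.sum(axis=0)
        W1 -= lr * (X.T @ delta1);  b1 -= lr * delta1.sum(axis=0)

    print("training accuracy:", np.mean((a2 > 0.5) == y))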

Mathematics of deep learning

Two recent books online.

  1. The Modern Mathematics of Deep Learning, by Julius Berner, Philipp Grohs, Gitta Kutyniok, and Philipp Petersen, https://arxiv.org/abs/2105.04026; published in Mathematical Aspects of Deep Learning, pp. 1-111, Cambridge University Press, 2022.

  2. Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory, by Arnulf Jentzen, Benno Kuckuck, and Philippe von Wurstemberger, https://doi.org/10.48550/arXiv.2310.20360.

Reminder on books with hands-on material and codes

Reading recommendations

  1. Raschka et al., chapter 11; the Jupyter notebook is sent separately and is available from the GitHub site at https://github.com/rasbt/machine-learning-book. See also chapters 12 and 13 on using PyTorch to build a neural network code.

  2. Goodfellow et al., chapters 6 and 7 contain most of the neural network background.

Using automatic differentiation

In our discussions of ordinary differential equations and neural network codes we will also study the use of Autograd for computing gradients in deep learning; see for example https://www.youtube.com/watch?v=fRf4l5qaX1M&ab_channel=AlexSmola. For documentation of Autograd and examples, see the lecture slides from week 39 and the Autograd documentation at https://github.com/HIPS/autograd.
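
As a small illustration of what Autograd usage looks like, assuming the HIPS autograd package is installed (pip install autograd); the function being differentiated is just an example.

    import autograd.numpy as np   # thinly wrapped numpy that records operations
    from autograd import grad

    def f(x):
        return np.sin(x) * np.exp(-x**2)

    df = grad(f)                  # df(x) returns f'(x), built automatically
    print(df(0.5))
    # Analytic check: f'(x) = exp(-x**2) * (cos(x) - 2*x*sin(x))
    print(np.exp(-0.25) * (np.cos(0.5) - 2 * 0.5 * np.sin(0.5)))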

Backpropagation and automatic differentiation

For more details on the backpropagation algorithm and automatic differentiation, see

  1. Automatic Differentiation in Machine Learning: a Survey, by Baydin et al., https://www.jmlr.org/papers/volume18/17-468/17-468.pdf

  2. https://deepimaging.github.io/lectures/lecture_11_Backpropagation.pdf

  3. Slides 12-44 at http://cs231n.stanford.edu/slides/2017/cs231n_2017_lecture4.pdf

Best wishes to you all,

Fahimeh, Ida, Karl Henrik, Mia, Morten, Odin, and Sigurd
