Weekly plans for week 39
Dear all, we hope this week too has started in the best possible way.
Here follows the weekly digest, but first a change to the deadline for project 1.
We have changed the deadline to Monday October 11 at midnight. It suffices to upload a link to the GitHub or GitLab repository where you have your report and codes. Please also add a README file to the repository so we can easily find the relevant files. It is normally convenient to have, for example, three folders: one for the codes you have developed, one for the report itself, and perhaps another for different test runs (selected examples) which we can reproduce when we run your codes.
The report can be a doc file (Office or OpenOffice), a PDF file or a Jupyter notebook.
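As a purely illustrative layout (the folder names below are just an example, not a requirement), the repository could be organized as

  README.md    <- short guide pointing us to the relevant files
  Code/        <- the programs you have developed
  Report/      <- the report itself (doc, PDF or notebook)
  Results/     <- selected test runs we can reproduce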
Otherwise, on Thursday this week we will wrap up our discussion of logistic regression from last week (with selected examples). Thereafter we discuss how we can use various gradient methods to find the optimal parameters of a given model. We will focus on gradient descent and the family of stochastic gradient descent methods.
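To give a flavor of what is coming, here is a minimal sketch of plain gradient descent for ordinary least squares (our own toy example, not the lecture code); the learning rate eta, the number of iterations and the synthetic data are illustrative choices only.

  import numpy as np

  # toy data: y = 4 + 3x + noise
  n = 100
  x = 2*np.random.rand(n, 1)
  y = 4 + 3*x + np.random.randn(n, 1)
  X = np.c_[np.ones((n, 1)), x]            # design matrix with intercept column

  beta = np.random.randn(2, 1)             # random starting point
  eta = 0.1                                # learning rate (illustrative value)
  for _ in range(1000):
      gradient = (2.0/n)*X.T @ (X @ beta - y)   # gradient of the OLS cost
      beta -= eta*gradient

  print(beta)                              # should end up close to [4, 3]

Stochastic gradient descent replaces the full gradient above with one computed on a randomly drawn minibatch at each step; that family of methods is what we will discuss in the lectures.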
The plan for this week is thus:
Lab Wednesday as usual with digital labs 8-10 and 14-16
Lecture Thursday: Logistic Regression with examples and Gradient Optimization methods
Lecture Friday: Gradient methods
Reading recommendations:
See lecture notes for week 39 at https://compphysics.github.io/MachineLearning/doc/web/course.html.
For a good discussion of gradient methods, see Goodfellow et al., Sections 4.3-4.5 and Chapter 8. We will come back to the latter chapter in our discussion of neural networks next week.
We will also discuss some practicalities concerning project 1, and how to use functionality like grid search in Scikit-Learn in order to find the optimal regularization parameters.
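As a small, hedged illustration of the mechanics (the data and the parameter grid below are made up for this example), Scikit-Learn's GridSearchCV can scan a set of Ridge regularization strengths with cross-validation:

  import numpy as np
  from sklearn.linear_model import Ridge
  from sklearn.model_selection import GridSearchCV

  # toy data just to show the mechanics
  rng = np.random.default_rng(2021)
  X = rng.standard_normal((100, 5))
  y = X @ np.array([1.0, 0.5, 0.0, -0.5, 2.0]) + 0.1*rng.standard_normal(100)

  # grid of regularization parameters (values are illustrative only)
  param_grid = {"alpha": np.logspace(-4, 2, 7)}
  search = GridSearchCV(Ridge(), param_grid, cv=5, scoring="neg_mean_squared_error")
  search.fit(X, y)
  print(search.best_params_)   # the alpha with the best cross-validated MSE

The same machinery works for Lasso or any other estimator with a regularization parameter.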
Best wishes to you all,
Morten et al