Messages

Published Oct 5, 2025 20:29

Dear all, first of all, thanks so much for your heroic efforts with project 1. We are truly impressed by what you have been doing. Keep up the good work, and best wishes to you all with the finalization of project 1.

This week we start discussing how to actually develop a neural network code, which will be the topic of the second project. We will make the project available next week and discuss it in more detail during the lectures and the lab sessions. This week we plan to start with a simpler set of exercises where you implement the feed-forward part of a neural network code. These exercises can then in turn be used as a basis for the code in project 2.
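
As a taste of what the exercises will ask for, here is a minimal sketch of a feed-forward pass with one hidden layer. It is purely illustrative: the layer sizes, the sigmoid activation and all variable names are our own choices, not the exercise specification.

    import numpy as np

    rng = np.random.default_rng(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer: input -> hidden (sigmoid) -> linear output
    n_inputs, n_hidden, n_outputs = 4, 8, 1

    # Small random weights and zero biases, a simple common initialization
    W1 = 0.1 * rng.standard_normal((n_inputs, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = 0.1 * rng.standard_normal((n_hidden, n_outputs))
    b2 = np.zeros(n_outputs)

    def feed_forward(X):
        # Propagate a batch of inputs through the network
        a1 = sigmoid(X @ W1 + b1)   # hidden-layer activations
        return a1 @ W2 + b2         # linear output layer

    X = rng.standard_normal((5, n_inputs))  # a batch of 5 toy samples
    print(feed_forward(X).shape)            # -> (5, 1)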

The plans this week are (see also links to various videos):

Material for the lecture on Monday October 6, 2025

  1. Neural Networks, setting up the basic steps, from the simple perc...

Published Sep 29, 2025 07:46

Dear all, we hope you've had a great weekend.

Here are the updates and plans for this and the coming week.

Today we will continue the discussion of logistic regression that we started last week, with coding examples as well. We will repeat some of the essential elements and derivations. Logistic regression will serve as our stepping stone towards neural networks and deep learning methods. Next week we will devote our time to setting up a neural network code, and we will also introduce automatic differentiation, which allows us to compute gradients and derivatives of different cost functions without having to code the derivative expressions by hand.
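
To fix ideas before the lecture, here is a minimal sketch of logistic regression trained with plain gradient descent; the toy data, learning rate and variable names are our illustrative choices, not the lecture's own example.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy binary-classification data with labels in {0, 1}
    n, p = 200, 2
    X = rng.standard_normal((n, p))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)

    beta = np.zeros(p)
    eta = 0.1  # learning rate

    for _ in range(1000):
        prob = sigmoid(X @ beta)
        grad = X.T @ (prob - y) / n  # gradient of the averaged cross-entropy cost
        beta -= eta * grad

    accuracy = np.mean((sigmoid(X @ beta) > 0.5) == y)
    print("training accuracy:", accuracy)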

The plans for this week, with some video recommendations, are:

Lecture Monday September 29, 2025

  1. Logistic regression and gradient descent, examples on how to code
    ...
Published Sep 22, 2025 06:56

Dear all, welcome back to FYS-STK and a new week. We hope you had a great weekend.

Here are the plans for this week.

For the lecture on Monday the 22nd, the plans are:

  1. Resampling techniques: bootstrap, cross-validation and the bias-variance tradeoff (see the sketch after this list)

  2. Logistic regression, our first classification encounter and a stepping stone towards neural networks
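
As a concrete illustration of cross-validation, here is a sketch under our own assumptions (scikit-learn and toy data, not code from the course material), comparing polynomial degrees by 5-fold cross-validated mean squared error:

    import numpy as np
    from sklearn.model_selection import KFold, cross_val_score
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2025)

    # Toy one-dimensional regression data
    x = np.sort(rng.uniform(-1, 1, 100)).reshape(-1, 1)
    y = np.cos(3 * x[:, 0]) + 0.1 * rng.standard_normal(100)

    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    for degree in (1, 3, 5, 10):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        scores = cross_val_score(model, x, y, cv=kf,
                                 scoring="neg_mean_squared_error")
        print(f"degree {degree:2d}: CV MSE = {-scores.mean():.4f}")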

Readings and Videos, resampling methods

  1. Raschka et al, pages 175-192

  2. Hastie et al., Chapter 7; we recommend sections 7.1-7.5, 7.10 (cross-validation) and 7.11 (bootstrap). See https://link.springer.com/book/10.1007/978-0-387-84858-7.

  3. Video on bi...

Published Sep 18, 2025 06:33

Dear all, the video from one of the lab sessions, where we discuss and derive the bias-variance tradeoff, is available at 

https://youtu.be/GBWc1abChKo

It may be of relevance for this week's exercises and, of course, for project 1. Furthermore, the whiteboard notes for this week have been updated; you will find the derivation of the bias-variance tradeoff equations there as well, see https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2025/FYSSTKweek38.pdf
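
For reference, writing the data as y = f(x) + ε with zero-mean noise ε of variance σ², and denoting the model prediction by ỹ, the decomposition derived there takes the standard form

    \mathbb{E}\left[(y-\tilde{y})^2\right]
      = \underbrace{\left(f - \mathbb{E}[\tilde{y}]\right)^2}_{\text{bias}^2}
      + \underbrace{\mathbb{E}\left[(\tilde{y} - \mathbb{E}[\tilde{y}])^2\right]}_{\text{variance}}
      + \sigma^2 .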

 

Finally, there was a typo in exercise 4a (this has now been corrected). There should only be a single target, as we should not resample targets.

These lines:

predictions = n...
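
The snippet is cut off above. Purely as a hypothetical illustration of the corrected idea (the data and variable names below are ours, not the exercise's): only the training pairs are resampled in each bootstrap round, while the single test target stays fixed.

    import numpy as np
    from sklearn.utils import resample

    rng = np.random.default_rng(2025)

    # Toy one-dimensional data, a stand-in for the exercise data
    n = 100
    x = np.linspace(-1, 1, n)
    y = 3 * x**2 + 0.1 * rng.standard_normal(n)
    X = np.column_stack([np.ones(n), x, x**2])  # simple polynomial design matrix

    X_train, X_test = X[: n // 2], X[n // 2 :]
    y_train, y_test = y[: n // 2], y[n // 2 :]  # y_test is the single, fixed target

    n_bootstraps = 200
    predictions = np.empty((n_bootstraps, len(y_test)))

    for b in range(n_bootstraps):
        # Resample training rows and their targets together; never touch y_test
        Xb, yb = resample(X_train, y_train, random_state=b)
        beta = np.linalg.pinv(Xb) @ yb  # OLS fit on the bootstrap sample
        predictions[b] = X_test @ beta

    # Bias-variance estimates against the one fixed target vector
    error = np.mean((predictions - y_test) ** 2)
    bias2 = np.mean((np.mean(predictions, axis=0) - y_test) ** 2)
    variance = np.mean(np.var(predictions, axis=0))
    print(error, bias2 + variance)  # these two agree by construction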

Published Sep 14, 2025 09:16

Dear all, welcome back to an exciting new week. We hope you have all enjoyed, and are still enjoying, the weekend.

The plans this week are to start with a discussion of a statistical interpretation of OLS, Ridge and Lasso (to be continued next week as well). 

This is relevant for this week's exercises and the final part of project 1. We will also discuss the so-called bias-variance tradeoff and resampling methods like cross-validation and the bootstrap. The videos listed below may also be helpful; take a look at them before the lecture if you can. I am particularly fond of Josh Starmer's StatQuest videos, see https://statquest.org/

Material for the lecture on Monday September 15.

  1. Statistical interpretation of OLS regression and other statistical properties

  2. Resampling techniques: bootstrap, cross-validation and the bias-variance tradeoff (this ma...

Published Sep 7, 2025 13:32

Dear all, welcome back to FYS-STK3155/4155 and an exciting new week! Our plans this week are to discuss the family of gradient descent methods which we need to implement in the project (parts c-e, including Lasso regression); see the sketch after the list below.

  1. Plain gradient descent (constant learning rate), a reminder from last week with examples using OLS and Ridge

  2. Improving gradient descent with momentum

  3. Introducing stochastic gradient descent

  4. More advanced updates of the learning rate: AdaGrad, RMSprop and Adam
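
Here is a minimal sketch of a few of these updates on a toy OLS problem: plain gradient descent and momentum in one routine (momentum=0 gives plain gradient descent), and stochastic gradient descent with the Adam update. All hyperparameters are our illustrative choices, not the project's required settings.

    import numpy as np

    rng = np.random.default_rng(42)

    # Toy OLS problem, purely illustrative
    n, p = 200, 3
    X = rng.standard_normal((n, p))
    beta_true = np.array([1.0, -2.0, 0.5])
    y = X @ beta_true + 0.1 * rng.standard_normal(n)

    def gradient(X, y, beta):
        # Gradient of the cost (1/n) * ||y - X beta||^2
        return -2.0 / len(y) * X.T @ (y - X @ beta)

    def gd_momentum(X, y, eta=0.1, momentum=0.9, n_iter=500):
        beta = np.zeros(X.shape[1])
        v = np.zeros_like(beta)
        for _ in range(n_iter):
            v = momentum * v - eta * gradient(X, y, beta)
            beta = beta + v
        return beta

    def sgd_adam(X, y, eta=0.01, b1=0.9, b2=0.999, eps=1e-8,
                 n_epochs=50, batch_size=20):
        beta = np.zeros(X.shape[1])
        m = np.zeros_like(beta)  # running first moment of the gradient
        s = np.zeros_like(beta)  # running second moment of the gradient
        t = 0
        for _ in range(n_epochs):
            for idx in np.array_split(rng.permutation(len(y)),
                                      len(y) // batch_size):
                t += 1
                g = gradient(X[idx], y[idx], beta)
                m = b1 * m + (1 - b1) * g
                s = b2 * s + (1 - b2) * g**2
                m_hat = m / (1 - b1**t)  # bias-corrected moments
                s_hat = s / (1 - b2**t)
                beta = beta - eta * m_hat / (np.sqrt(s_hat) + eps)
        return beta

    print(gd_momentum(X, y))  # both should approach beta_true
    print(sgd_adam(X, y))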

Readings and Videos:

  1. Recommended: The textbook Goodfellow et al, Deep Learning, contains a good introduction to gradient descent, see sections 4.3-4.5 at https://www.deeplearningbook.org/contents/nume...

Published Sep 2, 2025 06:49

Dear all, project 1 (first version) is available from the folder https://github.com/CompPhysics/MachineLearning/tree/master/doc/Projects/2025/Project1 (either as PDF+LaTeX or as a Jupyter notebook) and from the Jupyter Book site https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/project1.html

We will discuss the project during the coming lab sessions.

Do not hesitate to ask in case you are in doubt.

Best wishes to you all.

Published Aug 31, 2025 21:37

Dear all, welcome back to a new week. We hope you've had a nice and relaxing weekend. 

Here are the plans for the coming week. 

We will discuss in more detail the mathematics of ordinary least squares, Ridge regression and Lasso regression, introduced at the end of the lecture last week. This will be the material for the first lecture on Monday. Thereafter (second lecture) we will start discussing the numerical solution of the optimization problem using gradient methods, or what are normally called gradient descent methods.
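
For reference, the standard closed-form solutions (with λ ≥ 0 the Ridge penalty, and assuming the matrix inverses exist) are

    \hat{\beta}_{\mathrm{OLS}} = \left(X^{\top}X\right)^{-1}X^{\top}y,
    \qquad
    \hat{\beta}_{\mathrm{Ridge}} = \left(X^{\top}X + \lambda I\right)^{-1}X^{\top}y .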

 

Ordinary Least Squares and Ridge regression are methods where we have an analytical solution for the optimal parameters. For Lasso regression we need to solve the equations numerically. This is the standard situation in essentially all machine learning methods.  Introducing gradient methods, we will thus also introduce the recipe for solving the equation for the Lasso regression...

Published Aug 25, 2025 07:02

Dear all, welcome back to FYS-STK3155/4155. We hope you've had a great weekend. This week we plan to continue our discussion of linear regression, with an emphasis on its derivation and links to mathematical interpretations. We also plan to start our discussion of Ridge and Lasso regression. The plans are:

  1. Brief repetition from last week

  2. Discussion of the equations for ordinary least squares (OLS)

  3. Discussion of how to prepare data, with examples of applications of linear regression (see the sketch after this list)

  4. Mathematical interpretations of OLS

  5. Introduction of Ridge and Lasso regression
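
Concerning the data-preparation point, here is a minimal sketch of the kind of workflow we have in mind (scikit-learn and toy data, our illustrative choices): fit the scaler on the training data only, then apply it to both splits.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Toy data: two features on very different scales
    X = np.column_stack([rng.normal(0, 1, 100), rng.normal(0, 1000, 100)])
    y = 2 * X[:, 0] + 0.003 * X[:, 1] + rng.normal(0, 0.1, 100)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    # Standardize using statistics from the training split only
    scaler = StandardScaler().fit(X_train)
    X_train_s = scaler.transform(X_train)
    X_test_s = scaler.transform(X_test)

    model = LinearRegression().fit(X_train_s, y_train)
    print("R^2 on test data:", model.score(X_test_s, y_test))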

 

Reading recommendations:

  1. The weekly lecture notes

  2. Goodfellow, Bengio and Courville, Deep Learning, chapter 2 on linear algebra

  3. Raschka et al on preprocessing of data, relevant...

Published Aug 18, 2025 12:51

If you are interested in finding teammates for projects and exercises, please fill in the form at https://docs.google.com/forms/d/1W57mA196ojFKxqW4-T6NsN0rx1BFUl-j8b4nQzrcG34/edit

Published Aug 4, 2025 13:56

First of all a warm welcome to you all.

Our first lecture is Monday August 18, 2:15pm-4pm.

The sessions on Tuesdays and Wednesdays last four hours for each group and will include lectures in a flipped mode (promoting active learning) together with work on exercises and projects. Each session will begin with lectures and questions and answers about the material to be covered that week. There are four groups: Tuesdays 8:15am-12pm and 12:15pm-4pm, and Wednesdays 8:15am-12pm and 12:15pm-4pm. Please sign up as soon as possible for one of the groups; the maximum capacity per group is 30-40 participants. Please select the group which fits you best.

The first week we start with simple linear regression, a repetition of linear algebra and elements of statistics needed for the course.

Note that attendance is not mandatory. All lectures can be attended either in person or live via Zoom at https://uio.zoom.us/my/mortenhj

...