Messages
The deadline for the project is now December 18 at midnight.
Best wishes to you all.
We will try to organize a last lab session for those interested this coming Thursday (December 15) from 10am to 12pm, both online (same zoom link as before) and in person at FØ434.
Best wishes to you all and don't hesitate to swing by if you have questions.
Morten et al
Dear all, we hope you are all doing well during these hectic exam times.
Since our last regular lab sessions and lectures ended last week (Friday), we thought we would organize some additional question-and-answer sessions this week. Since many of you are probably busy working on exams, we thought of offering these sessions primarily via zoom, but also in person for those of you who prefer that or are able to attend.
This week we are planning zoom and in-person Q&A and help sessions on Thursday 2pm-4pm and Friday 2pm-4pm. We will use the same zoom link as we used for the lectures. If you wish to be there in person, we will be at our lab FØ434 at the same time.
Don't hesitate to come by with questions, or just swing by our offices.
Best wishes to you all,
Morten et al
p.s. we are hoping to be able to send feedback on project 2 to you all by the middle of next week
Hi all, this is sadly our last week, and after our discussions on support vector machines last week (see lecture notes and videos) we are now ready to wrap up the semester by scratching the surface of unsupervised learning methods. We will focus on the standard principal component analysis (PCA) method (which allows us to revisit the correlation and covariance matrices and the SVD) and one of the simplest (and very intuitive) clustering methods, namely what is called k-means clustering. You will find all this wonderful material, plus a summary and more by jumping into the lecture slides for week 47, see for example https://compphysics.github.io/MachineLearning/doc/pub/week47/html/week47-reveal.html.
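If you want a quick hands-on starting point, here is a minimal sketch (not from the course material; the data and parameter choices are just placeholders we picked for illustration) of PCA done via the SVD of the centered data matrix, followed by k-means clustering with scikit-learn:

import numpy as np
from sklearn.cluster import KMeans

# Toy data: 200 samples, 5 features (placeholder for your own dataset)
rng = np.random.default_rng(2022)
X = rng.normal(size=(200, 5))

# PCA "by hand" via the SVD of the centered data matrix
Xc = X - X.mean(axis=0)                          # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_variance = S**2 / (X.shape[0] - 1)     # eigenvalues of the covariance matrix
Z = Xc @ Vt.T[:, :2]                             # project onto the first two principal components

# k-means clustering on the projected data
kmeans = KMeans(n_clusters=3, n_init=10, random_state=2022).fit(Z)
print(explained_variance)
print(kmeans.labels_[:10])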
Else, see also
- Geron's chapter 9 on PCA
- Hastie et al Chapter 13 (sections 13.1-13.2 are the most relevant ones)
- and excellent videos at:
- We recommend hi...
Hi all,
this is just a quick reminder that the final deadline for project 2 is now set to Friday the 18th (midnight). We pushed it from last Friday to Wednesday and then finally to Friday this week.
Also, feel free to come with suggestions for topics for project 3, to be presented on Friday this week.
We will discuss this also during the lecture on Thursday.
Finally, here are some general observations from us about project 1. Hopefully these remarks can be of use when you wrap up the report for project 2 and work on project 3 as well.
Best wishes to you all,
Morten et al.
Comments about project 1
Summary after corrections:
* Many of you have written very nice codes! Thanks, this part is very good, and there were many excellent results.
* However, many of you are not used to writing scientific reports. Here are so...
Dear all, welcome to a new week with FYS-STK.
Last week we wrapped up our discussions on decision trees and ensemble methods based on decision trees (bagging, random forests, boosting and gradient boosting). These are all very popular methods, in particular for classification problems, and often produce excellent results on training and predictions. They are all simple to implement and have a low level of mathematical complexity. Last week we also started with our last supervised learning method, Support Vector Machines. This topic will also keep us busy this coming week. We are also planning to run a mini-workshop on possible topics for project 3. Here you'll find the topics presented by different groups in 2020 and 2021.
In 2020 the contributions were (and some of these ended up in thesis work and/or publications, online only due to Covid-19!)
- Maria Emine Nylund: Lego Bricks Classifier...
Dear all, welcome to a new week and FYS-STK.
Last week we went through the basic algorithms of decision trees for classification and regression, with an emphasis on the so-called CART algorithm. We also discussed so-called ensemble methods like bagging, voting and random forests before starting with boosting methods. The latter also use decision trees as weak learners. We will go through the details of these methods this week and discuss AdaBoost and gradient boosting. If we get time, we may start with support vector machines, our second-to-last topic this semester. Else, the deadline for project 2 is now set to Wednesday November 16, since we also postponed the deadline for project 1 by five days.
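As a small teaser for this week's lectures, here is a minimal sketch of AdaBoost and gradient boosting with scikit-learn, both using decision trees as weak learners. The dataset and hyperparameters below are just placeholders, not the examples from the course material:

from sklearn.datasets import make_moons
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy two-class dataset (placeholder)
X, y = make_moons(n_samples=500, noise=0.3, random_state=2022)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2022)

# AdaBoost with shallow decision trees (stumps) as weak learners
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=2022)
ada.fit(X_train, y_train)

# Gradient boosting: each new tree is fitted to the residual errors of the current ensemble
gb = GradientBoostingClassifier(n_estimators=100, max_depth=3, learning_rate=0.1, random_state=2022)
gb.fit(X_train, y_train)

print("AdaBoost test accuracy:", ada.score(X_test, y_test))
print("Gradient boosting test accuracy:", gb.score(X_test, y_test))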
Lab Wednesday and Thursday: work on project 2, new deadline November 16 at midnight.
Lecture Thursday: Boosting m...
Important note: Due to the "High-School teachers' week" (Faglig pedagogisk dag in Norwegian) at the University of Oslo, our lecture hall is occupied Thursday and our Thursday lecture has to be on zoom only.
It will be recorded as usual. On Friday we are back to our regular auditorium. We apologize for this inconvenience.
Dear All, welcome to a new week and a new topic (our third last).
Last week we ended our discussions of deep learning methods with a discussion of convolutional neural networks and recurrent neural networks. This week we start with another set of very popular methods for both classification and regression. We will start with decision trees, then move over to ensembles of decision trees (random forests, bagging and other methods), and then end with boosting methods and gradient boosting.
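For those who want to play around already now, here is a minimal scikit-learn sketch of a single decision tree, bagging and a random forest. The dataset and parameters are just placeholders chosen for illustration, not the course examples:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2022)

# Single decision tree (CART) for classification
tree = DecisionTreeClassifier(max_depth=4, random_state=2022).fit(X_train, y_train)

# Bagging: many trees trained on bootstrap samples of the training data
bag = BaggingClassifier(n_estimators=100, random_state=2022).fit(X_train, y_train)

# Random forest: bagging plus a random subset of features considered at each split
forest = RandomForestClassifier(n_estimators=100, random_state=2022).fit(X_train, y_train)

for name, model in [("decision tree", tree), ("bagging", bag), ("random forest", forest)]:
    print(name, model.score(X_test, y_test))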
The plans for this week are
Lab Wednesday and Thursday: work on project 2
Lecture Thursday: Basics of decision trees, classification and regression algorithms
Lecture Friday: Decision trees and ensemble models (bagging and random forests)
Teaching material: Lecture notes week 44 at https://compphysics.gi...
Dear all,
last week we applied neural networks to the solution of differential equations and started our discussion of convolutional neural networks. The plan this week is as follows
* Lab: Wednesday and Thursday work on project 2. The lab on Wednesdays is only at FØ434.
* Lecture Thursday: Convolutional Neural Networks (CNN)
* Lecture Friday: Recurrent Neural Networks (RNN)
=== Videos and reading recommendation
Video on Convolutional Neural Networks from MIT at https://www.youtube.com/watch?v=iaSUYvmCekI&ab_channel=AlexanderAmini
Video on Recurrent Neural Networks from MIT at https://www.youtube.com/watch?v=SEnXr6v2ifU&ab_channel=AlexanderAmini
=== Reading Recommendations
CNN readings
Goodfe...
The center on computational science and data science at UiO, dScience, would like to welcome students and people in the data science community to the 5th annual Data Science Day! This is an evening of socialization, learning and entertainment.
Dear all, last week we discussed how to build a neural network. The topics covered last week, with pertinent videos (videos should be available now, I was not too happy with the first iterations) were
- Lecture Thursday: Deep learning and Neural Networks, developing a code for Neural Networks, discussion of the back propagation algorithm
- Video of Lecture at https://youtu.be/yzbxJI6LgL0
- Lecture Friday: Building a neural network
- Video of Lecture at https://youtu.be/CPj4mh7M9no
This week we will focus on further discussions of neural networks, their pros and cons, how to use TensorFlow and Keras, more examples and applications to the solution of differential equations and if we get time, we will start discussing convolutional neural networks.
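As a small appetizer for the TensorFlow/Keras part, here is a minimal sketch of a feed-forward network for regression with Keras. The toy data, layer sizes and training settings below are arbitrary placeholders, not the examples we will use in the lectures:

import numpy as np
import tensorflow as tf

# Toy regression data (placeholder for your own design matrix and targets)
rng = np.random.default_rng(2022)
X = rng.uniform(0, 1, size=(1000, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2

# A small feed-forward network with two hidden layers
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=100, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))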
This week thus looks like this:
- Wednesday and Thursday Lab...
Dear all,
first of all, congratulations to you all for your heroic and great efforts with project 1. We are very proud of what you all have done. Congrats.
This week, due to a workshop Morten is attending, the lectures on Thursday and Friday will be recorded only. The video for Thursday will be uploaded by Thursday and a mail will be sent to you all with the link to the video. Thus, there is no live zoom lecture nor an in-person lecture on Thursday. The video will also be posted under the schedule link.
This applies to Friday as well. The two videos will replace our regular lectures and the topics to be covered are
* Thursday: Building our own Feed-forward Neural Network and discussion of project 2.
* Friday: Playing around with our own Feed-forward Neural Network and introduction to TensorFlow. Solving differential equations with neural networks.
Reading suggestions for both days: The lecture notes for week 41,...
Dear all, the deadline is changed to Tuesday October 11 at midnight.
Hi all. We hope you had a great weekend. Here's a short update with a brief summary from last week.
Last week, unfortunately, the wireless was down (or very flaky) at the auditorium we use. This means that the zoom session aborted several times and we could not produce a video of the lecture. We will make a separate and new recording today and post it tomorrow. We apologize for these technical problems.
Last week we covered optimization methods and these will be part of project 2. There the aim is to write your own gradient descent method with adaptive learning rate schedulers, as discussed during the lectures last week.
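To give you an idea of what is meant, here is a minimal sketch of gradient descent for ordinary least squares with an adaptive learning rate. The toy data and the simple time-decay schedule (the function we call learning_rate below) are placeholders chosen only for illustration; in the project you should implement your own schedulers:

import numpy as np

# Toy data for a linear model y = X @ beta + noise (placeholder)
rng = np.random.default_rng(2022)
n = 100
x = rng.uniform(0, 1, n)
X = np.c_[np.ones(n), x]                 # design matrix with intercept
y = 2.0 + 3.0 * x + 0.1 * rng.normal(size=n)

def gradient(beta):
    # Gradient of the OLS cost (1/n)*||y - X beta||^2
    return (2.0 / n) * X.T @ (X @ beta - y)

def learning_rate(t, eta0=0.1, decay=0.001):
    # Simple time-decay schedule; replace with your own scheduler
    return eta0 / (1.0 + decay * t)

beta = rng.normal(size=2)
for t in range(5000):
    beta -= learning_rate(t) * gradient(beta)

print(beta)   # should approach [2, 3]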
This week thus starts with a recap from last week on optimization schemes and where to find code examples relevant for project 2. Project 2 is available from Friday this week.
This week we also start with neural networks, and the week looks more or less like this
Wednesday: Lab at FØ434, and Thursday at FØ398
T...
NOTE: Due to UngForsk we don't have access to our regular auditorium on Thursday. Thursday's lecture will thus be digital only via zoom. We are sorry for this.
Lab on Wednesday is at FØ397 from 8:15am-12pm. Next week and till the end of the semester, the lab on Wednesdays is always at FØ434.
Dear all, welcome back to FYS-STK3155/4155.
Here's the weekly update with some messages below which could be of interest to some of you.
Last week we discussed logistic regression and started with optimization methods. We will devote this week to several of these optimization methods, and in particular to ways of estimating the gradients. This will lead us from simple gradient descent to various variants of stochastic gradient descent. We will also discuss how to make life less painful with algorithms like automatic differentiation.
The material this week is covered by the slides for week 39 at https://compphysics.github.io/MachineLearning/doc/web/course.html. In addition, for a good discussion of gradient methods, we recommend Goodfellow et al, sections 4.3-4.5 and chapter 8. We will come back to this in our discussion of neural networks as well. See https://www.deeplearningbook.org/
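As a small illustration of both points, here is a sketch of stochastic gradient descent where the gradients are obtained by automatic differentiation with the autograd package. The toy data, minibatch size and learning rate are placeholders chosen only for the example:

import numpy
import autograd.numpy as np     # thinly wrapped numpy that autograd can differentiate through
from autograd import grad

# Toy OLS problem (placeholder data)
rng = numpy.random.default_rng(2022)
n = 100
x = rng.uniform(0, 1, n)
X = numpy.c_[numpy.ones(n), x]
y = 2.0 + 3.0 * x + 0.1 * rng.normal(size=n)

def cost(beta, Xb, yb):
    return np.mean((yb - np.dot(Xb, beta)) ** 2)

grad_cost = grad(cost)   # gradient with respect to the first argument (beta)

# Plain stochastic gradient descent over minibatches
beta = rng.normal(size=2)
eta, batch_size, n_epochs = 0.1, 10, 200
for epoch in range(n_epochs):
    perm = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        beta = beta - eta * grad_cost(beta, X[idx], y[idx])

print(beta)   # should end up close to [2, 3]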
For Stochastic Gradient Descent an...
Dear all, sorry for spamming you with messages this week.
It seems that room FØ434 is available from 8am-12pm every Wednesday till the end of the semester, except for next week (Wednesday September 28, 8-12).
It means that lab group 1 (Wednesdays 8:15-10am) and lab group 3 (Wednesdays 10:15am-12pm) move from FØ397 to FØ434 (one floor up). FØ434 is a better room and has space for more people.
Thus, except for next week, all labs on Wednesdays are in FØ434.
Best wishes to you all,
Morten et al.
p.s. These changes will appear on the official calendar page asap
Hi all, feel free to contact Domantas Sakalys and Synne Sandnes at domantas.sakalys@fys.uio.no and s.m.sandnes@fys.uio.no.
They have kindly volunteered to act as representatives for all participants following the course. Feel free to reach out to them in case there are topics you wish to bring up; they will convey your input to the teaching team, hopefully improving the quality of the course.
Dear all, welcome back to a new week and FYS-STK3155/4155. We hope you've had a relaxing weekend.
Last week we discussed resampling methods like the bootstrap and cross-validation, as well as other statistical properties such as the central limit theorem and expectation values. Data sampling refers to statistical methods for selecting observations from the domain with the objective of estimating a population parameter, whereas data resampling refers to methods for economically using a collected dataset to improve the estimate of the population parameter and to help quantify the uncertainty of the estimate.
Both data sampling and data resampling are methods that are required in a predictive modeling problem.
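As a minimal illustration of resampling (with toy data chosen only for the example), here is a bootstrap estimate of the uncertainty of a sample mean:

import numpy as np

# Toy data (placeholder for your own sample)
rng = np.random.default_rng(2022)
data = rng.exponential(scale=2.0, size=500)

n_bootstraps = 1000
means = np.empty(n_bootstraps)
for b in range(n_bootstraps):
    sample = rng.choice(data, size=data.size, replace=True)   # resample with replacement
    means[b] = sample.mean()

print("sample mean:", data.mean())
print("bootstrap standard error of the mean:", means.std())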
This week we will use the first lecture on Thursday to wrap up our discussion on resampling methods, with a focus on cross-validation. Thereafter, we move over to classification problems. Think of these as data sets with discrete outcomes (yes o...
Here's a quick overview of the plans for this week, with a short review of last week.
Last week we started with project 1, which focuses on linear regression (ordinary least squares, Ridge and Lasso regression). The project aims first at fitting a two-dimensional function for which we can generate the data. This serves as a stepping stone towards the analysis of real data (maps in three dimensions). The reason why we focus on linear regression is that several of these methods have analytical solutions for the optimal parameters. Furthermore, they allow us to make links with a statistical interpretation and to discuss central issues in machine learning like resampling methods, overfitting/underfitting (via the bias-variance tradeoff) and more.
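As a minimal illustration of what we mean by analytical solutions (using a simple one-dimensional toy model rather than the project's two-dimensional function), here are the closed-form OLS and Ridge parameters from the normal equations:

import numpy as np

# Toy data: a simple polynomial fit (placeholder for the project data)
rng = np.random.default_rng(2022)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x - 3.0 * x**2 + 0.1 * rng.normal(size=n)
X = np.c_[np.ones(n), x, x**2]          # design matrix

# Ordinary least squares: beta = (X^T X)^{-1} X^T y (pseudoinverse used for numerical safety)
beta_ols = np.linalg.pinv(X.T @ X) @ X.T @ y

# Ridge regression: beta = (X^T X + lambda I)^{-1} X^T y
lmbda = 1e-3
p = X.shape[1]
beta_ridge = np.linalg.inv(X.T @ X + lmbda * np.eye(p)) @ X.T @ y

print(beta_ols, beta_ridge)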
Last week during the lectures we discussed how to interpret ordinary least squares and ridge regression from a linear algebra point of view (via the singular value decomposition of a matrix). We ended Friday's lecture with...
Dear all, welcome to a new week. Project 1 is now available and we will discuss this both during the lab sessions and during the various lectures. You can find the project by scrolling down to project 1 at https://compphysics.github.io/MachineLearning/doc/web/course.html (you will find the project as a latex file, a pdf file, a jupyter-notebook and various html styles). Let us know if you spot typos and/or inconsistencies. You can also obtain all new files by a git pull or by going to the folder https://github.com/CompPhysics/MachineLearning/tree/master/doc/Projects/2022/Project1.
This week at the lab you can opt to start with the project or do the exercises from last week (which prepare you for project 1). Feel free to opt for whichever alternative fits you best.
Else, this week we continue our discussion of the singular value decomposition in connection with inverses of matrices, Ridge and Lasso regression and interp...
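As a small illustration of why the SVD just mentioned is useful when X^T X is singular or nearly so, here is OLS via the SVD-based pseudoinverse. The design matrix below is a made-up example with an exactly collinear column, not data from the project:

import numpy as np

# A design matrix whose third column is an exact linear combination of the second,
# so X^T X is singular and cannot be inverted directly
rng = np.random.default_rng(2022)
n = 100
x = rng.uniform(0, 1, n)
X = np.c_[np.ones(n), x, 2.0 * x]
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=n)

# OLS via the SVD-based pseudoinverse of X itself (no explicit inversion of X^T X)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
S_inv = np.where(S > 1e-12 * S.max(), 1.0 / S, 0.0)   # drop (near-)zero singular values
beta_svd = Vt.T @ (S_inv * (U.T @ y))

# Same thing using numpy's built-in pseudoinverse with the same cutoff
beta_pinv = np.linalg.pinv(X, rcond=1e-12) @ y

print(beta_svd, beta_pinv)   # both give the same minimum-norm solution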
For those of you who are new to UiO or just wish to find collaborators for the various projects and exercises during this fall semester, feel free to fill in the questionnaire at
https://docs.google.com/forms/d/12VNXJOqMfLGism580eBps_M7zk-gzXe7Qd-B2Ll_s8o/edit
Based upon your responses we will try to suggest possible teams (2-3 persons per group).
Welcome to a new week to you all.
This week the plans are as follows
o Lab Wednesday: Work on exercises 1-5 for week 35, see the end of the weekly slides for the exercises, https://compphysics.github.io/MachineLearning/doc/pub/week35/html/week35.html. These exercises are not mandatory, so don't stress if you don't finish them all. We can easily continue with these exercises next week as well. They serve as a basis for the first project.
o Thursday: Review of Ordinary Least Squares with applications, a reminder on statistics, and the start of our discussion of Ridge Regression and the Singular Value Decomposition
o Friday: Discussion of Ridge and Lasso Regression and links with Singular Value Decomposition
Reading recommendations:
o See lecture notes for week 35 at https://compphysics.github.io/MachineLearning/doc/web/cou...
Dear all, the video for the lecture is at
The handwritten notes are at
https://github.com/CompPhysics/MachineLearning/tree/master/doc/HandWrittenNotes/2022