Messages
You will find these under the "Exam and past papers" menu.
The syllabus for the exam is as follows:
All material in the lecture notes (including all exercises), except Section 4.2 and the statement of Theorem 7.1.
For the exam, you will be allowed to bring the following:
- A calculator
- A single sheet of A4 paper with your own handwritten notes. You may write on both sides.
I see that many of the comments I made on your oblig submissions are missing from Canvas, for some unknown reason. Please approach me after today's lecture if you would like some quick feedback.
Next week we will revise for the exam. The plan is as follows:
- Tuesday 21 November: the 2017 paper
- Thursday 23 November: the 2021 paper
You can find both papers here on the course webpage. I have also included last year's exam, along with sample solutions, if you want more revision.
Please do exercises 41, 42, 46, 47 and 49 for Tuesday. You may also find it useful to look at exercises 59 and 63 (on Gibbs sampling) along with 64-66 (on convergence diagnostics), which we will cover on Thursday.
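If a refresher on Gibbs sampling helps before Thursday, here is a minimal sketch for a standard bivariate normal with correlation ρ. This is a toy example of my own, not one of the exercises, and the numbers are made up:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal:
      x | y ~ N(rho * y, 1 - rho**2), and symmetrically for y | x.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))  # draw x | y
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # draw y | x
        samples[i] = x, y
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
print(samples.mean(axis=0))           # both should be near 0
print(np.corrcoef(samples.T)[0, 1])   # should be near 0.8
```

Plotting the trace of each coordinate is a good warm-up for the convergence diagnostics in exercises 64-66.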
Please do exercises 33, 36, 37 and 38 for next Tuesday. Also, you can prepare for the lecture on November 9 by looking at exercise 57.
We will cover the oblig, along with exercise 33 and 38.
Since a few of you have been asking: yes, you are asked to include your code (with a trial run so that I can see the results) in your oblig submission. If you are using Jupyter Notebook, you can easily export your files as a PDF.
Tomorrow (Thursday), we will cover the Bernstein-von Mises theorem (chapter 7) and start on the general state space Markov chain theory (chapter 8).
You can download the latest version of the lecture notes here on the course webpage.
There is a mistake in problem 2h in the oblig. It should say
Sₙ⁻¹ = αI + βΦᵀΦ, like equation (5.15) in the lecture notes.
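In case it helps to see the corrected formula in code, here is a small numerical illustration. The precisions α and β, the design matrix, and the targets below are all made up; only the formula itself is the point:

```python
import numpy as np

# Illustration of the corrected formula S_n^{-1} = alpha*I + beta*Phi^T Phi,
# as in equation (5.15). All numbers here are hypothetical.
rng = np.random.default_rng(1)
alpha, beta = 2.0, 25.0          # prior precision and noise precision
Phi = rng.normal(size=(50, 3))   # hypothetical 50x3 design matrix
y = rng.normal(size=50)          # hypothetical targets

S_n_inv = alpha * np.eye(3) + beta * Phi.T @ Phi  # posterior precision
S_n = np.linalg.inv(S_n_inv)                      # posterior covariance
m_n = beta * S_n @ (Phi.T @ y)                    # posterior mean of weights
```

Note that the posterior precision is symmetric and positive definite, so the inverse always exists.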
Based on some questions I have received, I want to clarify two things in the oblig:
Problem 1b: If we want to pay full attention to what is conditioned on at each step, the setup is as follows:
- θ ~ π(·) (prior)
- y | θ ~ π(· | θ) (forward/observation model)
- θ' | y ~ π(· | y) (posterior)
The question then asks for the marginal distribution of θ'.
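Without giving anything away, here is how one might simulate the three steps and look at the empirical marginal of θ'. I use a conjugate normal-normal toy model of my own choosing here, not the model in the oblig:

```python
import numpy as np

# A conjugate normal-normal toy model (my choice, not the oblig's model):
#   theta      ~ N(0, tau^2)          (prior)
#   y | theta  ~ N(theta, sigma^2)    (observation model)
#   theta' | y ~ N(v*y/sigma^2, v)    (posterior, conjugate update)
rng = np.random.default_rng(2)
tau2, sigma2 = 1.0, 0.5
v = 1.0 / (1.0 / tau2 + 1.0 / sigma2)  # posterior variance
n = 100_000

theta = rng.normal(0.0, np.sqrt(tau2), size=n)       # step 1: prior draw
y = rng.normal(theta, np.sqrt(sigma2))               # step 2: data draw
theta_post = rng.normal(v * y / sigma2, np.sqrt(v))  # step 3: posterior draw

# Inspect the empirical marginal of theta' (compare with the prior yourself):
print(theta_post.mean(), theta_post.var())
```

A histogram of theta_post against the prior density makes the answer easy to guess, and to prove.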
Problem 3
I write that Xn+...
In problem 1b in the oblig, I use the term "forward model", which is not entirely standard terminology; "observation model" would be more appropriate.
In Thursday's lecture, we will finish Section 5 (including exercise 43), and cover all of Section 6 (part I of the interlude), including exercise 45.
Since you are all busy with the oblig, there will be no homework for the class on Tuesday. Instead, I plan to cover three exercises:
- Exercise 21, where I made a silly but serious mistake both in the class and the code,
- Exercise 28, where we will look at why so many of you got different results,
- Exercise 43, which is not intended for homework, but part of the lectures on Regression and Classification.
Today, Per August will be your substitute. He will cover the rest of Section 5.1.4, and most of Section 5.2.
The oblig is now available, as of 17:00 October 10. The dataset bjornholt.csv can be found here on the course webpage.
Best of luck to everyone!
Please do exercises 27 - 31 for next Tuesday.
Also, the mandatory exercises (oblig) will be made available to you on Tuesday, and you will have 16 days to complete them, until October 26. Please be aware that you need to pass this assignment to be able to take the exam!
In the next lecture we will cover everything up to and including Section 5.1.2, which ends on page 35. I will also cover exercise 32 and, if time, 34. The dataset is up on the webpage, so feel free to give these exercises a go before the lecture.
A small correction to today's class: in exercise 25, we end up with I(θ) = 1/σ², the key point being that the resulting Jeffreys prior is improper.
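To spell out the key point (assuming, as in the exercise, that θ is the location parameter of a normal model with known σ, which is consistent with I(θ) = 1/σ²):

```latex
\pi(\theta) \;\propto\; \sqrt{I(\theta)} \;=\; \frac{1}{\sigma}
\quad\text{(constant in } \theta\text{)},
\qquad
\int_{-\infty}^{\infty} \pi(\theta)\, d\theta \;=\; \infty,
```

so the Jeffreys prior is flat on the whole real line, has infinite total mass, and is therefore improper.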
Please do exercises 23 - 26 for Tuesday next week.
We now have two student representatives (tillitsvalgte) for the course! Many thanks to Kirsten and Markus. Their contact information will be posted on the course webpage.
Today we covered the broad theme of interpretability versus predictability. The discussion is largely based on Leo Breiman's foundational paper "Statistical Modeling: The Two Cultures" (Statistical Science 16:199-231, 2001). It is certainly not on the syllabus, but I encourage everyone interested to read it. If you really want to dig deep into this theme, you can also read C. P. Snow's original book "The Two Cultures" from 1959.
Also, in today's lecture, I should have been clearer about the regression model being linear in w. We see this directly in equation (5.5), where the mean is Φw, a linear function applied to w. Th...
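As a quick numerical sanity check, linearity in w (superposition) can be verified directly. The design matrix and weights below are made up; note that even when the columns of Φ are nonlinear basis functions of the inputs, the mean is still linear in w:

```python
import numpy as np

# Linearity of the regression mean in w:
#   Phi (a*w1 + b*w2) = a*(Phi w1) + b*(Phi w2)
rng = np.random.default_rng(3)
Phi = rng.normal(size=(10, 4))   # hypothetical design matrix
w1, w2 = rng.normal(size=4), rng.normal(size=4)
a, b = 2.0, -0.5

lhs = Phi @ (a * w1 + b * w2)
rhs = a * (Phi @ w1) + b * (Phi @ w2)
assert np.allclose(lhs, rhs)     # superposition holds exactly
```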
We will finish Section 4 on model selection and model averaging. We will also start on Section 5, and cover everything up to and including approximately page 30.
Please do exercises 19 - 22. We will also cover exercises 17 and 18 (which we didn't have time for last time) in class.
We will finish Sections 2 and 3, which end on page 25.
There was a small typo in the solutions to exercises 11 and 12 (it now says \Gamma(a + b) rather than \Gamma(a + b - 1) under the integral on page 3).
The small typo in exercise_13.ipynb has also been fixed.
We will cover the rest of Section 2.1, which ends on the middle of page 18.