Plans for week 44, October 31-November 4
Dear All, welcome to a new week and a new topic (our third-to-last).
Last week we concluded our discussion of deep learning methods with convolutional and recurrent neural networks. This week we begin another set of very popular methods for both classification and regression. We start with decision trees, then move on to ensembles of decision trees (random forests, bagging and other methods), and end with boosting methods and gradient boosting.
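As a small teaser for this week's topics, here is a minimal pure-Python sketch (my own illustration, not taken from the lecture notes) of the two ideas above: a depth-one decision tree (a "stump") fitted by exhaustively searching for the best split threshold, and a bagged ensemble that trains many stumps on bootstrap samples and predicts by majority vote.

```python
import random

# Toy one-dimensional data set: points below 5 belong to class 0,
# points above 5 to class 1.
X = [1.0, 2.0, 3.0, 4.0, 6.0, 7.0, 8.0, 9.0]
y = [0, 0, 0, 0, 1, 1, 1, 1]

def fit_stump(X, y):
    """Fit a depth-one decision tree (a stump): choose the threshold and
    label orientation that minimize the number of misclassified points."""
    best = None  # (errors, threshold, flip)
    for t in sorted(set(X)):
        pred = [0 if x < t else 1 for x in X]
        err = sum(p != yi for p, yi in zip(pred, y))
        # Also consider the flipped labelling (class 1 below the threshold).
        for flip, e in ((False, err), (True, len(y) - err)):
            if best is None or e < best[0]:
                best = (e, t, flip)
    return best[1], best[2]

def predict_stump(threshold, flip, x):
    p = 0 if x < threshold else 1
    return 1 - p if flip else p

def bagged_predict(stumps, x):
    """Bagging: majority vote over stumps trained on bootstrap samples."""
    votes = sum(predict_stump(t, f, x) for t, f in stumps)
    return 1 if 2 * votes >= len(stumps) else 0

random.seed(0)
stumps = []
for _ in range(25):
    # Bootstrap sample: draw n training points with replacement.
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    stumps.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))

print(bagged_predict(stumps, 0.5))  # class 0
print(bagged_predict(stumps, 9.5))  # class 1
```

In practice one would of course use library implementations such as scikit-learn's DecisionTreeClassifier and RandomForestClassifier, which is also the route Géron's chapters 6 and 7 take; the sketch above only exposes the core mechanics we will cover in the lectures.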
The plans for this week are:
Lab Wednesday and Thursday: work on project 2
Lecture Thursday: Basics of decision trees, classification and regression algorithms
Lecture Friday: Decision trees and ensemble models (bagging and random forests)
Teaching material: Lecture notes week 44 at https://compphysics.github.io/MachineLearning/doc/pub/week44/html/._week44-bs001.html
Video: Decision trees, https://www.youtube.com/watch?v=RmajweUFKvM&ab_channel=Simplilearn
Other reading: Géron's chapter 6 covers decision trees, while ensemble models, voting and bagging are discussed in chapter 7. See also lecture 7 from STK-IN4300, /studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf. Chapter 9.2 of Hastie et al. also contains a good discussion.
Concerning project 1, we aim to have our first round of feedback ready by the end of this week.
Finally, a small addendum for those interested in deep learning and feature visualization, that is, how neural networks build up their understanding of images: see https://distill.pub/2017/feature-visualization/