Plans for week 45, November 7-11
Dear all, welcome to a new week of FYS-STK.
Last week we went through the basic algorithms for decision trees for classification and regression, with an emphasis on the so-called CART algorithm. We also discussed ensemble methods such as bagging, voting, and random forests before starting on boosting methods, which likewise use decision trees as weak learners. This week we will go through the details of these methods and discuss AdaBoost and gradient boosting. If we have time, we may start with support vector machines, our second-to-last topic this semester. Finally, note that the deadline for project 2 is now set to Wednesday, November 16, since we also postponed the deadline for project 1 by five days.
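If you want to experiment with last week's ensemble methods on your own, here is a minimal sketch using scikit-learn's BaggingClassifier, RandomForestClassifier and VotingClassifier. The toy dataset (make_moons) and all hyperparameter values are illustrative assumptions on my part, not something taken from the lectures or the project.

# Minimal sketch of bagging, random forests and voting with scikit-learn.
# Dataset and hyperparameters are illustrative choices only.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier, VotingClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: many trees, each trained on a bootstrap sample of the data
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=42)

# Random forest: bagging plus random feature selection at each split
forest = RandomForestClassifier(n_estimators=100, random_state=42)

# Voting: combine different base learners by majority vote
vote = VotingClassifier([("tree", DecisionTreeClassifier()),
                         ("logreg", LogisticRegression())], voting="hard")

for name, clf in [("bagging", bag), ("random forest", forest), ("voting", vote)]:
    clf.fit(X_train, y_train)
    print(name, "test accuracy:", clf.score(X_test, y_test))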
Lab Wednesday and Thursday: work on project 2, new deadline November 16 at midnight.
Lecture Thursday: Boosting methods, from AdaBoost to Gradient boosting
Lecture Friday: Gradient boosting and discussion of decision trees and ensemble methods; wrapping up trees and starting on support vector machines. A short code sketch of this week's boosting topics follows below.
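As a warm-up for the lectures, here is a minimal sketch of the two boosting methods using scikit-learn's AdaBoostClassifier and GradientBoostingClassifier with shallow trees as weak learners. Again, the dataset and parameter values are illustrative assumptions only, not the examples we will use in class.

# Minimal sketch of AdaBoost and gradient boosting with scikit-learn.
# Dataset and hyperparameters are illustrative choices only.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# AdaBoost: reweight misclassified samples and combine weighted decision stumps
ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=200, learning_rate=0.5, random_state=42)

# Gradient boosting: each new tree is fitted to the pseudo-residuals
# (negative gradients of the loss) of the current ensemble
gb = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                max_depth=3, random_state=42)

for name, clf in [("AdaBoost", ada), ("gradient boosting", gb)]:
    clf.fit(X_train, y_train)
    print(name, "test accuracy:", clf.score(X_test, y_test))

Shallow trees (stumps in the AdaBoost case) are the standard choice of weak learner for boosting, which is why they appear in the sketch.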
Interesting Videos
Video on Decision trees https://www.youtube.com/watch?v=RmajweUFKvM&ab_channel=Simplilearn
Video on boosting methods by Hastie https://www.youtube.com/watch?v=wPqtzj5VZus&ab_channel=H2O.ai
Video on AdaBoost https://www.youtube.com/watch?v=LsK-xG1cLYA
Video on gradient boosting, part 1 (parts 2-4 follow) https://www.youtube.com/watch?v=3CC4N4z3GJc
Reading
Hastie et al., sections 10.1-10.10 https://github.com/CompPhysics/MachineLearning/blob/master/doc/Textbooks/elementsstat.pdf. Géron's chapters 6 and 7 are also useful.
Best wishes to you all