Week 45 information

Dear All,

Last week we wrapped up the discussion of the unsupervised learning methods covered in this course, namely principal component analysis and clustering (k-means), and we started discussing decision trees. This week we will discuss in detail the decision-tree algorithms for regression and classification, and then move on to ensembles of trees, either via bagging (bootstrap aggregation), random forests, voting, or boosting methods.
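As a small warm-up for this week's topics, here is a minimal sketch (assuming scikit-learn is available, as in the course labs; the dataset and hyperparameters are made up for illustration) contrasting a single decision tree with a bagged ensemble of trees:

```python
# Sketch: a single decision tree vs. bagging (bootstrap aggregation).
# Dataset and parameters are illustrative, not from the lectures.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One deep tree: low bias, but high variance.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Bagging: each tree is fitted on a bootstrap sample of the training data,
# and the predictions are aggregated by majority vote.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                        random_state=0).fit(X_train, y_train)

print("single tree accuracy:", tree.score(X_test, y_test))
print("bagged trees accuracy:", bag.score(X_test, y_test))
```

Averaging over many bootstrapped trees typically reduces the variance of a single tree; we will go through the details in the lectures.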

The plans for this week are:

Wednesday: lab, in person and digital (8-10 and 14-16). Note that we will have pizza at FØ434 around 1pm tomorrow; feel free to get some extra energy while wrapping up project 2. At the lab we will also discuss formats for the report.

Thursday Lecture: Basics of Decision Trees, Bagging and Voting

Friday Lecture: More on Bagging, Voting, Random Forests and start Boosting
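For those who want a first look at Friday's topics, here is a short, hedged sketch (hypothetical data and hyperparameters, assuming scikit-learn) of the two remaining tree ensembles, random forests and boosting:

```python
# Sketch of Friday's ensemble methods; parameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier

X, y = make_classification(n_samples=300, random_state=0)

# Random forest: bagging plus a random subset of features at every split,
# which decorrelates the individual trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Boosting (here AdaBoost): trees are fitted sequentially, each one
# reweighting the samples the previous trees misclassified.
boost = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

print("forest training accuracy:", forest.score(X, y))
print("boosting training accuracy:", boost.score(X, y))
```

The key conceptual difference, which we will develop in the lectures, is that bagging and random forests train trees independently in parallel, while boosting trains them sequentially.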

Videos

Video on Decision trees

Video on boosting methods by Hastie.

Reading Suggestions:

Decision Trees: Géron's chapter 6 covers decision trees, while ensemble models, voting, and bagging are discussed in chapter 7. See also lecture 7 from STK-IN4300. Chapter 9.2 of Hastie et al. also contains a good discussion.

Best wishes to you all with wrapping up project 2.

Morten et al
 

Published Nov. 9, 2021 07:34 - Last modified Nov. 9, 2021 07:34