Linear models for classification and regression

This week we review linear models for classification and regression.

Lecture notes are here (late due to travel).

Reading material:
http://cs229.stanford.edu/notes/cs229-notes1.pdf
Only pages 1-7 and 16-19

Relevant video links: https://www.youtube.com/playlist?list=PL3FW7Lu3i5JvHM8ljYj-zLfQRF3EO8sYv (Lectures 2 and 3; note that they do not cover regression)

 

Exercises will be a mix of theory and programming.

The exercises are for the group sessions on 27.1 and 30.1.
 

Theory exercises are here

The programming exercises are here

Your task is to complete the notebooks knn.ipynb and softmax.ipynb. Open the notebooks using Anaconda Python 3, e.g. /opt/ifi/anaconda3/bin/jupyter-notebook xxx.ipynb, and fill them in as described in the notebooks.

UPDATE 31.01: Derivatives of the softmax loss are given in the slides for lecture 3, slide 62 and slides 73-74 (vectorised).
The exercises have been updated accordingly.
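
For orientation, below is a minimal, generic NumPy sketch of the standard vectorised softmax (cross-entropy) loss and its gradient. It assumes the common convention scores = XW with X of shape (N, D), W of shape (D, C) and integer labels y; these names, the L2 regularisation term and the layout are assumptions and may differ from the notation used in the slides and notebooks, and this is not the exercise solution.

```python
import numpy as np

def softmax_loss_vectorized(W, X, y, reg=0.0):
    """Standard cross-entropy (softmax) loss and gradient, fully vectorised.

    W: (D, C) weight matrix, X: (N, D) data, y: (N,) integer class labels.
    Returns the mean loss (plus L2 regularisation) and dL/dW of shape (D, C).
    """
    N = X.shape[0]
    scores = X @ W                                   # (N, C) class scores
    scores -= scores.max(axis=1, keepdims=True)      # shift for numerical stability
    exp_scores = np.exp(scores)
    probs = exp_scores / exp_scores.sum(axis=1, keepdims=True)  # (N, C) softmax

    loss = -np.log(probs[np.arange(N), y]).mean() + reg * np.sum(W * W)

    dscores = probs.copy()                           # dL/dscores_j = p_j - 1[j = y]
    dscores[np.arange(N), y] -= 1.0
    dW = X.T @ dscores / N + 2 * reg * W             # chain rule through scores = XW
    return loss, dW
```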

Solutions to the theory exercises are here: Theory solution (link updated May 2020)

Since the programming exercise is similar to a mandatory exercise at Stanford, no solution will be published on the web.

Published 16 Jan 2020 11:24 - Last modified 25 May 2020 20:39