Weekly plans for week 40
Dear all and welcome to a new week!
Here's a brief summary from last week with plans for this.
Lab on Wednesday: We still have the digital labs from 8-10 and 14-16. The Zoom link is the same:
https://msu.zoom.us/j/95317649875?pwd=aWM1akppam4yWVBIY29KaXE5cHpSZz09
Meeting ID: 953 1764 9875
Passcode: 536396
And the deadline for project 1 is Monday October 11. It suffices to upload the link to your GitHub or Gitlab repository by the deadline.
Project 2 will be available this coming Sunday and deals with neural networks.
This leads to this week's topics, with a brief mention of what we discussed last week. Last week we summarized our discussion of logistic regression for binary classification problems and started discussing optimization methods, in particular various gradient descent methods. We will finalize this discussion this Thursday by studying the various stochastic gradient descent approaches that exist; the first lecture on Thursday will be devoted to this. We will also discuss how to use automatic differentiation in the calculation of gradients.
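To give a flavor of what we will cover on Thursday, here is a minimal sketch of mini-batch stochastic gradient descent applied to a simple linear regression problem. This is only an illustration, not course code; the data, learning rate, and batch size are made up for the example, and the gradient is worked out by hand rather than with automatic differentiation.

```python
import numpy as np

# Illustrative mini-batch SGD for linear regression y = theta_0 + theta_1 * x.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, size=(n, 1))
X = np.hstack([np.ones((n, 1)), x])          # design matrix with intercept column
y = 2.0 + 3.0 * x[:, 0] + 0.1 * rng.standard_normal(n)

theta = np.zeros(2)                          # parameters to learn
eta = 0.1                                    # learning rate (assumed for this toy example)
batch_size = 20

for epoch in range(200):
    perm = rng.permutation(n)                # reshuffle the data each epoch
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # gradient of the mean squared error (1/2m) * ||Xb @ theta - yb||^2
        grad = Xb.T @ (Xb @ theta - yb) / len(idx)
        theta -= eta * grad

print(theta)   # should end up close to the true values [2, 3]
```

The key difference from plain gradient descent is that each update uses only a small random batch of the data, which makes the updates cheap and adds noise that can help escape shallow local minima.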
These techniques and methods play a crucial role in deep learning. We start on Thursday with neural networks and continue with this topic on Friday and the coming weeks as well. The slides for this week are at
https://compphysics.github.io/MachineLearning/doc/web/course.html; go to week 40. The reading suggestions are:
- For stochastic gradient descent: chapter 4 of Géron's text.
- For neural networks: Goodfellow et al., chapters 6 and 7, and Bishop, sections 5.1-5.4.
In the coming weeks we will follow closely Goodfellow et al. on deep learning, chapters 6-12.
Best wishes for the week,
Morten et al