Wednesday, January 17th: We shall begin by reviewing some of the material from the appendix at the end of the book. Most of this should be known from previous courses, but it may be a good idea to review it from our perspective. The lecture will be a little shorter than usual as I have an exam going on from 3 PM.
Friday, January 19th: The plan is to cover section 1.1 with an emphasis on the examples (but we probably won't do quite as many as the book does). Hopefully we shall also get started on section 1.2. It's a good idea to look at the material beforehand!
Wednesday, January 24th: The physical lecture is cancelled as I'm out of town, but I have made a recorded lecture in three parts (see links below). The material is from section 1.3, but I do things in a slightly different order than the book. The lecture on Friday, January 26th will have the same format and cover the remaining parts of section 1.3 and section 1.4.
1. Stopping times and the strong Markov property: Notes. Video.
2. Recurrent and transient states: Notes. Video. (Look out for slips of the tongue: In one of the examples I first call some states "transient", before I correct myself and change to the correct "recurrent".)
3. A criterion for recurrence: Notes. Video.
Friday, January 26th: The physical lecture is cancelled as I'm out of town, but I have instead made a recorded lecture in three parts (see links below). The first two are from section 1.3, the last one from section 1.4.
1. Decomposition of the state space. Notes. Video.
2. Communication classes. Notes. Video. (This year's textbook doesn't seem to use the term "communication classes" – or just "classes" – but they appear in so many old exam problems that they are good to know about).
3. Stationary distributions. Notes. Video. (Excuse the quality - I have to correct myself several times, but I didn't have the energy to make a new version).
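Since stationary distributions come up repeatedly from here on, here is a minimal sketch (the matrix is made up, not an example from the book) of how one can find a stationary distribution numerically by solving \(\pi P = \pi\) together with the normalization \(\sum_i \pi_i = 1\):

```python
import numpy as np

# A small 3-state transition matrix (rows sum to 1); a made-up example.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

# Solve pi P = pi with sum(pi) = 1: the equations (P^T - I) pi = 0 are
# linearly dependent, so we replace one of them with the normalization.
A = P.T - np.eye(3)
A[-1] = np.ones(3)              # overwrite last row with normalization
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)                       # stationary distribution [0.25, 0.5, 0.25]
```

Replacing a row works because \(\pi P = \pi\) only determines \(\pi\) up to a constant factor; the normalization pins down the constant.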
Wednesday, January 31st: The physical lecture is cancelled due to a minor accident, and I have put out three videos instead. They are all from sections 1.4 and 1.5.
1. More about stationary distributions and especially the doubly stochastic situation. Notes. Video.
2. The detailed balance equation. Notes. Video.
3. Time reversed processes. Notes. Video.
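As a quick illustration of the detailed balance equation \(\pi_i\, p(i,j) = \pi_j\, p(j,i)\), here is a small numerical check on a made-up birth-and-death chain (the numbers are illustrative only, not an example from the book):

```python
import numpy as np

# Hypothetical birth-and-death chain on {0, 1, 2}; rows sum to 1.
P = np.array([[0.7, 0.3, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])

# Candidate found from pi_i p(i, i+1) = pi_{i+1} p(i+1, i):
# pi_1 = pi_0 * 0.3/0.3 = pi_0, and pi_2 = pi_1 * 0.3/0.6 = pi_1 / 2.
pi = np.array([0.4, 0.4, 0.2])

# Verify detailed balance pi_i P[i, j] == pi_j P[j, i] for all i, j.
balanced = all(np.isclose(pi[i] * P[i, j], pi[j] * P[j, i])
               for i in range(3) for j in range(3))
print(balanced)
```

Detailed balance is stronger than stationarity, but when it holds it gives the stationary distribution with almost no work, as here.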
Friday, February 2nd: I'm not able to give this lecture in person and have prepared two videos instead. The first video has a format I don't really like (the text is written out beforehand, and I only talk my way through it), but sitting for long periods is uncomfortable at the moment. The second video combines material from sections 1.6 and 1.8 (I'm aiming at proving the Convergence Theorem 1.19 before I do section 1.7 and the rest of section 1.6).
1. Kolmogorov's criterion for the detailed balance equation. Notes. Video. (In the book, this argument is hard to read due to many \(q\)'s appearing in random places where \(p\) should have been.)
Wednesday, February 7th: I'm on partial sick leave for the next two weeks, and will continue with the digital lectures through this period.
The main theme for February 7th will be convergence to stationary distributions, and the main event will be the proof of the Convergence Theorem 1.23 (also known as Theorem 1.19). The proof is in section 1.8 and uses the so-called coupling method, which is a much-used technique in more advanced Markov chain theory. The two other videos are on important theorems in section 1.7. As the three videos below cover more material than an ordinary lecture, I will make the videos for Friday shorter – and also less theoretical (the course is not at all as theoretical as this week's videos may lead you to assume!)
1. The Convergence Theorem. Notes. Video.
2. Return to states (Theorems 1.20 and 1.21). Notes. Video.
3. Existence of stationary measures (Theorem 1.24). Notes. Video.
Friday, February 9th: The first video is on the last remaining result from sections 1.6/1.7, the so-called Ergodic Theorem 1.22. The proof in the book is very sketchy, and I have tried to fill in a few missing steps. The other two videos illustrate how you can use the tools we now have at our disposal to analyse simple Markov chains.
1. The Ergodic Theorem. Notes. Video.
2. Example: An irreducible Markov chain. Notes. Video.
3. Example: A Markov chain with transient states. Notes. Video.
Wednesday, February 14th: As there has been a lot of theory lately, I thought it was time for an example that sums up much of what we have been doing so far, and at the same time introduces the themes we are going to look at next. So in the first video, you will find what is basically a solution of Problem 1 from the exam in 2005. If you want to try this problem on your own, be aware that some of the older exam problems use a different convention for transition matrices: What they call the transition matrix is the transpose of what we call the transition matrix, and hence you have to turn columns into rows before you start to solve the problem our way.
The second video is an alternative treatment of the last part of the first video, and it paves the way for the third video, which is again more theoretical.
1. Long example based on Problem 1 from Exam 2005. Notes. Video.
2. An alternative (more matrix oriented) approach to the last part of the previous video. Notes. Video.
3. The fundamental theorem of exit probabilities (Theorem 1.28). Notes. Video.
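Regarding the warning above about older exam problems: here is a short sketch of the two transition matrix conventions (the matrix is a made-up two-state example). In our convention rows sum to 1 and the stationary distribution is a left eigenvector; in the transposed convention columns sum to 1 and it is a right eigenvector.

```python
import numpy as np

# Our convention: P[i, j] = probability of going from i to j, rows sum to 1.
P_rows = np.array([[0.9, 0.1],
                   [0.2, 0.8]])

# Older exam convention: the transpose, so columns sum to 1.
P_cols = P_rows.T

print(P_rows.sum(axis=1))   # rows sum to 1
print(P_cols.sum(axis=0))   # columns sum to 1

# The same stationary distribution solves pi @ P_rows = pi in our
# convention and P_cols @ pi = pi in the old one.
pi = np.array([2/3, 1/3])
print(np.allclose(pi @ P_rows, pi), np.allclose(P_cols @ pi, pi))
```

So "turning columns into rows" really is just transposing before solving the problem our way.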
Friday, February 16th: To compensate for the recent long and dense lectures, here are just a few short examples of applications of exit probabilities (and how to compute them).
1. Endgame in tennis. Notes. Video.
2. An unfair game. Notes. Video.
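For those who want to experiment: exit probabilities like the ones in these examples can be found numerically by solving the linear system \(h(x) = p\,h(x+1) + q\,h(x-1)\). The sketch below uses made-up parameters for an unfair game (not necessarily the ones from the video) and checks the answer against the closed-form solution:

```python
import numpy as np

# Gambler's ruin on {0, ..., N}: win 1 with prob p, lose 1 with prob q = 1 - p.
# h(x) = probability of reaching N before 0 when starting from x.
p, q, N = 0.4, 0.6, 5

# Solve h(x) = p h(x+1) + q h(x-1) with boundary values h(0) = 0, h(N) = 1.
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0
A[N, N] = 1.0
b[N] = 1.0
for x in range(1, N):
    A[x, x] = 1.0
    A[x, x + 1] = -p
    A[x, x - 1] = -q
h = np.linalg.solve(A, b)

# Closed form for the unfair case: ((q/p)^x - 1) / ((q/p)^N - 1).
r = q / p
h_exact = (r ** np.arange(N + 1) - 1) / (r ** N - 1)
print(np.allclose(h, h_exact))
```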
Wednesday, February 21st: Today's topic is exit times from section 1.10. In addition to the material in the book, I have written a supplementary note on the expected time a Markov chain spends in transient states. As this has been a popular exam topic, it is useful to know something about it when one tries to solve old exam problems. As usual for Wednesdays, there is more material here than for a normal lecture, and I shall try to restrain myself on Friday (seems I didn't manage).
1. Introductory examples. Notes. Video.
2. Proof of Theorem 1.29. Notes. Video.
3. Example: Three heads in a row. Notes. Video.
4. Time spent in transient states (see the supplementary note). By accident, the notes for the video got split in two: Notes (part 1), Notes (part 2). Video.
Friday, February 23rd: The main topic of the day is from section 1.11 and deals with the distinction between two kinds of recurrent behavior, positive recurrence and null recurrence. Null recurrence is a borderline kind of recurrence, and it can only occur when the state space is infinite. The first video introduces the topic and proves some fundamental theorems; the second video illustrates the concepts by looking at what happens with random walks on the half-line. The third video introduces the notion of branching processes and proves a fundamental theorem about when such a process will die out in final time. This videos are the last ones from Chapter 1, and we shall continue with Chapter 2 next time.
1. Positive recurrence and null recurrence. Notes. Video.
2. Random walks on the half-line. Notes. Video.
3. Branching processes. Notes. Video.
Wednesday, February 28th: It's time to start Chapter 2. The first (long) video is on exponential random variables, the second, much shorter one, is on Poisson distributions.
1. Exponential distributions (section 2.1): Notes. Video. (Note that I make a mistake with the limits of integration at about 41:00, but correct myself soon after).
2. Poisson distributions (from the beginning of section 2.2 - most of you have probably seen this before): Notes. Video.
Friday, March 1st: The main theme today is Poisson processes. The first video defines Poisson processes and shows how they can be constructed from exponential random variables. The second video tries to explain how Poisson processes often appear in practice (including exams), and contains another description of Poisson processes that is sometimes referred to as "the infinitesimal description" (I forgot to mention this in the video). The third video proves a sometimes useful fact that does not seem to be in the textbook: If a continues distribution is memoryless, it is necessarily exponential.
1. Poisson processes: Definition and construction: Notes. Video.
2. P oisson processes in practice. Notes. Video.
3. Bonus track: The exponential distributions are the only memoryless distributions with a density. Notes. Video.
Wednesday, March 6th: The themes of the day are inhomogeneous Poisson processes (Poisson like processes where the rate varies with time) and compound Poisson processes (where a Poisson process triggers other random variables).
1. Definition of inhomogeneous Poisson processes plus an example. Notes. Video.
2. Proof of the infinitesimal description of Poisson processes. Notes. Video.
3. Compound Poisson processes. Notes. Video.
Friday, March 8th (note the date): Today's material is from section 2.4, and the videos are the last from Chapter 2. As we have been working hard recently, there is a little less material than usual (but I have included a bonus track).
1. Combining and thinning Poisson processes. Notes. Video.
2. Conditioning. Notes. Video.
3. Bonus track from last year: An example of thinning applied to the collection of football cards. Notes. Video.
Wednesday, March 13th (remember that this is a physical lecture, but I'll try to make a recording). We'll start Chapter 3, and my hope is to get through sections 3.1, 3.2.1, and 3.2.2.
Friday, March 15th: This is the last lecture before Easter (next week is midterms), and I'll do my best to finish Chapter 3. As we don't have much time, I may have to treat section 3.3 rather superficially.
Wednesday, April 3rd (remember that the mandatory assignment is due the next day): We'll start chapter 4 on continuous time Markov chains and try to cover section 4.1 and parts of 4.2. Continuous time Markov chains are amalgamations of discrete time Markov chains and Poisson processes. You can think of them either as Markov chains moving in continuous time or as Poisson processes that have more options than just jumping one upwards.
Friday, April 5th: I'll start by deriving the Kolmogorov forward equations (very much like the derivation of the backward equations). I'll probably skip the examples of DNA models (examples 4.9 and 4.10) and concentrate on branching processes (but only example 4.11 and not 4.12 which uses other methods). Perhaps we also have time to start section 4.3.
Wednesday, April 10th: We shall work on section 4.3 on limit behavior. This is actually easier for continuous time chains than for discrete time chains as the periodicity problem does not appear in the continuous setting.
Friday, April 12th: I'll first finish Example 4.15 on birth-and-death processes and then continue with section 4.4.
Wednesday, April 17th: The physical lecture is canceled and replaced by the four prerecorded videos below:
1. Exit times (section 4.4). This video recycles an example from the previous lecture ("Mesternes mester"): Notes. Video.
2. Queueing theory: Notation and survey (intro to Section 4.5): Notes. Video.
3. Variations on M/M/1-queues (from section 4.5.1): Notes. Video.
4. M/M/s-queues (from section 4.5.2): Notes. Video.
Friday, April 19th: I aim to cover section 4.5.3 and the beginning of 4.6.
Wednesday April 24th: Due to personal circumstances, I had to drop the physical lecture, and the digitaI lecture is quite short. It does, however, finish section 4.6 and the syllabus from the textbook.
Friday April 26th: This will be a digital lecture (videos) on the note on Brownian motion, which is the last part of the syllabus. The lecture is quite short and consists of two videos:
1. Random walks with vanishing increments: Notes. Video.
2. Brownian motion: Notes. Video.
Friday, May 3rd (there are no classes on the public holiday May 1st). Unfortunately, I have to cancel also this physical lecture, and replace it by two videos covering the last two sections of the note on Brownian motion. This is the last lecture one new material, the rest will be review.
1. It?'s Formula: Notes. Video.
2. Geometric Brownian motion: Notes. Video.
Wednesday, May 8th: The plan is to review as much as possible of Chapter 1 and do a few examples along the way. It will be a physical lecture provided Vy manages to get me to Oslo on time. (PS. As Vy did not manage to get me to Oslo on time, the lecture is being replaced by videos, see the "Reports from lectures" page).
Friday, May 10th: We shall continue reviewing Markov chains by looking at conditions for convergence to the stationary distribution and techniques for finding exit distributions and exit times. I shall also start the review of Chapter 2 by looking at exponential distributions and (if time allows) Poisson processes.
Wednesday, May 15th: I'll continue the review and hope to cover the rest of Chapter 2 and Section 3.1 (I want to collect everything that has to do with queueing theory in one batch at the end). If time allows, I will solve Problem 3 from Exam 2017 and Problem 3, from Exam 2009.
Wednesday, May 22nd: The plan is first to review sections 4.1-4.4 on continuous time Markov chains, and then try to cover everything on queueing theory (sections 3.2.1, 3.2.3, 4.5, and 4.6) in one sweep on Friday. If time permits, I shall try to do problem 2 from exam 2013.
Friday, May 24th. We finish the lectures with a review of queueing theory and Brownian motion. If there's time at the end, I'll do problem 2 from the exam 2006.