Messages
Key words for Week 2: stories (Odin, antidepressant); simulation for gamma posteriors and Dirichlet posteriors (and mixtures thereof); approximate normality; prior elicitation.
We have been through (more or less): Ch7 exercises 1, 2, 3, 4, 5, 9, 12, 13, 15 (note the mixture prior), 16; then Stories #41 (Odin's children, Bayesian part), #13 (antidepressant, Bayesian part).
I've placed more Nils-R-code on the site; check e.g. com351a, with mixture prior things for 7.15 (see also 7.23).
For Week 3 onwards: Story #54 (Star Trek, Bayesian part), then exercises 17, 18, 19 (MCMC). Extra: Suppose people in a city have IQs following the N(100, \sigma^2) distribution, with \sigma = 15. You visit a special school, where pupils' IQs follow N(\xi, 0.5 \sigma^2), with the normal above as the prior for \xi. How do you change your view about the pupils of this school when the first pupil you test has IQ = 130?
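The extra exercise is a conjugate normal-normal update, which you can check numerically in a few lines. A minimal sketch (here in Python, though the course scripts are in R), using the standard precision-weighting formula:

```python
# Conjugate normal-normal update for the special-school exercise:
# prior xi ~ N(100, sigma^2), one observation x ~ N(xi, 0.5 * sigma^2).
sigma2 = 15.0 ** 2                        # prior variance, sigma = 15
prior_mean, prior_prec = 100.0, 1.0 / sigma2
obs, obs_prec = 130.0, 1.0 / (0.5 * sigma2)

# Posterior precision is the sum of precisions; posterior mean is the
# precision-weighted average of prior mean and observation.
post_prec = prior_prec + obs_prec
post_mean = (prior_prec * prior_mean + obs_prec * obs) / post_prec
post_sd = post_prec ** -0.5

print(post_mean, post_sd)                 # 120.0 and about 8.66
```

Since the observation has half the prior variance, it gets twice the weight: the posterior mean is (100 + 2 · 130)/3 = 120, with standard deviation \sigma/\sqrt{3} ≈ 8.66.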
Next story (after this): Story #31 (deer in forest, Ba...
Note: the *curriculum for the course* will essentially be the list of exercises and stories we work through (plus some bits from Chs 1 and 5), in the course of the course (!).
We have been through (more or less): Ch7 exercises 1, 2, 3, 4, 5, 9, 12, 13, and Story #41 (Odin's children, Bayesian part).
I've placed more Nils-R-code on the site; check com351, with mixture prior things for 7.15.
For Wed Aug 27 and next week: first Story #13 (antidepressant, Bayesian part); then Ch 7 exercises 15, 16; then Story #54 (Star Trek, Bayesian part); then exercises 17, 18 (MCMC).
We'll do Exercise 7.15 during Wed Aug 27. There's a little mistake in the text (which will duly be repaired): I first write outcomes (4, 6, 6, 7, 8, 9), and later in the same exercise (2, 5, 3, 7, 5, 8). This is of minor importance, for this toy example; the point is to build and run Dirichlet-simulation based algorithms, then read off the answers. It should be very easy to re-run your code, for any new dataset (and with the same priors). But let's stick to the (4, 6, 6, 7, 8, 9).
Point (b), with the delta method for the ML estimate of \delta, requires a bit of machinery from Ch 5 (or Ch 2), and I will spend a couple of minutes on that as well -- but the main Bayesian task, at this stage, is to understand the posterior, run simulations, and read off answers.
The recipe via independent gammas of point (a) is perfectly fine & insightful. You may however also do library("MCMCpack") and then its "rdirichlet", to save you from a couple...
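The independent-gammas recipe can be sketched as follows (here in Python, though the course scripts are in R; I treat the (4, 6, 6, 7, 8, 9) as counts in six cells and use a flat Dirichlet(1, ..., 1) prior for illustration -- check the exercise text for the prior actually used):

```python
import numpy as np

rng = np.random.default_rng(1)
counts = np.array([4, 6, 6, 7, 8, 9])   # the (4, 6, 6, 7, 8, 9) data as cell counts
a0 = np.ones(6)                         # flat Dirichlet(1, ..., 1) prior (an assumption here)
post = a0 + counts                      # Dirichlet posterior parameters

# Dirichlet via independent gammas: draw G_j ~ Gamma(a_j, 1) and normalise;
# each row of p is then one draw of (p_1, ..., p_6) from the posterior.
sims = 10_000
g = rng.gamma(post, size=(sims, 6))
p = g / g.sum(axis=1, keepdims=True)

print(p.mean(axis=0))                   # close to post / post.sum()
```

From the matrix p you can then read off posterior means, credible intervals, or the posterior of any function of (p_1, ..., p_6), which is the point of the exercise.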
I was asked about the "theory vs. applied-ness" aspect of our course. This leads me to dig up the following, from the preface of the Tore Schweder & Nils Lid Hjort CLP book, "Confidence, Likelihood, Probability" (Cambridge, 2016). I think it applies here, to the stk4021 course, and to several others we teach at our Statistics and Data Science section. So I post it here (and yes, read more Goethe).
Epilogue
To round off our preface we feel an apt quotation for our project is the classic "Grau, teurer Freund, ist alle Theorie, und grün des Lebens goldner Baum" ("All theory is grey, dear friend, and green the golden tree of life"). This is indeed an often-quoted and sweet-sounding sentiment -- but perhaps by people not quite realising that this was Mephistopheles' diabolic advice, in reality not expressing Goethe's opinion. The spirit underlying our book is that green and golden applications rest on solid theory, and th...
We used time this first week to think through the basic organisational elements of the course and the teaching material -- you should all get on board Hjort-Stoltenberg's PartOne and PartTwo (and find a printer).
I have uploaded R code scripts, related to (i) Odin's children, (ii) the streetcars of San Francisco, (iii) Poisson data with gamma priors.
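For item (iii), the conjugate gamma update is the whole story. A minimal sketch (here in Python, though the uploaded scripts are in R; the data and the Gamma(a, b) prior parameters below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.array([3, 5, 2, 4, 6])    # hypothetical Poisson counts
a, b = 2.0, 1.0                  # hypothetical Gamma(a, b) prior, rate parametrisation

# Conjugacy: with y_1, ..., y_n i.i.d. Poisson(theta) and theta ~ Gamma(a, b),
# the posterior is theta | data ~ Gamma(a + sum(y), b + n).
a_post, b_post = a + y.sum(), b + len(y)
theta = rng.gamma(a_post, 1.0 / b_post, size=10_000)   # numpy takes scale = 1/rate

print(theta.mean())              # close to a_post / b_post
```

The simulated theta draws can then be pushed through any function of interest, the same read-off-answers-from-simulations routine we use for the Dirichlet exercises.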
How to enjoy the course & get the most out of life (and with a good mark at the end): stay tuned, work with Exercises and Stories, try things, check not only the math but also the interpretations and implications of each exercise, play with code, sample & have a look through various Stories (without necessarily doing them, yet), and pay attention in class ("følg med i timen").
This Week 1 (Aug 18 ff): we've essentially been through exercises 1, 2, 3, 4, 5, and the Bayesian part of Story #41.
For Week 2 (Aug 25 ff): we start with the dramatic Story #13, where you should work through the Bayesian parts. Th...
PartOne and PartTwo of The Book, "Statistical Inference: 600 Exercises, 100 Stories" by Nils Lid Hjort and Emil Stoltenberg, are now uploaded at the course website. Find a printer.
PartOne has Chs 1, 3, 5, 7, 8, 13, 14, where the curriculum will mostly focus on 7 and 14 -- but where we shall need material from Ch 5 (lots of likelihoods), Ch 1 (lots of models), a bit from Ch 13 (Markov chains, for MCMC computations). PartTwo has the current version of the 100 Stories, where we shall delve into perhaps a dozen of these as the course proceeds.
Teaching, 2 + 2 hours per week, will be a mix of lecture style going through theory, and doing, discussing, learning from the many exercises and stories.
Exercises for Weeks 1 (Mon 18 Aug plus) and 2 (Mon 25 Aug plus): we dive right into Ch 7, and aim at going through exercises 1, 2, 3, 4, 5, 7, 9. Also do the Bayesian part of Story #41.
I do indeed judge an exclamation mark to be in order here (I otherwise use it sparingly). I hope for & predict a good & lively course on Bayesian methodology & applications, starting Mon August 18.
Teaching is Mon 12:15 - 14:00 (Room 107) and Wed 12:15 - 14:00 (Room 108). Exercise sessions ("regneøvelser") will to some extent be integrated with the lectures, and the overall balance will be about 3 hours of lectures plus 1 hour of exercises each week, but for some weeks closer to 2 + 2.
The teaching material will be decided on more accurately at the start of August, but will be taken from
(i) N.L. Hjort and E.Aa. Stoltenberg (2025): "Statistical Inference: 600 Exercises, 100 Stories", some of the chapters;
(ii) earlier Course Notes, from previous versions of the course, by N.L. Hjort (2021) and by D. Christensen (2024).
The (ii) documents are available below at the course site, and the (i) document will be uploaded ...