SDS 391-3: Linear Models


Alessandro (Ale) Rinaldo - Fall, 2025

SDS 387 is an intermediate graduate course in theoretical statistics for PhD students, covering three separate but interrelated topics: (i) stochastic convergence, (ii) selected topics in learning theory, and (iii) linear regression modeling. The material and style of the course will skew towards the mathematical and theoretical aspects of common models and methods, in order to provide a foundation for those who wish to pursue research in statistical methods and theory. This is not an applied regression analysis course.


  • Lectures: Tuesday and Thursday, 9:00am - 10:30am, FAC 101B

  • Ale's Office hours: by appointment

  • Homework submission and solutions: use Canvas


Homework 1 — due date: see Canvas


Tuesday, August 26

Class canceled (Ale was sick)

Thursday, August 28

Lecture 1: Introduction and course logistics. Deterministic convergence and convergence with probability one. Limsup and liminf of sequences of events.

Tuesday, September 2

Nicola Bariletto's candidacy talk, part of which covered the content of his latest paper with Stephen Walker and Bernardo Flores.

Thursday, September 4

Lecture 2: Limsup and liminf of events. The second Borel-Cantelli lemma. Convergence in probability and comparison with convergence with probability one.
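The second Borel-Cantelli lemma can be stated compactly as follows (our summary, not the lecture notes):

```latex
% Second Borel-Cantelli lemma: for a sequence of *independent* events,
% divergence of the probability series forces infinitely many occurrences.
\[
  A_1, A_2, \dots \text{ independent and } \sum_{n=1}^{\infty} \mathbb{P}(A_n) = \infty
  \quad \Longrightarrow \quad
  \mathbb{P}\Big(\limsup_{n \to \infty} A_n\Big) = 1,
\]
% where \limsup_n A_n = \bigcap_{m \ge 1} \bigcup_{n \ge m} A_n is the event
% that infinitely many of the A_n occur.
```

Note the contrast with the first lemma, which needs no independence: if the series converges, only finitely many events occur almost surely.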

Tuesday, September 9

Lecture 3: WLLN and SLLN. Glivenko-Cantelli Theorem and DKW inequality. For a proof of the Glivenko-Cantelli Theorem, see Theorem 19.1 in van der Vaart's book.
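The Glivenko-Cantelli phenomenon is easy to see numerically. The sketch below (our own illustration, not part of the course materials; `ecdf_sup_distance` is a name we made up) simulates the sup distance between the empirical and true CDF of a standard normal sample and compares it with the high-probability bound implied by the DKW inequality, P(sup_x |F_n(x) - F(x)| > eps) <= 2 exp(-2 n eps^2).

```python
import numpy as np
from math import erf, sqrt

def ecdf_sup_distance(sample, true_cdf, grid):
    """Sup distance between the empirical CDF of `sample` and `true_cdf`,
    approximated on a fixed grid of evaluation points."""
    sample = np.sort(sample)
    # Empirical CDF on the grid: fraction of observations <= each grid point.
    ecdf = np.searchsorted(sample, grid, side="right") / len(sample)
    return np.max(np.abs(ecdf - true_cdf(grid)))

def normal_cdf(x):
    # Standard normal CDF via the error function (vectorized over the grid).
    return 0.5 * (1 + np.vectorize(erf)(x / sqrt(2)))

rng = np.random.default_rng(0)
grid = np.linspace(-4, 4, 2001)

for n in [100, 1000, 10000]:
    d = ecdf_sup_distance(rng.standard_normal(n), normal_cdf, grid)
    # Inverting DKW: with probability at least 0.99 the sup distance
    # is below sqrt(log(2 / 0.01) / (2 n)).
    bound = sqrt(np.log(2 / 0.01) / (2 * n))
    print(n, round(d, 4), round(bound, 4))
```

The observed sup distance shrinks at the dimension-free 1/sqrt(n) rate that DKW guarantees, which is the quantitative content behind the Glivenko-Cantelli Theorem.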

Thursday, September 11

Lecture 4: Proof of the Glivenko-Cantelli Theorem and the DKW inequality. Lp spaces and convergence.

Tuesday, September 16

Lecture 5: Lp convergence; Minkowski, Hölder, and Jensen inequalities. Relations between Lp convergence and convergence in probability and with probability one. Convergence in distribution for univariate random variables. C.d.f.'s in multivariate settings.
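Since Hölder and Minkowski hold for any measure, they hold in particular for the empirical measure of a sample, so a quick numerical sanity check is possible. This is our own illustration (the helper `lp` is a name we chose), treating E|X|^p as a finite sample average.

```python
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.standard_normal(10_000), rng.standard_normal(10_000)
p, q = 3.0, 1.5  # conjugate exponents: 1/p + 1/q = 1

def lp(v, r):
    """Lp norm of v under the empirical measure: (E|V|^r)^(1/r)."""
    return np.mean(np.abs(v) ** r) ** (1 / r)

# Hoelder: E|XY| <= ||X||_p * ||Y||_q.
assert np.mean(np.abs(x * y)) <= lp(x, p) * lp(y, q)
# Minkowski (triangle inequality in Lp): ||X + Y||_p <= ||X||_p + ||Y||_p.
assert lp(x + y, p) <= lp(x, p) + lp(y, p)
print("both inequalities hold on this sample")
```

Because both inequalities are deterministic facts about any probability measure, the assertions pass for every sample, not just this seed.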

Thursday, September 18

Lecture 6: Convergence in distribution. Relation with other forms of convergence. Marginal vs joint convergence in distribution. For the proof of the claim that convergence in probability implies convergence in distribution, see page 330 of Billingsley's book Probability and Measure.

Tuesday, September 23

Lecture 7: Uniqueness of stochastic limits. Portmanteau Theorem (see, e.g., Chapter 2 in van der Vaart's Asymptotic Statistics book).

Thursday, September 25

Lecture 8: Characteristic functions and the Continuity Theorem; Cramér-Wold device. I suggest reading Chapter 3 of Ferguson's book (in particular, Theorem 3(e) has a neat proof). For a reference on multivariate Taylor series expansions, see, e.g., Advanced Calculus by G. Folland.

Tuesday, September 30

Lecture 9: Slutsky's theorem, more on convergence in distribution. Big-oh and little-oh notation. Prohorov's theorem about tightness of stochastic sequences.
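A standard illustration of Slutsky's theorem is the t-statistic: sqrt(n)(X̄ - μ)/S converges in distribution to N(0, 1) because the numerator obeys the CLT while S converges in probability to σ. The simulation below is our own sketch (seed and sample sizes arbitrary), not course code.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
n, reps = 500, 20_000
# Uniform(0, 1) data: mean 1/2, standard deviation sqrt(1/12).
x = rng.uniform(0, 1, size=(reps, n))
# t-statistic for each replication; ddof=1 gives the sample sd S.
t = sqrt(n) * (x.mean(axis=1) - 0.5) / x.std(axis=1, ddof=1)
# By Slutsky, P(T <= 1) should be close to Phi(1) ~ 0.8413.
p_emp = np.mean(t <= 1.0)
phi_1 = 0.5 * (1 + erf(1 / sqrt(2)))
print(round(float(p_emp), 3), round(phi_1, 3))
```

Replacing σ by the random (but consistent) estimate S is exactly the step that Slutsky's theorem licenses; the CLT alone does not cover it.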

Thursday, October 2

Lecture 10: CLT for i.i.d. variables using characteristic functions. Triangular arrays, Lindeberg-Feller and Lyapunov conditions.

Tuesday, October 7

Lecture 11: Lindeberg-Feller, examples and multivariate extension. Berry-Esseen bounds. A good reference for this lecture and the last is the book Sums of Independent Random Variables by V.V. Petrov, Springer, 1975. Another classic and good reference is Approximation Theorems of Mathematical Statistics by Serfling, Wiley, 1980.
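The Berry-Esseen bound can be checked numerically. The rough sketch below (our own, not from the lectures; `max_cdf_gap` is a name we invented) compares the Monte Carlo estimate of sup_x |P(S_n* <= x) - Φ(x)| for standardized sums of Exponential(1) variables against C·ρ/(σ³√n), using C = 0.4748, a known admissible constant (the optimal constant is not known exactly). For Exponential(1), σ = 1 and ρ = E|X - 1|³ = 12/e - 2 ≈ 2.415.

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def max_cdf_gap(n, reps=100_000, seed=0):
    """Monte Carlo estimate of the Kolmogorov distance between the
    standardized sum of n Exponential(1) variables and N(0, 1)."""
    rng = np.random.default_rng(seed)
    sums = rng.exponential(1.0, size=(reps, n)).sum(axis=1)
    std_sums = (sums - n) / sqrt(n)  # Exp(1) sums have mean n, variance n
    grid = np.linspace(-4, 4, 801)
    emp = np.searchsorted(np.sort(std_sums), grid, side="right") / reps
    phi = np.array([normal_cdf(v) for v in grid])
    return np.max(np.abs(emp - phi))

rho = 12 / np.e - 2  # third absolute central moment of Exp(1)
for n in [4, 16, 64]:
    print(n, round(max_cdf_gap(n), 4), round(0.4748 * rho / sqrt(n), 4))
```

The observed gap sits well inside the bound and shrinks at the 1/sqrt(n) rate, which is the rate Berry-Esseen asserts is sharp in general.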