Alessandro (Ale) Rinaldo - Fall, 2025
SDS 387 is an intermediate graduate course in theoretical statistics for PhD students, covering three separate but interrelated topics: (i) stochastic convergence, (ii) selected topics in learning theory, and (iii) linear regression modeling. The material and style of the course will skew toward the mathematical and theoretical aspects of common models and methods, in order to provide a foundation for those who wish to pursue research in statistical methods and theory. This is not an applied regression analysis course.
Syllabus
Lectures: Tuesday and Thursday, 9:00am - 10:30am, FAC 101B
TA: Hien Dang, hiendang@utexas.edu - Office hours: Tuesday, 2:00–3:00 pm in WEL 5.228H
Ale's Office hours: by appointment
Homework submission and solutions: use Canvas
Homework | Due date
Homework 1 |
Class canceled (Ale was sick)
Lecture 1: Introduction and course logistics. Deterministic convergence and convergence with probability one. Limsup and liminf of sequences of events.
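For quick reference, the two set-theoretic definitions from this lecture, sketched in standard notation (which may differ from the notation used in class):

```latex
% Limsup and liminf of a sequence of events (A_n), n >= 1:
\[
\limsup_{n\to\infty} A_n \;=\; \bigcap_{n=1}^{\infty} \bigcup_{m=n}^{\infty} A_m
\;=\; \{\, A_n \text{ occurs for infinitely many } n \,\},
\]
\[
\liminf_{n\to\infty} A_n \;=\; \bigcup_{n=1}^{\infty} \bigcap_{m=n}^{\infty} A_m
\;=\; \{\, A_n \text{ occurs for all but finitely many } n \,\},
\]
% so that \liminf_n A_n \subseteq \limsup_n A_n always holds.
```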
Nicola Bariletto's candidacy talk, part of which covered the content of his latest paper with Stephen Walker and Bernardo Flores.
Lecture 2: Limsup and liminf of events. The second Borel-Cantelli lemma. Convergence in probability and comparison with convergence with probability one.
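For reference, a sketch of the second Borel-Cantelli lemma in the notation above (the standard statement, not necessarily the exact form given in lecture):

```latex
% Second Borel-Cantelli lemma: if the events (A_n) are independent, then
\[
\sum_{n=1}^{\infty} \mathbb{P}(A_n) = \infty
\quad\Longrightarrow\quad
\mathbb{P}\Bigl(\limsup_{n\to\infty} A_n\Bigr) = 1,
\]
% i.e., with probability one the events A_n occur infinitely often.
% (By contrast, the first lemma gives P(limsup_n A_n) = 0 whenever
% \sum_n P(A_n) < \infty, with no independence assumption needed.)
```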
References:
See Ferguson's book, chapters 1, 2 and 4.
A nice webpage summarizing the different modes of stochastic convergence, with good examples illustrating their differences; one classical such example is sketched below.
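The "typewriter" (sliding-interval) sequence is a standard construction separating convergence in probability from convergence with probability one; it may or may not be among the webpage's examples:

```latex
% On the probability space ([0,1], Borel sets, Lebesgue measure), write each
% n >= 1 as n = 2^k + j with integers k >= 0 and 0 <= j < 2^k, and set
\[
X_n \;=\; \mathbf{1}_{[\, j 2^{-k},\, (j+1) 2^{-k} \,]}, \qquad n = 2^k + j .
\]
% For any 0 < \varepsilon < 1, P(|X_n| > \varepsilon) = 2^{-k} -> 0, so
% X_n -> 0 in probability. Yet every \omega \in [0,1] lies in infinitely
% many of the intervals, so X_n(\omega) = 1 infinitely often and X_n does
% not converge to 0 with probability one.
```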
Lecture 3: WLLN and SLLN. The Glivenko-Cantelli Theorem and the DKW inequality. For a proof of the Glivenko-Cantelli Theorem, see Theorem 19.1 in van der Vaart's book.
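For reference, sketches of the two statements (the DKW bound below is written with Massart's tight constant 2, the form usually quoted):

```latex
% For X_1, X_2, ... i.i.d. with c.d.f. F, let the empirical c.d.f. be
% \hat F_n(t) = n^{-1} \sum_{i=1}^{n} \mathbf{1}\{X_i \le t\}.
%
% Glivenko-Cantelli Theorem:
\[
\sup_{t \in \mathbb{R}} \bigl|\hat F_n(t) - F(t)\bigr| \;\longrightarrow\; 0
\quad \text{with probability one.}
\]
% DKW inequality (Massart's version): for every \varepsilon > 0,
\[
\mathbb{P}\Bigl(\sup_{t \in \mathbb{R}} \bigl|\hat F_n(t) - F(t)\bigr| > \varepsilon\Bigr)
\;\le\; 2\, e^{-2 n \varepsilon^{2}} .
\]
```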
Lecture 4: Proofs of the Glivenko-Cantelli Theorem and of the DKW inequality. Lp spaces and Lp convergence.
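A minimal simulation sketch (my own illustration, not part of the course materials; assumes NumPy is available) of the n^{-1/2} scaling of the sup-distance that the DKW bound predicts, using Uniform(0,1) samples so that F(t) = t on [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(0)

def sup_ecdf_distance(n: int) -> float:
    """Compute sup_t |F_n(t) - F(t)| for one sample of size n from U(0,1).

    For the uniform c.d.f., the supremum is attained at an order statistic,
    so it suffices to compare the sorted draws with the grids i/n and (i-1)/n.
    """
    x = np.sort(rng.uniform(size=n))
    i = np.arange(1, n + 1)
    return max(np.max(i / n - x), np.max(x - (i - 1) / n))

for n in [100, 1_000, 10_000, 100_000]:
    d = sup_ecdf_distance(n)
    # sqrt(n) * sup-distance should stay roughly of constant order,
    # consistent with the DKW tail bound 2 exp(-2 n eps^2).
    print(f"n = {n:>6d}   sup-distance = {d:.4f}   sqrt(n)*dist = {np.sqrt(n) * d:.3f}")
```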
Lecture 5: Lp convergence; Minkowski's, Hölder's, and Jensen's inequalities. Relations between Lp convergence, convergence in probability, and convergence with probability one. Convergence in distribution for univariate random variables. C.d.f.'s in multivariate settings.
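For reference, sketches of the three inequalities and of the implication from Lp convergence to convergence in probability, which follows from Markov's inequality applied to |X_n - X|^p:

```latex
% Minkowski (p >= 1):              \|X + Y\|_p \le \|X\|_p + \|Y\|_p
% Hölder (1/p + 1/q = 1, p,q > 1): \mathbb{E}|XY| \le \|X\|_p \, \|Y\|_q
% Jensen (\varphi convex):         \varphi(\mathbb{E} X) \le \mathbb{E}\,\varphi(X)
%
% If X_n -> X in L_p for some p >= 1, then for every \varepsilon > 0,
\[
\mathbb{P}\bigl(|X_n - X| > \varepsilon\bigr)
\;\le\; \frac{\mathbb{E}\,|X_n - X|^{p}}{\varepsilon^{p}}
\;\longrightarrow\; 0,
\]
% so L_p convergence implies convergence in probability.
```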