Lecture Notes on Stochastic Processes (Advanced Probability II)
I've started putting the notes for my lectures on stochastic processes (36-754) online at the course homepage.
- Contents
- Table of contents, which gives a running list of definitions, lemmas,
theorems, etc. This will be updated with each new lecture.
- Lecture 1 (16 January)
- Definition of stochastic processes, examples, random functions
- Lecture 2 (18 January)
- Finite-dimensional distributions (FDDs) of a process, consistency of a
family of FDDs, theorems of Daniell and Kolmogorov on extending consistent
families to processes
- Lecture 3 (20 January)
- Probability kernels and regular conditional probabilities, extending finite-dimensional distributions defined recursively through kernels to processes (the Ionescu Tulcea theorem).
- Homework Assignment 1 (due 27 January)
- Exercise 1.1; Exercise 3.1. Solutions.
- Lecture 4 (23 January)
- One-parameter processes and their representation by shift-operator semi-groups.
- Lecture 5 (25 January)
- Three kinds of stationarity, the relationship between strong stationarity and measure-preserving transformations (especially shifts).
- Lecture 6 (27 January)
- Reminders about filtrations and optional times, definitions of various
sorts of waiting times, and Kac's Recurrence Theorem.
- Homework Assignment 2 (due 6 February)
- Exercise 5.3; Exercise 6.1; Exercise 6.2. Solutions.
- Lecture 7 (30 January)
- Kinds of continuity, versions of stochastic processes, difficulties of
continuity, the notion of a separable random function.
- Lecture 8 (1 February)
- Existence of separable modifications of stochastic processes, conditions
for the existence of measurable, cadlag and continuous modifications.
- Lecture 9 (3 February)
- Markov processes and their transition-probability semi-groups.
- Lecture 10 (6 February)
- Markov processes as transformed IID noise; Markov processes as operator
semi-groups on function spaces.
- Lecture 11 (8 February)
- Examples of Markov processes (Wiener process and the logistic map).
Overlaps with solutions to
the second homework assignment.
- 10 February
- Material from section 2 of lecture 10, plus an excursion into sofic
processes.
- Lecture 12 (13 February)
- Generators of homogeneous Markov processes, analogy with exponential
functions.
- Lecture 13 (15 February)
- The strong Markov property and the martingale problem.
- Homework Assignment 3 (due 20 February)
- Exercises 10.1 and 10.2
- Lecture 14 (17, 20 February)
- Feller processes, and an example of a Markov process which isn't
strongly Markovian.
- Lecture 15 (24 February, 1 March)
- Convergence in distribution of cadlag processes, convergence of Feller
processes, approximation of differential equations by Markov processes.
- Lecture 16 (3 March)
- Convergence of random walks to Wiener processes. (A small simulation sketch illustrating this convergence appears after the list below.)
- Homework Assignment 4 (due 13 March)
- Exercises 16.1, 16.2 and 16.4.
- Lecture 17 (6 March)
- Diffusions, Wiener measure, non-differentiability of almost all continuous curves.
- Lecture 18 (8 March)
- Stochastic integrals: heuristic approach via Euler's method, rigorous approach.
- Lecture 19 (20, 21, 22 and 24 March)
- Examples of stochastic integrals. Ito's formula for change of variables. Stochastic differential equations, existence and uniqueness of solutions. Physical Brownian motion: the Langevin equation, Ornstein-Uhlenbeck processes. (An Euler-style discretization sketch for the Langevin equation appears after the list below.)
- Lecture 20 (27 March)
- More on SDEs: diffusions, forward (Fokker-Planck) and backward equations.
White noise.
- Lecture 21 (29, 31 March)
- Spectral analysis; how the white noise lost its color. Mean-square
ergodicity.
- Lecture 22 (3 April)
- Small-noise limits for SDEs: convergence in probability to ODEs, and our
first large-deviations calculations.
- Lecture 23 (5 April)
- Introduction to ergodic properties and invariance.
- Lecture 24 (7 April)
- The almost-sure (Birkhoff) ergodic theorem.
- Lecture 25 (10 April)
- Metric transitivity. Examples of ergodic processes. Preliminaries on
ergodic decompositions.
- Lecture 26 (12 April)
- Ergodic decompositions. Ergodic components as minimal sufficient
statistics.
- Lecture 27 (14 April)
- Mixing. Weak convergence of distribution and decay of correlations.
Central limit theorem for strongly mixing sequences.
- Lecture 28 (17 April)
- Introduction to information theory. Relations between Shannon entropy,
relative entropy/Kullback-Leibler divergence, expected likelihood and Fisher
information.
- Lecture 29 (24 April)
- Entropy rate. The asymptotic equipartition property, a.k.a. the
Shannon-McMillan-Breiman theorem, a.k.a. the entropy ergodic theorem.
Asymptotic likelihoods.
- Lecture 30 (26 April)
- General theory of large deviations. Large deviations principles and rate
functions; Varadhan's Lemma. Breeding LDPs: contraction principle,
"exponential tilting", Bryc's Theorem, projective limits.
- Lecture 31 (28 April)
- IID large deviations: cumulant generating functions, the Legendre transform,
the return of relative entropy. Cramer's theorem on large deviations of
empirical means. Sanov's theorem on large deviations of empirical measures.
Process-level large deviations.
- Lecture 32 (1 May)
- Large deviations for Markov sequences through exponential-family
densities.
- Lecture 33 (2 May)
- Large deviations in hypothesis testing and parameter estimation.
- Lecture 34 (3 May)
- Large deviations for weakly-dependent sequences (Gartner-Ellis
theorem).
- Lecture 35 (5 May)
- Large deviations of stochastic differential equations in the small-noise
limit (Freidlin-Wentzell theory).
- References
- The bibliography, currently confined to works explicitly cited.
Everything to date
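As a quick illustration of the random-walk-to-Wiener-process convergence covered in Lecture 16, here is a minimal Python sketch (my own, not code from the notes) that rescales partial sums of IID ±1 steps in the usual Donsker fashion and checks that the path variance at time t is close to t, as it is for the Wiener process:

```python
import numpy as np

rng = np.random.default_rng(0)

def rescaled_walk(n, rng):
    """Path of S_k / sqrt(n), k = 0..n, for IID +/-1 steps (Donsker scaling)."""
    steps = rng.choice([-1.0, 1.0], size=n)
    return np.concatenate(([0.0], np.cumsum(steps))) / np.sqrt(n)

n = 4_000
paths = np.array([rescaled_walk(n, rng) for _ in range(1_000)])

# For a Wiener process, Var W(t) = t; the rescaled walks should roughly match.
for t in (0.25, 0.5, 1.0):
    print(f"t = {t}: empirical variance = {paths[:, int(t * n)].var():.3f}")
```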
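And as a companion sketch for the Euler-method heuristic behind stochastic integrals and the Langevin/Ornstein-Uhlenbeck material of Lectures 18-19 (again just an illustration under standard assumptions, not the notes' own code), an Euler-Maruyama discretization of dX = -gamma * X dt + sigma dW:

```python
import numpy as np

rng = np.random.default_rng(1)

def euler_maruyama_ou(x0, gamma, sigma, dt, n_steps, rng):
    """Euler-Maruyama discretization of the Langevin SDE dX = -gamma*X dt + sigma dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment over a step of length dt
        x[k + 1] = x[k] - gamma * x[k] * dt + sigma * dW
    return x

path = euler_maruyama_ou(x0=1.0, gamma=1.0, sigma=0.5, dt=1e-3, n_steps=100_000, rng=rng)

# The OU process has stationary variance sigma^2 / (2 * gamma) = 0.125;
# the tail of a long simulated path should be close to that.
print("empirical variance of the path's second half:", path[50_000:].var())
```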
In the staggeringly unlikely event that anyone wants to keep track of the course by RSS, this should do the trick.
Enigmas of Chance; Corrupting the Young
Posted at January 18, 2006 12:00 | permanent link