April 26, 2007

Lecture Notes on Stochastic Processes (Advanced Probability II), Spring 2007

Since the first lecture of my class coincided with the first non-trivial snowfall of the winter, talk of the "spring" semester seems like a cruel joke, but there you go. One of my New Year's resolutions was to leave the notes as nearly alone as possible, so they will largely follow last year's, but with typo corrections, a few improvements, more examples, and some pictures (not, I dare say, enough).

Update, 26 April 2007: The link at the end of this list to the complete set of notes now goes to the complete notes, including chapters yet to be covered by the lectures.

This page will be updated with new lecture notes as the semester goes on. If you want an RSS feed, this should do it.

Chapter 24: Birkhoff's Ergodic Theorem (4 May)
The almost-sure or individual ergodic theorem of Birkhoff: if the system is asymptotically mean stationary, then, with probability 1, the time average of every well-behaved observable converges to its expectation.
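(A toy illustration of my own, not part of the notes: a stationary AR(1) chain, with parameters I made up, is ergodic, so its time averages should do what Birkhoff says they should.)

    # Illustrative toy: the time average of the observable f(x) = x^2 along a
    # stationary AR(1) chain should converge, almost surely, to its stationary
    # expectation sigma^2 / (1 - phi^2).
    import numpy as np

    rng = np.random.default_rng(1)
    phi, sigma, n = 0.8, 1.0, 200_000

    x = np.empty(n)
    x[0] = rng.normal(scale=sigma / np.sqrt(1 - phi ** 2))  # start in the stationary law
    for t in range(1, n):
        x[t] = phi * x[t - 1] + sigma * rng.normal()

    time_avg = np.cumsum(x ** 2) / np.arange(1, n + 1)
    print("time average of X_t^2:", time_avg[-1])
    print("stationary E[X^2]    :", sigma ** 2 / (1 - phi ** 2))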
Chapter 23: Ergodic Properties (26 April and 1 May)
Ideological remarks on ergodic theory. Dynamical systems and their invariants; connections to Markov processes. Ergodic properties of functions and ergodic limits. Asymptotic mean stationarity.
Chapter 22: Spectral Analysis and Mean-Square Ergodicity (12--24 April)
"White noise" as a linear functional; its description as a generalized stochastic process (mean zero, uncorrelated, Gaussian) and equivalence to the Ito integral. Spectral representation of stochastic processes, especially weakly stationary ones, or, random Fourier transforms. The Wiener-Khinchin theorem linking the spectrum to the autocorrelation function. How the white noise lost its color. Our first ergodic theorem: convergence of time averages in mean square. A first proof assuming rapid decay of correlations. A stronger second proof based on the spectral representation.
Chapter 21: A First Look at Small-Noise Limits of Stochastic Differential Equations (10 April)
SDEs defined by adding a small amount of white noise to ODEs. Solutions of the SDEs converge in distribution on the solutions of the ODE as the noise goes to zero (via Feller properties). An exponential upper bound on the probability of given magnitude of deviation between the solutions. Preview of coming attractions in large deviations theory.
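(A toy experiment of my own devising: Euler-Maruyama paths of dX = -X dt + eps dW hug the solution of dx/dt = -x more and more tightly as eps shrinks.)

    # Toy experiment: solutions of dX = -X dt + eps dW stay closer and closer to
    # the ODE dx/dt = -x as the noise level eps goes to zero.
    import numpy as np

    rng = np.random.default_rng(3)
    T, dt, x0 = 5.0, 1e-3, 1.0
    n = int(T / dt)
    t = np.arange(n + 1) * dt
    ode = x0 * np.exp(-t)                        # exact solution of the ODE

    for eps in (0.5, 0.1, 0.02):
        x = np.empty(n + 1)
        x[0] = x0
        dW = np.sqrt(dt) * rng.normal(size=n)
        for i in range(n):                       # Euler-Maruyama steps
            x[i + 1] = x[i] - x[i] * dt + eps * dW[i]
        print("eps =", eps, " max |X_t - x(t)| =", np.abs(x - ode).max())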
Chapter 20: More on Stochastic Differential Equations (29 March and 1 April)
Solutions of SDEs are Feller diffusions (as they solve martingale problems). The Kolmogorov "forward" and "backward" equations, for the evolution of the probability density and the observables, respectively. Examples of the forward or Fokker-Planck equation and its solution.
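(For concreteness, in my notation rather than necessarily the notes': for a one-dimensional diffusion dX = a(X) dt + b(X) dW, the two equations read as follows, the first evolving the density rho_t of X_t, the second the conditional expectation f_t(x) = E[f(X_t) | X_0 = x].)

    \[
    \frac{\partial \rho_t(x)}{\partial t}
      = -\frac{\partial}{\partial x}\left[ a(x)\,\rho_t(x) \right]
        + \frac{1}{2}\frac{\partial^2}{\partial x^2}\left[ b^2(x)\,\rho_t(x) \right],
    \qquad
    \frac{\partial f_t(x)}{\partial t}
      = a(x)\frac{\partial f_t(x)}{\partial x}
        + \frac{1}{2} b^2(x) \frac{\partial^2 f_t(x)}{\partial x^2}.
    \]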
Chapter 19: Stochastic Integrals and Stochastic Differential Equations (20--29 March)
Rigorous approach to stochastic integrals, after Ito. Ito integrals of "elementary" processes; extension to a wider class of integrands via approximation. Ito's isometry. Some simple but instructive examples. Ito processes. Ito's formula for change of variables. Stratonovich integrals (briefly). Representation of nice martingales as Ito integrals. Stochastic differential equations: existence and uniqueness of solutions. A more realistic model of Brownian motion, leading to a stochastic differential equation (the Langevin equation) and Ornstein-Uhlenbeck processes.
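(A quick Monte Carlo sanity check, mine and not the notes': for the Ito integral I = int_0^T W dW, Ito's formula gives I = (W_T^2 - T)/2, and the Ito isometry gives E[I^2] = int_0^T E[W_t^2] dt = T^2/2.)

    # Monte Carlo sanity check of two facts about I = int_0^T W dW: Ito's formula
    # gives I = (W_T^2 - T)/2, and the Ito isometry gives E[I^2] = T^2/2.
    import numpy as np

    rng = np.random.default_rng(4)
    T, n, reps = 1.0, 500, 5_000
    dt = T / n

    dW = np.sqrt(dt) * rng.normal(size=(reps, n))
    W = np.cumsum(dW, axis=1)
    W_left = np.hstack([np.zeros((reps, 1)), W[:, :-1]])   # left endpoints, as Ito demands

    I = np.sum(W_left * dW, axis=1)                        # discrete Ito (Riemann) sums
    err = np.abs(I - (W[:, -1] ** 2 - T) / 2)              # discretization error only
    print("mean |I - (W_T^2 - T)/2| (shrinks with dt):", err.mean())
    print("E[I^2], Monte Carlo:", np.mean(I ** 2), "  T^2/2 =", T ** 2 / 2)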
Chapter 18: Preview of Stochastic Integrals (8 March)
Why we want stochastic integrals. A heuristic approach via Euler's method (the Euler-Bernstein scheme).
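(The heuristic in code form, as a sketch of my own; in the numerical literature this scheme usually goes by the name Euler-Maruyama.)

    # A bare-bones sketch of the Euler scheme for dX = a(X) dt + b(X) dW: take the
    # ordinary Euler step for the drift, and add b(X) times a Gaussian increment
    # of variance dt for the noise.
    import numpy as np

    def euler_sde(a, b, x0, T, n, rng):
        """Approximate one sample path of dX = a(X) dt + b(X) dW on [0, T] in n steps."""
        dt = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for i in range(n):
            dW = rng.normal(scale=np.sqrt(dt))
            x[i + 1] = x[i] + a(x[i]) * dt + b(x[i]) * dW
        return x

    # e.g., a Langevin / Ornstein-Uhlenbeck-type path, dX = -2X dt + dW:
    rng = np.random.default_rng(5)
    path = euler_sde(lambda x: -2 * x, lambda x: 1.0, x0=1.0, T=1.0, n=1000, rng=rng)
    print(path[::250])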
Chapter 17: Diffusions and the Wiener Process (6 March)
Definition of diffusions. The Wiener process as the prototypical diffusion. Résumé of the Wiener process's properties. Wiener processes with respect to arbitrary filtrations. Gaussian processes. Wiener measure. Non-differentiability of almost-all continuous functions.
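(The quickest back-of-the-envelope way I know to see why differentiability has to fail at any fixed time, which is of course weaker than the almost-all statement in the notes: the difference quotients of the Wiener process blow up in probability,

    \[
    \frac{W_{t+h} - W_t}{h} \sim \mathcal{N}\!\left(0, \frac{1}{h}\right),
    \qquad\text{so}\qquad
    \Pr\left( \left| \frac{W_{t+h} - W_t}{h} \right| > M \right) \to 1
    \quad \text{as } h \downarrow 0, \text{ for every finite } M.
    \]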
Chapter 16: Convergence of Random Walks (27 February and 1 March)
The Wiener process as a Feller process. Continuous-time random walks. Convergence of random walks to the Wiener process via the Feller-process machinery; via direct use of the theorems on weak convergence.
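(A small illustration, mine and not the notes': rescale a +/-1 random walk as S_{floor(nt)}/sqrt(n); at t = 1 its distribution should be close to that of W_1, i.e. a standard Gaussian.)

    # The rescaled random walk S_n / sqrt(n), with +/-1 steps, is approximately
    # N(0,1) for large n, matching the law of W_1.
    import numpy as np

    rng = np.random.default_rng(6)
    n, reps = 10_000, 100_000
    # The sum of n independent +/-1 steps equals 2 * Binomial(n, 1/2) - n.
    S = 2.0 * rng.binomial(n, 0.5, size=reps) - n
    S_rescaled = S / np.sqrt(n)

    print("empirical P(S_n/sqrt(n) <= 1):", np.mean(S_rescaled <= 1.0))
    print("Phi(1), standard normal CDF  :", 0.8413)   # to four decimal places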
Chapter 15: Convergence of Feller Processes (27 February)
Weak convergence of stochastic processes; hints as to the Skorokhod topology on cadlag functions; necessary and sufficient, and merely sufficient, conditions for convergence in distribution of cadlag processes. Convergence in distribution of Feller processes. Convergence of discrete-time Markov processes on Feller processes. Convergence of Markov processes on ordinary differential equations.
Chapter 14: Feller Processes (22 February)
Clarifications on initial states and distributions of Markov processes; Markov families, the probability kernel from initial states to paths. Definition of Feller processes and its physical motivations; reformulation in terms of semi-groups; unique correspondence between Feller processes and their generators. Attributes of Feller processes: cadlag sample paths, strong Markov property, Dynkin's formula.
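(For reference, Dynkin's formula in the form I find easiest to remember: for f in the domain of the generator G and a stopping time tau with finite expectation,

    \[
    \mathbf{E}_x\left[ f(X_\tau) \right]
      = f(x) + \mathbf{E}_x\left[ \int_0^{\tau} (G f)(X_s)\, ds \right].
    \]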
Chapter 13: Strongly Markovian Processes and Martingale Problems (20 February)
The strong Markov property is being Markovian even at random times. An example of how a Markov process can fail to be strongly Markovian. The concept of a "martingale problem". Relationship between solutions of martingale problems and strong Markov processes.
Chapter 12: Generators (15 February)
The generators of the semi-groups associated with Markov processes: analogy with exponential functions, how to find generators starting from semi-groups, some uses of generators for solving differential equations, Laplace transforms and resolvents. Hille-Yosida theorem on building semi-groups from generators.
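(Schematically, glossing over domain questions: the generator is the time-derivative of the semi-group at t = 0, the semi-group is formally the exponential of the generator, and the resolvent is the Laplace transform of the semi-group,

    \[
    G f = \lim_{t \downarrow 0} \frac{K_t f - f}{t},
    \qquad
    K_t = e^{t G} \ (\text{formally}),
    \qquad
    R_\lambda f = \int_0^{\infty} e^{-\lambda t} K_t f \, dt = (\lambda I - G)^{-1} f,
    \quad \lambda > 0.
    \]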
Chapter 11: Examples of Markov Processes (13 February)
The logistic map as an example of turning nonlinear, deterministic dynamical systems into linear Markov operators. The Wiener process as an example of finding the transition kernels and time-evolution operators. Generalization of the Wiener process example to other processes with stationary and independent increments, and connections to limiting distributions of sums of IID random variables.
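(To make the Wiener-process example concrete, in my notation: the time-evolution operators act on bounded observables by smearing them with a Gaussian kernel, and the generator, on smooth enough f, is half the Laplacian,

    \[
    (K_t f)(x) = \mathbf{E}\left[ f(x + W_t) \right]
               = \int f(y)\, \frac{1}{\sqrt{2\pi t}}\, e^{-(y - x)^2/(2t)}\, dy,
    \qquad
    G f = \frac{1}{2} \frac{d^2 f}{dx^2}.
    \]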
Chapter 10: Two Views of Markov Processes (8 February)
Markov sequences as transformations of noise; transducers. Markov processes as collections of operators: Markov operator semi-groups evolve the distribution of states, and their adjoint operators evolve the "observables", the bounded measurable functions of the state. Some functional-analytic facts about conjugate spaces and adjoint operators.
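(A finite-state sanity check of my own, with a made-up transition matrix standing in for the Markov operator: distributions evolve as row vectors on the left, observables as column vectors on the right, and the two evolutions are adjoint to each other — in finite dimensions the duality is just associativity of matrix multiplication.)

    # Finite-state sanity check of the duality <mu T, f> = <mu, T f>.
    import numpy as np

    T = np.array([[0.9, 0.1, 0.0],
                  [0.2, 0.5, 0.3],
                  [0.0, 0.4, 0.6]])      # transition matrix; rows sum to 1
    mu = np.array([0.5, 0.3, 0.2])       # a distribution over the three states
    f = np.array([1.0, -2.0, 4.0])       # an "observable", i.e. a function of the state

    lhs = (mu @ T) @ f    # evolve the distribution one step, then take the expectation of f
    rhs = mu @ (T @ f)    # evolve the observable one step, then average it over mu
    print(lhs, rhs)       # equal, by associativity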
Chapter 9: Markov Processes (6 February)
Definition and meaning of the Markov property. Transition probability kernels. Existence of Markov processes with specified transitions. Invariant distributions. Dependence of the Markov property on filtrations.
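(And a toy computation, mine, of an invariant distribution for the same sort of finite chain as in the previous sketch: the left eigenvector of the transition matrix with eigenvalue 1, normalized to sum to one, satisfies pi T = pi.)

    # Toy computation of an invariant distribution: the left eigenvector of T with
    # eigenvalue 1, normalized to sum to 1.
    import numpy as np

    T = np.array([[0.9, 0.1, 0.0],
                  [0.2, 0.5, 0.3],
                  [0.0, 0.4, 0.6]])

    eigvals, eigvecs = np.linalg.eig(T.T)       # left eigenvectors of T
    i = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, i])
    pi = pi / pi.sum()

    print("invariant distribution pi:", pi)
    print("pi T (should equal pi)   :", pi @ T)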
Chapter 8: More on Continuity (1 February)
Existence of separable modifications of a stochastic process (in detail). Idea of measurable modifications. Conditions for the existence of measurable, cadlag and continuous modifications.
Chapter 7: Continuity (1 February)
Kinds of continuity for stochastic processes. Versions and modifications of stochastic processes. Benefits of continuous sample paths, and an example of the impossibility of deducing them from the finite dimensional distributions alone. Separable random functions.
Chapter 6: Random Times and Recurrence (30 January)
Reminders about filtrations and stopping times. Waiting times of various sorts, especially recurrence times. Poincaré and Kac recurrence theorems. "The eternal return of the same" and its statistical applications.
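(The quantitative punchline, as I usually quote it: for a stationary and ergodic process, a set A of states with positive probability, and T_A the time of first return to A,

    \[
    \mathbf{E}\left[ T_A \mid X_0 \in A \right] = \frac{1}{\Pr\left( X_0 \in A \right)}.
    \]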
Chapter 5: Stationarity and Dynamics (25 January)
Strong, weak, and conditional stationarity. Stationarity as shift-invariance. Measure-preserving transformations and stationary processes.
Chapter 4: One-Parameter Processes (23 January)
One-parameter processes; examples thereof. Representation of one-parameter processes in terms of shift operators.
Chapter 3: Building Infinite Processes by Recursive Conditioning (23 January)
Probability kernels and regular conditional probabilities. Theorem of Ionescu Tulcea on constructing processes from regular conditional distributions.
Chapter 2: Building Infinite Processes from Finite-Dimensional Distributions (18 January)
Finite-dimensional distributions of a process. Theorems of Daniell and Kolmogorov on extending finite-dimensional distributions to infinite-dimensional ones.
Chapter 1: Stochastic Processes and Random Functions (16 January)
Stochastic processes as indexed collections of random variables and as random functions. Sample paths and constraints on them. Examples.
Contents
Including a running list of all definitions, lemmas, theorems, corollaries, examples and exercises to date.
References
Confined to works explicitly cited.
The entire set of notes

Enigmas of Chance; Corrupting the Young

Posted at April 26, 2007 17:43 | permanent link
