Three-Toed Sloth
http://bactra.org/weblog/
Slow Takes from the Canopy (My Very Own Internet Tradition)

Lecture Notes on Stochastic Processes (Advanced Probability II), Spring 2007
http://bactra.org/weblog/472.html
<P>Since the first lecture of
my <a href="http://www.stat.cmu.edu/~cshalizi/754/">class</a> coincided with
the first non-trivial snow-fall of the winter, talk of the "spring" semester
seems like a cruel joke, but there you go. One of my New Year's resolutions
was to leave the notes as nearly alone as possible, so they will largely
follow <a href="http://bactra.org/weblog/403.html">last year's</a>, but with
typo corrections, a few occasional improvements, more examples, and some
pictures (not, I dare say, enough).
<P><strong>Update</strong>, 26 April 2007: The link at the end of this list to
the complete set of notes now goes to the <em>complete</em> notes, including
chapters yet to be covered by the lectures.
<P>This page will be updated with new lecture notes as the semester goes on.
If you want an RSS feed, <a href="http://bactra.org/weblog/472.rss">this</a>
should do it.
<dl>
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-24.pdf">Chapter
24</a>: Birkhoff's Ergodic Theorem (4 May)</dt>
<dd>The almost-sure or individual ergodic theorem of Birkhoff: if the system is
asymptotically mean stationary, then, with probability 1, the time average of
every well-behaved observable converges to its expectation.</dd>
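<dd>As a quick numerical illustration (my own sketch, not from the notes): the irrational rotation T(x) = x + &alpha; mod 1 preserves Lebesgue measure and is ergodic, so Birkhoff's theorem says the time average of an observable along almost every orbit converges to its space average. For f(x) = cos(2&pi;x) that space average is 0.

```python
import numpy as np

# Birkhoff time average along an orbit of the irrational rotation
# T(x) = x + alpha mod 1, for the observable f(x) = cos(2*pi*x).
alpha = np.sqrt(2) - 1                    # irrational rotation angle
n = 100_000
orbit = (0.1 + alpha * np.arange(n)) % 1.0   # x, Tx, T^2 x, ...
time_avg = np.cos(2 * np.pi * orbit).mean()
print(time_avg)                           # very close to the space average, 0
```
</dd>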
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-23.pdf">Chapter
23</a>: Ergodic Properties (26 April and 1 May)</dt>
<dd>Ideological remarks on ergodic theory. Dynamical systems and their
invariants; connections to Markov processes. Ergodic properties of
functions and ergodic limits. Asymptotic mean stationarity.</dd>
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-22.pdf">Chapter
22</a>: Spectral Analysis and Mean-Square Ergodicity (12--24 April)</dt>
<dd>"White noise" as a linear functional; its description as a generalized
stochastic process (mean zero, uncorrelated, Gaussian) and equivalence to the
Ito integral. Spectral representation of stochastic processes, especially
weakly stationary ones, or, random Fourier transforms. The Wiener-Khinchin
theorem linking the spectrum to the autocorrelation function. How the white
noise lost its color. Our first ergodic theorem: convergence of time averages
in mean square. A first proof assuming rapid decay of correlations. A
stronger second proof based on the spectral representation.</dd>
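<dd>A small finite-sample illustration of the Wiener-Khinchin relation (my own example, not from the notes): for a mean-zero sequence, the periodogram equals the Fourier transform of the biased sample autocovariance, exactly, at every frequency.

```python
import numpy as np

# Finite-sample Wiener-Khinchin identity: periodogram I(w) equals the
# Fourier transform of the biased sample autocovariance c(tau).
rng = np.random.default_rng(0)
n = 128
x = rng.standard_normal(n)

c = np.correlate(x, x, mode="full") / n   # lags -(n-1), ..., n-1
lags = np.arange(-(n - 1), n)

omegas = 2 * np.pi * np.arange(n) / n
t = np.arange(n)
# Periodogram I(w) = |sum_t x_t e^{-i w t}|^2 / n
periodogram = np.abs(np.exp(-1j * np.outer(omegas, t)) @ x) ** 2 / n
# Fourier transform of the sample autocovariance
spectrum = (np.exp(-1j * np.outer(omegas, lags)) @ c).real
print(np.max(np.abs(periodogram - spectrum)))   # ~ 0, up to rounding
```
</dd>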
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-21.pdf">Chapter
21</a>: A First Look at Small-Noise Limits of Stochastic Differential Equations
(10 April)</dt>
<dd>SDEs defined by adding a small amount of white noise to ODEs. Solutions of
the SDEs converge in distribution on the solutions of the ODE as the noise goes
to zero (via Feller properties). An exponential upper bound on the probability
of given magnitude of deviation between the solutions. Preview of coming
attractions in large deviations theory.</dd>
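<dd>A minimal simulation in the spirit of this chapter (parameters and setup are my own): add a little white noise to the linear ODE x' = -x and watch the SDE path track the deterministic solution uniformly in time.

```python
import numpy as np

# Euler-Maruyama for dX = -X dt + eps dW against the ODE x' = -x.
# For small eps the SDE path stays uniformly close to x(t) = x0 * exp(-t).
rng = np.random.default_rng(1)
eps, x0, T, n = 0.01, 1.0, 1.0, 1000
dt = T / n
t = np.linspace(0.0, T, n + 1)

X = np.empty(n + 1)
X[0] = x0
for i in range(n):
    X[i + 1] = X[i] - X[i] * dt + eps * np.sqrt(dt) * rng.standard_normal()

deviation = np.max(np.abs(X - x0 * np.exp(-t)))
print(deviation)   # O(eps), far smaller than the solution's scale
```
</dd>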
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-20.pdf">Chapter
20</a>: More on Stochastic Differential Equations (29 March and 1 April)</dt>
<dd>Solutions of SDEs are Feller diffusions (as they solve martingale
problems). The Kolmogorov "forward" and "backward" equations, for the
evolution of the probability density and the observables, respectively.
Examples of the forward or Fokker-Planck equation and its solution.</dd>
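<dd>An illustrative check of the Fokker-Planck picture (my own example): for the Ornstein-Uhlenbeck process dX = -&theta;X dt + &sigma; dW, the stationary solution of the forward equation is Gaussian with variance &sigma;&sup2;/(2&theta;), and a long simulated trajectory reproduces that variance.

```python
import numpy as np

# Long Euler-Maruyama run of the OU process; its empirical variance should
# match the stationary Fokker-Planck solution's variance sigma^2 / (2 theta).
rng = np.random.default_rng(2)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 500_000
x = np.empty(n)
x[0] = 0.0
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + noise[i]

print(x[5000:].var(), sigma**2 / (2 * theta))   # both ~ 0.125
```
</dd>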
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-19.pdf">Chapter
19</a>: Stochastic Integrals and Stochastic Differential Equations (20--29
March)</dt>
<dd>Rigorous approach to stochastic integrals, after Ito. Ito integrals of
"elementary" processes; extension to a wider class of integrands via
approximation. Ito's isometry. Some simple but instructive examples. Ito
processes. Ito's formula for change of variables. Stratonovich integrals
(briefly). Representation of nice martingales as Ito integrals. Stochastic
differential equations: existence and uniqueness of solutions. A more
realistic model of Brownian motion, leading to a stochastic differential
equation (the Langevin equation) and Ornstein-Uhlenbeck processes.</dd>
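<dd>A Monte Carlo sanity check of Ito's isometry (my own parameters): the second moment of the Ito integral &int;<sub>0</sub><sup>1</sup> W dW should equal E[&int;<sub>0</sub><sup>1</sup> W&sup2; dt] = 1/2.

```python
import numpy as np

# Non-anticipating Riemann sums approximating int_0^1 W dW over many paths;
# by Ito's isometry, E[(int W dW)^2] = E[int_0^1 W^2 dt] = 1/2.
rng = np.random.default_rng(3)
M, n = 20_000, 200                    # paths, time steps on [0, 1]
dt = 1.0 / n
dW = np.sqrt(dt) * rng.standard_normal((M, n))
W = np.cumsum(dW, axis=1) - dW        # W at the *left* endpoint of each step
ito = np.sum(W * dW, axis=1)          # evaluate the integrand at left endpoints
print(np.mean(ito**2))                # ~ 0.5
```
</dd>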
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-18.pdf">Chapter
18</a>: Preview of Stochastic Integrals (8 March)</dt>
<dd>Why we want stochastic integrals. A heuristic approach via Euler's
method (the Euler-Bernstein scheme).</dd>
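<dd>The heuristic scheme can be sketched in a few lines (an illustrative example of my own, not the notes' construction): discretize dX = &mu;X dt + &sigma;X dW for geometric Brownian motion and check against the known mean E[X<sub>T</sub>] = x<sub>0</sub>e<sup>&mu;T</sup>.

```python
import numpy as np

# Euler-style discretization of an SDE: step the drift with dt and the
# noise with sqrt(dt) increments, here for geometric Brownian motion.
rng = np.random.default_rng(4)
mu, sigma, x0, T, n, M = 0.05, 0.2, 1.0, 1.0, 100, 50_000
dt = T / n
X = np.full(M, x0)
for _ in range(n):
    X = X + mu * X * dt + sigma * X * np.sqrt(dt) * rng.standard_normal(M)
print(X.mean(), x0 * np.exp(mu * T))   # both ~ 1.051
```
</dd>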
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-17.pdf">Chapter
17</a>: Diffusions and the Wiener Process (6 March)</dt>
<dd>Definition of diffusions. The Wiener process as the prototypical
diffusion. Resume of the Wiener process's properties. Wiener processes with
respect to arbitrary filtrations. Gaussian processes. Wiener measure.
Non-differentiability of almost-all continuous functions.</dd>
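<dd>One way to see the roughness numerically (my own illustration): the quadratic variation of a Wiener path over [0, t] is t, whereas any differentiable function has quadratic variation 0.

```python
import numpy as np

# Quadratic variation of a simulated Wiener path on [0, 1]: the sum of
# squared increments concentrates on the length of the interval.
rng = np.random.default_rng(5)
n = 100_000
dt = 1.0 / n
dW = np.sqrt(dt) * rng.standard_normal(n)
print(np.sum(dW**2))   # ~ 1.0, the length of the interval
```
</dd>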
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-16.pdf">Chapter
16</a>: Convergence of Random Walks (27 February and 1 March)</dt>
<dd>The Wiener process as a Feller process. Continuous-time random walks.
Convergence of random walks to the Wiener process via the Feller-process
machinery; via direct use of the theorems on weak convergence.</dd>
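<dd>A crude one-dimensional glimpse of the invariance principle (my own example): rescaling a &plusmn;1 random walk as S<sub>n</sub>/&radic;n gives something close in distribution to W<sub>1</sub> ~ N(0, 1), as the functional CLT predicts for the whole path.

```python
import numpy as np

# Endpoints of rescaled +/-1 random walks; their mean and variance should
# approach those of W_1 ~ N(0, 1).
rng = np.random.default_rng(6)
M, n = 20_000, 400
steps = rng.choice([-1.0, 1.0], size=(M, n))
endpoints = steps.sum(axis=1) / np.sqrt(n)
print(endpoints.mean(), endpoints.var())   # ~ 0 and ~ 1
```
</dd>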
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-15.pdf">Chapter
15</a>: Convergence of Feller Processes (27 February)</dt>
<dd>Weak convergence of stochastic processes; hints as to the Skorokhod
topology on cadlag functions; necessary and sufficient, and merely sufficient,
conditions for convergence in distribution of cadlag processes. Convergence
in distribution of Feller processes. Convergence of discrete-time Markov
processes on Feller processes. Convergence of Markov processes on ordinary
differential equations.</dd>
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-14.pdf">Chapter
14</a>: Feller Processes (22 February)</dt>
<dd>Clarifications on initial states and distributions of Markov processes;
Markov families, the probability kernel from initial states to paths.
Definition of Feller processes and its physical motivations; reformulation in
terms of semi-groups; unique correspondence between Feller processes and their
generators. Attributes of Feller processes: cadlag sample paths, strong Markov
property, Dynkin's formula.</dd>
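<dd>A numerical glimpse of the generator (my own illustration): for the Wiener process the generator acts on smooth f as (Gf)(x) = f''(x)/2, and the defining limit (E[f(x + W<sub>t</sub>)] - f(x))/t can be estimated by Monte Carlo for small t.

```python
import numpy as np

# Estimate (P_t f(x) - f(x)) / t for the Wiener process with f = cos and
# small t, and compare with the generator's action f''(x) / 2 = -cos(x) / 2.
rng = np.random.default_rng(7)
x, t, M = 0.7, 0.01, 400_000
samples = np.cos(x + np.sqrt(t) * rng.standard_normal(M))
gen_estimate = (samples.mean() - np.cos(x)) / t
print(gen_estimate, -np.cos(x) / 2)   # both ~ -0.38
```
</dd>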
<dt><a
href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-13.pdf">Chapter
13</a>: Strongly Markovian Processes and Martingale Problems (20 February)</dt>
<dd>The strong Markov property is being Markovian even at random times. An
example of how a Markov process can fail to be strongly Markovian. The concept
of a "martingale problem". Relationship between solutions of martingale
problems and strong Markov processes.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-12.pdf">Chapter 12</a>: Generators (15 February)</dt>
<dd>The generators of the semi-groups associated with Markov processes: analogy
with exponential functions, how to find generators starting from semi-groups,
some uses of generators for solving differential equations, Laplace
transforms and resolvents. Hille-Yosida theorem on building semi-groups from
generators.</dd>
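<dd>A finite-state illustration (my own example): the semi-group is literally the matrix exponential of the generator, P<sub>t</sub> = e<sup>tG</sup>, and the semi-group property P<sub>s</sub>P<sub>t</sub> = P<sub>s+t</sub> can be checked directly.

```python
import numpy as np

# Build P_t = exp(t G) for a two-state Markov jump process from its
# generator, using a truncated exponential series (fine for small matrices).
G = np.array([[-2.0, 2.0],
              [1.0, -1.0]])     # rows sum to 0, off-diagonal rates >= 0

def semigroup(t, terms=60):
    """Matrix exponential e^(tG) via Taylor series."""
    P, term = np.eye(2), np.eye(2)
    for k in range(1, terms):
        term = term @ (t * G) / k
        P = P + term
    return P

print(np.max(np.abs(semigroup(0.3) @ semigroup(0.7) - semigroup(1.0))))  # ~ 0
print(semigroup(1.0).sum(axis=1))   # each row sums to 1: a Markov kernel
```
</dd>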
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-11.pdf">Chapter 11</a>: Examples of Markov Processes (13 February)</dt>
<dd>The logistic map as an example of turning nonlinear, deterministic
dynamical systems into linear Markov operators. The Wiener process as an
example of finding the transition kernels and time-evolution operators.
Generalization of the Wiener process example to other processes with stationary
and independent increments, and connections to limiting distributions of sums
of IID random variables.</dd>
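<dd>A quick orbit computation (my own illustration): the logistic map x &rarr; 4x(1-x) preserves the arcsine density 1/(&pi;&radic;(x(1-x))), whose mean is 1/2, so long-run time averages of x along an orbit come out near 1/2.

```python
# Time average of x along an orbit of the fully chaotic logistic map;
# ergodicity with respect to the arcsine density gives a limit of 1/2.
x, total, n = 0.3, 0.0, 100_000
for _ in range(n):
    x = 4.0 * x * (1.0 - x)
    total += x
print(total / n)   # ~ 0.5
```
</dd>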
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-10.pdf">Chapter 10</a>: Two Views of Markov Processes (8 February)</dt>
<dd>Markov sequences as transformations of noise; transducers. Markov
processes as collections of operators: Markov operator semi-groups evolve the
distribution of states, and their adjoint operators evolve the "observables",
the bounded measurable functions of the state. Some functional-analytic facts
about conjugate spaces and adjoint operators.</dd>
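<dd>In finite dimensions the adjointness is just matrix associativity (my own toy example): evolving the distribution and then integrating the observable gives the same number as integrating the evolved observable against the initial distribution.

```python
import numpy as np

# Duality <nu P, f> = <nu, P f> for a finite-state Markov chain: the
# operator on distributions and its adjoint on observables agree.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])       # transition matrix, rows sum to 1
nu = np.array([0.25, 0.75])      # initial distribution
f = np.array([1.0, -2.0])        # an observable

print((nu @ P) @ f, nu @ (P @ f))   # identical, by associativity
```
</dd>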
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-09.pdf">Chapter 9</a>: Markov Processes (6 February)</dt>
<dd>Definition and meaning of the Markov property. Transition probability
kernels. Existence of Markov processes with specified transitions. Invariant
distributions. Dependence of the Markov property on filtrations.</dd>
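<dd>For a finite chain, an invariant distribution is computable directly (my own example): it is a left eigenvector of the transition matrix with eigenvalue 1, normalized to sum to 1.

```python
import numpy as np

# Invariant distribution of a finite-state chain: left eigenvector of P
# for eigenvalue 1, normalized; then verify pi P = pi.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()
print(pi)                           # ~ [0.25, 0.5, 0.25]
print(np.max(np.abs(pi @ P - pi)))  # ~ 0: pi is invariant
```
</dd>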
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-08.pdf">Chapter 8</a>: More on Continuity (1 February)</dt>
<dd>Existence of separable modifications of a stochastic process (in detail).
Idea of measurable modifications. Conditions for the existence of measurable,
cadlag and continuous modifications.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-07.pdf">Chapter 7</a>: Continuity (1 February)</dt>
<dd>Kinds of continuity for stochastic processes. Versions and modifications
of stochastic processes. Benefits of continuous sample paths, and an example
of the impossibility of deducing them from the finite dimensional distributions
alone. Separable random functions.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-06.pdf">Chapter 6</a>: Random Times and Recurrence (30 January)</dt>
<dd>Reminders about filtrations and stopping times. Waiting times of various
sorts, especially recurrence times. Poincaré and Kac recurrence
theorems. "The eternal return of the same" and its statistical
applications.</dd>
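<dd>A numerical check of Kac's theorem (my own example): for the ergodic rotation T(x) = x + &alpha; mod 1 and the set A = [0, 0.1), the mean recurrence time to A is 1/Leb(A) = 10.

```python
import numpy as np

# Mean recurrence time to A = [0, 0.1) under an ergodic irrational rotation;
# Kac's theorem predicts 1 / Leb(A) = 10.
alpha = np.sqrt(2) - 1
n = 1_000_000
orbit = (0.05 + alpha * np.arange(n)) % 1.0
visits = np.flatnonzero(orbit < 0.1)    # times at which the orbit is in A
return_times = np.diff(visits)
print(return_times.mean())              # ~ 10
```
</dd>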
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-05.pdf">Chapter 5</a>: Stationarity and Dynamics (25 January)</dt>
<dd>Strong, weak, and conditional stationarity. Stationarity as
shift-invariance. Measure-preserving transformations and stationary
processes.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-04.pdf">Chapter 4</a>: One-Parameter Processes (23 January)</dt>
<dd>One-parameter processes; examples thereof. Representation of one-parameter
processes in terms of shift operators.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-03.pdf">Chapter 3</a>: Building Infinite Processes by Recursive Conditioning (23 January)</dt>
<dd>Probability kernels and regular conditional probabilities. Theorem
of Ionescu Tulcea on constructing processes from regular conditional
distributions.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-02.pdf">Chapter 2</a>: Building Infinite Processes from Finite-Dimensional Distributions (18 January)</dt>
<dd>Finite-dimensional distributions of a process. Theorems of Daniell and
Kolmogorov on extending finite-dimensional distributions to
infinite-dimensional ones.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/lecture-01.pdf">Chapter 1</a>: Stochastic Processes and
Random Functions (16 January)</dt>
<dd>Stochastic processes as indexed collections of random variables and as random functions. Sample paths and constraints on them. Examples.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/contents.pdf">Contents</a></dt>
<dd>Including a running list of all definitions, lemmas, theorems, corollaries,
examples and exercises to date.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/references.pdf">References</a></dt>
<dd>Confined to works explicitly cited.</dd>
<dt><a href="http://www.stat.cmu.edu/~cshalizi/754/notes/all.pdf">The entire set of notes</a></dt>
</dl>
<P><font size="-1">
<a href="http://bactra.org/weblog/cat_enigmas_of_chance.html">Enigmas of Chance</a>;
<a href="http://bactra.org/weblog/cat_corrupting_the_young.html">Corrupting the Young</a>
</font>