Notebooks

Stochastic Processes

10 Oct 2024 10:10

Things to understand better: Large deviations. Non-asymptotic convergence rates. Convergence properties of non-stationary processes. Coupling methods. Statistical inference on processes. Convergence in distribution of sequences of processes. Empirical process theory (i.e. processes where the index set is a sigma field or a function space), especially when the data are themselves generated by a non-IID process.

Central limit theorems for stochastic processes: A common thing to ask about a stochastic process (assuming it takes values in a vector space) is the behavior of its time average. For IID sequences with finite variance at each time, the ordinary central limit theorem tells us that the distribution of the (suitably centered and scaled) average converges to a Gaussian. With infinite variance, the limiting distribution is instead one of the stable (Lévy) distributions. For independent but non-identically-distributed sequences, similar but weaker results hold. Queries: Do we know a necessary-and-sufficient condition for central limit theorems (Gaussian or stable) for dependent sequences? (I can find lots of sufficient conditions, and trivial necessary ones.) Under some circumstances, one can show that a dependent sequence converges to an exponential distribution; the most common example is a random walk with a reflecting barrier. Do we know necessary and sufficient conditions for convergence to exponentials? (This question is related to the origin of power-law distributions.) Is there a characterization of the distributions which can arise as limits of averages of dependent random variables? Can we take an IID, finite-variance sequence and introduce dependence in such a way as to (1) leave the marginal distribution at each time alone but (2) make the limiting distribution stable but non-Gaussian? (With thanks to Spyros Skouras for bugging me about these and related matters.)
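The finite- versus infinite-variance contrast above is easy to see by simulation. A minimal sketch (the sample sizes and the choice of exponential and Cauchy increments are my own, purely illustrative): averages of finite-variance IID draws, centered and scaled by sqrt(n), settle into a Gaussian shape, while averages of Cauchy draws remain Cauchy-distributed (a stable law), so their tails never lighten no matter how long we average.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2000, 5000

# Finite-variance case: exponential(1) increments have mean 1, variance 1,
# so sqrt(n) * (sample mean - 1) should be approximately N(0, 1).
x = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (x.mean(axis=1) - 1.0)

# Infinite-variance case: the mean of n IID standard Cauchy draws is itself
# standard Cauchy, so averaging does not tame the tails at all.
c = rng.standard_cauchy(size=(reps, n))
m = c.mean(axis=1)

# Tail comparison at |.| > 4: essentially zero for the Gaussian-limit
# averages, but roughly 2/(pi*4) ~ 0.16 for the Cauchy averages.
print("P(|Z| > 4), finite-variance case:", np.mean(np.abs(z) > 4))
print("P(|M| > 4), Cauchy case:        ", np.mean(np.abs(m) > 4))
```

The same kind of simulation, applied to a random walk with a reflecting barrier at zero and negative drift, is a quick way to eyeball the exponential limit mentioned above, though proving necessary and sufficient conditions is of course another matter.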

Markov processes, branching processes, point processes, and stochastic differential equations are important enough to be spun off into separate notebooks.

