Stochastic Differential Equations
10 Mar 2024 21:53
Non-stochastic differential equations are models of dynamical systems where the state evolves continuously in time. If they are autonomous, then the state's future values depend only on the present state; if they are non-autonomous, the evolution is allowed to depend on an exogenous "driving" term as well. (This may not be the standard way of putting it, but I think it's both correct and more illuminating than the more analytical viewpoints, and anyway is the line taken by V. I. Arnol'd in his excellent book on differential equations.) Stochastic differential equations (SDEs) are, conceptually, ones where the exogenous driving term is a stochastic process. --- While "differential equation", unmodified, covers both ordinary differential equations, containing only time derivatives, and partial differential equations, containing both time and space derivatives, "stochastic differential equation", unmodified, refers only to the ordinary case. Stochastic partial differential equations are just what you'd think.
The solution of an SDE is, itself, a stochastic process. To understand how this comes about, it helps to start by thinking through what an ordinary differential equation is, what constitutes a solution, and how to find a solution. The canonical sort of autonomous ordinary differential equation looks like \[ \frac{dx}{dt} = f(x) \] or \[ \frac{dx}{dt} = f(x,t) \] if it's non-autonomous. (I won't keep noting all the modifications for non-autonomous equations when they're clear.) Here \( f \) is the vector field which gives the rate of change of the system's variables, either as a function of their current value (autonomous), or their current value and the time (non-autonomous). A solution to the equation is a function \( x(t) \) whose time derivative matches the field: \[ \frac{dx}{dt}(t) = f(x(t)) \] Now, you'll recall from baby calculus that \( x(t) \) and \( x(t)+c \) have the same time derivative, for any constant \( c \). This means that the solution to an ODE isn't unique. We typically add on an extra requirement, such as an initial condition, \[ x(0) = x_0 \] to make the solution unique (and to represent the initial state of the system).
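As a concrete (and entirely standard) illustration, not in the original: take \( f(x) = -x \) with initial condition \( x(0) = x_0 \). Then \[ x(t) = x_0 e^{-t} \] is a solution, since \[ \frac{dx}{dt}(t) = -x_0 e^{-t} = -x(t) = f(x(t)) \] and \( x(0) = x_0 \); without the initial condition, every \( x(t) = c e^{-t} \) would satisfy the equation equally well.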
Finding a solution to the ODE means finding such a function, which in principle we can do by integrating. This is easiest to see if the equation isn't just non-autonomous, but doesn't depend on \( x \) at all, \[ \frac{dx}{dt} = f(t) \] Then we integrate both sides over time, from 0 to our time of interest \( t \); I'll write the variable of integration as \( s \) to keep it distinct: \[ \begin{eqnarray*} \int_{0}^{t}{\frac{dx}{dt}(s) ds} & = & \int_{0}^{t}{f(s) ds}\\ x(t) - x(0) & = & \int_{0}^{t}{f(s) ds}\\ x(t) & = & x_0 + \int_{0}^{t}{f(s) ds} \end{eqnarray*} \] using the fundamental theorem of calculus, and the initial condition. Even if the equation isn't completely externally driven, we can still in principle do the same thing: \[ x(t) = x_0 + \int_{0}^{t}{f(x(s), s) ds} \]
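A minimal sketch of the purely time-driven case, not from the original: computing \( x(t) = x_0 + \int_{0}^{t}{f(s) ds} \) by numerical quadrature, with the illustrative (hypothetical) choice \( f(t) = \cos t \), whose exact solution is \( x_0 + \sin t \).

```python
# Sketch: solve dx/dt = f(t), x(0) = x0, by direct numerical quadrature.
# The choice f(t) = cos(t) is purely illustrative.
import numpy as np
from scipy.integrate import quad

def f(t):
    return np.cos(t)

x0, t = 1.0, 2.0
integral, _ = quad(f, 0.0, t)     # \int_0^t f(s) ds
x_t = x0 + integral               # x(t) = x0 + \int_0^t f(s) ds
print(x_t, x0 + np.sin(t))        # numerical vs. exact answer
```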
We can actually calculate such a thing by using many different numerical schemes. One which is particularly helpful going forward is Euler's method. Pick a small increment of time \( h \). We start with \( x(0) = x_0 \). Then we say \[ x(t+h) = x(t) + h f(x(t)) \] for the points \( t=0 \), \( t=h \), \( t=2h \), etc. In between those points we interpolate linearly. This gives us a function which doesn't quite obey the differential equation, but one can show that it comes closer and closer to doing so as \( h \rightarrow 0 \). (As I recall, Arnol'd's textbook, mentioned above, contains a pretty proof.)
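A minimal sketch of Euler's method as just described, not from the original; the drift \( f(x) = -x \) is an illustrative choice, with exact solution \( x_0 e^{-t} \).

```python
# Sketch of Euler's method for the autonomous ODE dx/dt = f(x).
import numpy as np

def euler(f, x0, T, h):
    """Euler's method: x(t+h) = x(t) + h * f(x(t)) on a grid of spacing h."""
    n_steps = int(np.ceil(T / h))
    ts = np.linspace(0.0, n_steps * h, n_steps + 1)
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for i in range(n_steps):
        xs[i + 1] = xs[i] + h * f(xs[i])
    return ts, xs

ts, xs = euler(lambda x: -x, x0=1.0, T=5.0, h=0.01)
print(xs[-1], np.exp(-ts[-1]))   # Euler approximation vs. exact solution
```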
Now let's think about making all this stochastic. The easiest thing to do is to add some sort of stochastic noise on the right-hand side of the differential equation: \[ \frac{dX}{dt}(t) = f(X(t)) + Z(t) \] A solution will, once again, be a function \( X \) which satisfies this equation. Since \( Z \) is a random function of time, \( X \) will also be random. But if we could somehow fix \( Z \), we'd just be back to the kind of equation we know how to solve: \[ X(t) = x_0 + \int_{0}^{t}{f(X(s)) ds} + \int_{0}^{t}{Z(s) ds} \]
To make sense of such an expression in general, we need to know how to integrate stochastic processes. In particular, we need to understand when one process, say \( \zeta \), is the time-integral of another, say \( Z \), and vice versa. If we know that, then we can actually use Euler's method to find solutions. We would, once again, start with \( X(0) = x_0 \), set \[ X(t+h) = X(t) + h f(X(t)) + \zeta(t+h) - \zeta(t) \] and linearly interpolate between the points \( t=0 \), \( t=h \), \( t=2h \), etc. We then let \( h\rightarrow 0 \) to recover an exact solution.
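A minimal sketch of this Euler scheme, not from the original, with the integrated noise \( \zeta \) taken to be a Wiener process, so that the increments \( \zeta(t+h) - \zeta(t) \) are independent \( \mathcal{N}(0, h) \) draws; the drift \( f(x) = -x \) is again an illustrative choice.

```python
# Sketch of the Euler (Euler-Maruyama) scheme
#   X(t+h) = X(t) + h * f(X(t)) + (zeta(t+h) - zeta(t)),
# with zeta a Wiener process, so each increment is N(0, h).
import numpy as np

def euler_maruyama(f, x0, T, h, rng):
    n_steps = int(np.ceil(T / h))
    ts = np.linspace(0.0, n_steps * h, n_steps + 1)
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h))          # zeta(t+h) - zeta(t) ~ N(0, h)
        xs[i + 1] = xs[i] + h * f(xs[i]) + dW
    return ts, xs

rng = np.random.default_rng(0)
ts, xs = euler_maruyama(lambda x: -x, x0=1.0, T=5.0, h=0.01, rng=rng)
```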
In fact, so long as expressions like this last one make sense, we could use them to define what it means for one stochastic process, say \( Z \), to be the time-derivative of another, \( \zeta \). This may seem crazy or, more politely, of merely mathematical interest, but there are lots of situations where \( \zeta \) is a much better behaved process than is \( Z \). The premier example is when \( \zeta \) is the Gaussian process called the Wiener process or (slightly inaccurately) Brownian motion [1], defined by \( W(0) = 0 \), \( W(t) \sim \mathcal{N}(0, t) \) (i.e., a Gaussian or "Normal" distribution with mean 0 and variance \( t \)), and the increment \( W(t) - W(s) \) being statistically independent of the increment over any other (non-overlapping) interval of time. If we try to compute the time-derivative of \( W \), we find that it is almost-surely ill-defined [2]. You might say "Well, then maybe we shouldn't use the Wiener process as something giving us noise in a differential equation", but look at what it means in the Euler scheme: over the time interval from \( t \) to \( t+h \), the trajectory moves from \( X(t) \) to \( X(t) + hf(X(t)) + W(t+h) - W(t) \). That is to say, the process follows the deterministic differential equation, plus a little Gaussian random kick --- that seems very natural! It's even natural that the variance of the Gaussian perturbation should scale with \( h \), since that's what we'd see in, say, a random walk...
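A minimal sketch, not from the original, of simulating Wiener-process paths on a grid of spacing \( h \), using only the defining properties just listed (start at 0, independent Gaussian increments with variance equal to the elapsed time), and checking that \( \mathrm{Var}[W(t)] = t \).

```python
# Sketch: simulate many Wiener paths from the defining properties and
# check that the variance at time t is (approximately) t.
import numpy as np

rng = np.random.default_rng(0)
h, n_steps, n_paths = 0.001, 1000, 10_000
increments = rng.normal(0.0, np.sqrt(h), size=(n_paths, n_steps))  # W(t+h) - W(t) ~ N(0, h)
W = np.cumsum(increments, axis=1)                                  # partial sums give W(h), W(2h), ...
t_final = n_steps * h
print(W[:, -1].var(), "vs. t =", t_final)                          # Var W(t) = t
```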
So there should be some way of making sense of seeing a Wiener process as an integral. There is, in fact, a whole theory of stochastic integrals, developed beginning in the 1940s by M. Loeve, K. Ito, and R. Stratonovich (all building on earlier work by, among others, N. Wiener). The theory of SDEs proper is largely owed to Ito and Stratonovich (in two slightly different forms, corresponding to subtle differences between the underlying Euler schemes).
Most of what one encounters, in applications, as the theory of SDEs assumes that the driving noise is in fact white noise, i.e., Gaussian and uncorrelated over time. On the one hand, this is less of a restriction than it might seem, because many other natural sorts of noise process can be represented as stochastic integrals of white noise. On the other hand, the same mathematical structure can be used directly to define stochastic integrals and stochastic DEs driven by a far broader class of stochastic processes; on this topic Kallenberg is a very good introduction.
[1]: "Slightly inaccurately", because the physical process known as "Brownian motion" is actually rather more accurately described by an Ornstein-Ulhenbeck process, which is the solution of the Langevin SDE for the momentum \( \vec{P}(t) \), \( d\vec{P} = -\gamma \vec{P} dt + \delta \mathbf{I} dW \), the constants \( \gamma \) and \( \delta \) telling us about how effective friction and diffusion are (respectively). One can actually recover the Wiener process as the appropriate limit from this, so the Wiener process approximates Brownian motion, but just identifying the two is a lie told to children, or rather to probability theorists. See Selmeczi et al. 2006, arxiv:physics/0603142, and sec. 18.1 of Almost None of the Theory of Stochastic Processes. ^
[2]: The argument is so pretty that I can't resist repeating it. The time-derivative is of course \( \lim_{h\rightarrow 0}{\frac{W(t+h) - W(t)}{h}} \). By the defining properties of the Wiener process, the numerator inside the limit is a Gaussian random variable with expectation 0 and variance \( h \). The ratio in the limit is therefore a Gaussian random variable with expectation 0 and variance \( 1/h \), so the variance will blow up to infinity as \( h \) shrinks. Moreover, for any other time \( s \), no matter how close \( s \) might be to \( t \), eventually the intervals \( (s, s+h) \) and \( (t, t+h) \) will not overlap, so the increments will be statistically independent. ^
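A minimal sketch illustrating this footnote numerically, not from the original: the difference quotient \( (W(t+h) - W(t))/h \) is Gaussian with mean 0 and variance \( 1/h \), so its variance blows up as \( h \) shrinks.

```python
# Sketch: empirical variance of the Wiener difference quotient for shrinking h.
import numpy as np

rng = np.random.default_rng(0)
for h in [0.1, 0.01, 0.001]:
    quotients = rng.normal(0.0, np.sqrt(h), size=100_000) / h   # (W(t+h) - W(t)) / h
    print(h, quotients.var(), 1.0 / h)                          # empirical variance vs. 1/h
```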
- See also:
- Inference for Stochastic Differential Equations
- Inference for Markov and Hidden Markov Models
- Markov Models
- Path Integrals and Feynman Diagrams for Classical Stochastic Processes
- Recommended, more introductory:
- Geoffrey Grimmett and David Stirzaker, Probability and Random Processes [The last chapter of the 3rd edition has a good, if rather heuristic, first-glimpse look at SDEs. I can't recall if it appeared in earlier editions or not.]
- Josef Honerkamp, Stochastic Dynamical Systems
- Joel Keizer, Statistical Thermodynamics of Nonequilibrium Processes [Describes SDEs, in the Ito framework, from a heuristic viewpoint, motivated by, precisely, the need to model non-equilibrium thermodynamic processes; goes on to use them in many interesting physical applications.]
- Andrzej Lasota and Michael C. Mackey, Chaos, Fractals and Noise: Stochastic Aspects of Dynamics [The later chapters give a solid introduction to SDEs, starting from the Euler-Bernstein approach, but to my mind somewhat slighting the quite real advantages of the Ito calculus for more advanced problems]
- Robert S. Liptser, Lectures on Stochastic Processes [See especially lecture 10, on white noise, and lecture 12, on Ito integrals.]
- Bernt Oksendal, Stochastic Differential Equations
- Recommended, more advanced and/or specialized:
- Ole E. Barndorff-Nielsen, Fred Espen Benth and Almut E. D. Veraart, Ambit Stochastics
- David R. Brillinger, "The 2005 Neyman Lecture: Dynamic Indeterminism in Science", Statistical Science 23 (2008): 48--64, arxiv:0808.0620 [With discussions and response]
- David R. Brillinger, Brent S. Stewart, Charles L. Littnan, "Three months journeying of a Hawaiian monk seal", pp. 246--264 of Deborah Nolan and Terry Speed (eds.), Probability and Statistics: Essays in Honor of David A. Freedman (2008), arxiv:0805.3019 [A pretty application]
- I. I. Gikhman and A. V. Skorokhod, Introduction to the Theory of Random Processes [Excellent long chapter on stochastic integrals and SDEs; these authors went on to publish several more books on SDEs, but I confess I have not read them.]
- Olav Kallenberg, Foundations of Modern Probability [Builds up the theory of stochastic integrals and stochastic differential equations from scratch, ending with a very general framework which makes it clear just which parts of the original approach, tied to the Wiener process, were necessary and which were accidental. However, Kallenberg's book is intended as a comprehensive textbook on probability theory, from measure theory through large deviations. This means that it is both mathematically demanding, and that he takes a "spiral" approach, revisiting this topic, like many others, repeatedly through the text. There are, however, abundant cross-references.]
- Robert S. Liptser and Albert N. Shiryaev, Statistics of Random Processes [Vol. I gives a very detailed account of the classical, Wiener-process theory and its uses in optimal filtering; vol. II considers numerous applications in statistics and signal-processing, as well as some generalizations and extensions.]
- Michel Loeve, Probability Theory [Gives a very elegant account of Loeve's contributions to the theory of stochastic integrals]
- L. C. G. Rogers and D. Williams, Diffusions, Markov Processes, and Martingales [See especially Vol. II, Ito Calculus.]
- Semi-recommended:
- Don S. Lemons, An Introduction to Stochastic Processes in Physics
- Modesty forbids me to recommend:
- CRS with Aryeh Kontorovich, Almost None of the Theory of Stochastic Processes [Part IV, current chapters 16--20, is about stochastic integrals and SDEs, and very much what I've ripped off for the stuff above, including the jokes.]
- To read:
- Lakhdar Aggoun and Robert Elliott, Measure Theory and Filtering: Introduction with Applications
- David Applebaum, Lévy Processes and Stochastic Calculus
- Ari Arapostathis, Vivek S. Borkar, Mrinal K. Ghosh, Ergodic Control of Diffusion Processes
- Yuri Bakhtin and Jonathan C. Mattingly, "Stationary Solutions of Stochastic Differential Equations with Memory and Stochastic Partial Differential Equations", math.PR/0509166
- Viorel Barbu, Philippe Blanchard, Giuseppe Da Prato, Michael Röckner, "Self-organized criticality via stochastic partial differential equations", arxiv:0811.2093
- Ole E. Barndorff-Nielsen and Albert Shiryaev, Change of Time and Change of Measure
- François Bolley, Ivan Gentil, Arnaud Guillin, "Convergence to equilibrium in Wasserstein distance for Fokker-Planck equations", Journal of Functional Analysis 263 (2012): 2430--2457, arxiv:1110.3606
- Nicolas Bouleau and Dominique Lépingle, Numerical Methods for Stochastic Processes
- A. A. Budini and M.O. Caceres, "Functional characterization of generalized Langevin equations", cond-mat/0402311 ["exact functional formalism to deal with linear Langevin equations with arbitrary memory kernels and driven by any noise structure characterized through its characteristic functional..."]
- Alberto Chiarini, Markus Fischer, "On large deviations for small noise Ito processes", Advances in Applied Probability 46 (2014): 1126--1147, arxiv:1212.3223
- Carson C. Chow, Michael A. Buice, "Path Integral Methods for Stochastic Differential Equations", arxiv:1009.5966
- Emmanuelle Clément, Arturo Kohatsu-Higa, Damien Lamberton, "A duality approach for the weak approximation of stochastic differential equations", math.PR/0610178 = Annals of Applied Probability 16 (2006): 1124--1154 ["a new methodology to prove weak approximation results for general stochastic differential equations. Instead of using a partial differential equation approach as is usually done for diffusions, the approach considered here uses the properties of the linear equation satisfied by the error process"]
- Jacky Cresson and Sébastien Darses, "Stochastic embedding of dynamical systems", math.PR/0509713
- A. M. Davie, "Uniqueness of solutions of stochastic differential equations", arxiv:0709.4147
- Freddy Delbaen, Jinniao Qiu, Shanjian Tang, "Forward-Backward Stochastic Differential Systems Associated to Navier-Stokes Equations in the Whole Space", arxiv:1303.5329
- Cai Dieball, Aljaz Godec, "Feynman-Kac theory of time-integrated functionals: Ito versus functional calculus", arxiv:2206.04034
- Bruno Dupire, "Functional Itô Calculus", Bloomberg Portfolio Research Paper No. 2009-04-FRONTIERS
- Peter Friz and Nicolas B. Victoir, Multidimensional Stochastic Processes as Rough Paths: Theory and Applications
- Peipei Gao, Yong Liu, Yue Sun, Zuohuan Zheng, "Large deviations principle for stationary solutions of stochastic differential equations with multiplicative noise", arxiv:2206.02356
- Jorge Garcia, "A Large Deviation Principle for Stochastic Integrals", Journal of Theoretical Probability 21 (2008): 476--501
- Leszek Gawarecki and Vidyadhar Mandrekar, Stochastic Differential Equations in Infinite Dimensions, with Applications to Stochastic Partial Differential Equations
- Martin Hairer, "Exponential Mixing Properties of Stochastic PDEs Through Asymptotic Coupling," math.PR/0109115
- David Hochberg, Carmen Molina-Paris, Juan Pérez-Mercader and Matt Visser, "Effective Action for Stochastic Partial Differential Equations," cond-mat/9904215
- Helge Holden, Stochastic Partial Differential Equations: A Modeling, White Noise Functional Approach
- Xiangping Hu, Daniel Simpson, Finn Lindgren, Havard Rue, "Multivariate Gaussian Random Fields Using Systems of Stochastic Partial Differential Equations", arxiv:1307.1379
- Yoshifusa Ito and Izumi Kubo, "Calculus on Gaussian and Poisson White Noises", Nagoya Mathematical Journal 111 (1988): 41--84
- Gopinath Kallianpur and Jie Xiong, Stochastic Differential Equations in Infinite Dimensional Spaces
- Hye-Won Kang, Thomas G. Kurtz, Lea Popovic, "Central limit theorems and diffusion approximations for multiscale Markov chain models", arxiv:1208.3783
- Ioannis Kontoyiannis, Sean P. Meyn, "Approximating a Diffusion by a Hidden Markov Model", arxiv:0906.0259
- Peter Kotelenez, Stochastic Ordinary and Stochastic Partial Differential Equations: Transition from Microscopic to Macroscopic Equations
- Peter M. Kotelenez and Thomas G. Kurtz, "Macroscopic limits for stochastic partial differential equations of McKean-Vlasov type", Probability Theory and Related Fields 146 (2010): 189--222
- H. Kunita, Stochastic Flows and Stochastic Differential Equations
- Kai Liu, Stochastic Stability of Differential Equations in Abstract Spaces
- S. V. Lototsky and B. L. Rozovskii
- "Wiener Chaos Solutions of Linear Stochastic Evolution Equations", math.PR/0504558
- "Stochastic Differential Equations: A Wiener Chaos Approach", math.PR/0504559
- Yutao Ma, Ran Wang, Liming Wu, "Moderate Deviation Principle for dynamical systems with small random perturbation", arxiv:1107.3432
- Jonathan C. Mattingly, Andrew M. Stuart, M.V. Tretyakov, "Convergence of Numerical Time-Averaging and Stationary Measures via Poisson Equations", arxiv:0908.4450
- Anatolii V. Mokshin, Renat M. Yulmetyev, and Peter Hänggi, "Simple Measure of Memory for Dynamical Processes Described by a Generalized Langevin Equation", Physical Review Letters 95 (2005): 200601
- Esteban Moro and Henri Schurz, "Non-negativity preserving numerical algorithms for stochastic differential equations", math.NA/0509724
- Cyril Odasso, "Exponential mixing for stochastic PDEs: the non-additive case", Probability Theory and Related Fields 140 (2008): 41--82
- Fabien Panloup, "Recursive computation of the invariant measure of a stochastic differential equation driven by a L\'{e}vy process", math.PR/0509712
- Giovanni Peccati and Murad S. Taqqu, "Moments, cumulants and diagram formulae for non-linear functionals of random measures", arxiv:0811.1726
- S. Peszat and J. Zabczyk, Stochastic Partial Differential Equations with Lévy Noise: An Evolution Equation Approach
- Philip Protter, Stochastic Integration and Differential Equations
- A. J. Roberts, "Normal form transforms separate slow and fast modes in stochastic dynamical systems", math.DS/0701623
- Simo Särkkä and Arno Solin, Applied Stochastic Differential Equations
- Alexander Sokol, Niels Richard Hansen, "Causal interpretation of stochastic differential equations", Electronic Journal of Probability 19 (2014): 100, arxiv:1304.0217
- Daniel W. Stroock, Markov Processes from K. Ito's Perspective
- Arne Traulsen, Jens Christian Claussen, Christoph Hauert, "Stochastic differential equations for evolutionary dynamics with demographic noise and mutations", arxiv:1203.3367
- Ramon van Handel, "Almost Global Stochastic Stability", math.PR/0411311 ["We develop a method to prove almost global stability of stochastic differential equations in the sense that almost every initial point ... is asymptotically attracted to the origin with unit probability."]
- Wei Wang and Jinqiao Duan, "Invariant manifold reduction and bifurcation for stochastic partial differential equations", math.DS/0607050
- Wei Wang, A. J. Roberts and Jinqiao Duan, "Large deviations for slow-fast stochastic partial differential equations", arxiv:1001.4826 ["the rate function is exactly that of the averaged equation plus the fluctuating deviation which is a stochastic partial differential equation with small Gaussian perturbation"]