Notebooks

Path Integrals and Feynman Diagrams for Classical Stochastic Processes

Last update: 13 Dec 2024 22:27
First version: 4 May 2021

This feels like a topic which should be obvious to me, but isn't, and so I want to wrap my head around it.

The parts I get

If I want to know the probability of a process with state space \( \mathcal{X} \) moving from state \( x \) at time \( 0 \) to state \( y \) at time \( t \), I need \( p(X(t) = y|X(0) = x) \). (Let's pretend everything has a density for now.) By the law of total probability I can insert as many intermediate times \( 0 < t_1 < t_2 < \ldots < t_n < t \) as I like, and \[ p(X(t)=y|X(0)=x) = \int_{\mathcal{X}^n}{p(X(t_1) = u_1, X(t_2) = u_2, \ldots X(t_n) = u_n, X(t) = y|X(0) = x) du_{1:n}} \] which by the definition of conditional probability will be \[ p(X(t)=y|X(0)=x) = \int_{\mathcal{X}^n}{\left(\prod_{i=1}^{n+1}{p(X(t_i) = u_i|X(t_0) = u_0, X(t_1) = u_1, \ldots X(t_{i-1}) = u_{i-1}) } \right) du_{1:n}} \] with the understandings that \( t_0 = 0 \), \( u_0 = x \), \( t_{n+1} = t \), and \( u_{n+1}=y \). If this is a Markov process, then earlier states become irrelevant once we condition on the most recent state, so \[ p(X(t)=y|X(0)=x) = \int_{\mathcal{X}^n}{\left(\prod_{i=1}^{n+1}{p(X(t_i) = u_i| X(t_{i-1}) = u_{i-1})}\right) du_{1:n}} \] or \[ p(X(t)=y|X(0)=x) = \int_{\mathcal{X}^n}{\exp{\left(\sum_{i=1}^{n+1}{h(t_i, u_i| t_{i-1}, u_{i-1})} \right)} du_{1:n}} \] introducing the function \( h(t_i, u_i| t_{i-1}, u_{i-1}) \equiv \log{p(X(t_i) = u_i| X(t_{i-1}) = u_{i-1})} \). Assuming homogeneous transitions amounts to assuming that only the length of the interval \( t_i - t_{i-1} \) matters, so we could (overloading the notation a bit) write this as \( h(t_i, u_i| t_{i-1}, u_{i-1}) = h(u_i|u_{i-1}; t_{i} - t_{i-1}) \).

A continuous, time-homogeneous Markov process will have a generator \( \mathbf{G} \), meaning that the transition operator over an interval \( \Delta t \) will be of the form \( e^{\Delta t \mathbf{G}} \). Taking \( \Delta t \) small, the transition operator will be \( \approx 1 + \Delta t \mathbf{G} \), and its log \( \approx \Delta t \mathbf{G} \), which would let us write \( h(u_i| u_{i-1}; \Delta t) \) in terms of the generator \( \mathbf{G} \) (at the cost of more algebra than I want to write down just now).
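For a finite state space, the small-\( \Delta t \) claim is easy to check numerically: the exact transition operator \( e^{t \mathbf{G}} \) should be recovered by composing many short-time operators \( \approx 1 + \Delta t \mathbf{G} \). A minimal sketch (the particular generator matrix and step counts are my own toy choices, not from the text):

```python
import numpy as np
from scipy.linalg import expm

# Generator of a 3-state continuous-time Markov chain:
# off-diagonal entries are jump rates, each row sums to zero.
G = np.array([[-2.0, 1.0, 1.0],
              [0.5, -1.0, 0.5],
              [1.0, 1.0, -2.0]])
t = 1.5

# Exact transition operator over [0, t].
P_exact = expm(t * G)

# Compose n short-time operators (I + dt*G), dt = t/n.
n = 10_000
dt = t / n
P_approx = np.linalg.matrix_power(np.eye(3) + dt * G, n)

print(np.max(np.abs(P_exact - P_approx)))  # discrepancy shrinks as O(dt)
```

The rows of both matrices are probability distributions (they sum to one), and the discrepancy between the two operators vanishes as \( \Delta t \to 0 \), which is exactly the limit taken next.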

Passing non-rigorously to the limit, the sum inside the exponential becomes an integral over time, and we should be able to write this overall transition probability \( p(X(t)=y|X(0)=x) \) as a sum over all paths or histories, \[ p(X(t)=y|X(0)=x) = \int_{u: [0,t] \to \mathcal{X}, u(0) = x, u(t)=y}{\exp{\left( \int_{s=0}^{t}{L(s, u(s), \dot{u}(s), \ldots) ds}\right)} du} \] where again \( L \) could be recovered from the generator if I were willing to do the algebra. The presence of derivatives of the path in \( L \) comes from the fact that generators are (usually) differential operators.
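Before the limit, the "sum over histories" is entirely concrete: for a finite state space one can literally enumerate the discretized paths and add up their probabilities. A brute-force sketch (my own toy example; the state count, endpoints, and number of intermediate times are arbitrary choices):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
k = 3  # number of states
P = rng.random((k, k))
P /= P.sum(axis=1, keepdims=True)  # a one-step transition matrix

x, y, n = 0, 2, 4  # start state, end state, number of intermediate times

# Sum prod_i p(u_i | u_{i-1}) over all intermediate histories u_{1:n}.
total = 0.0
for path in itertools.product(range(k), repeat=n):
    states = (x,) + path + (y,)
    weight = 1.0
    for a, b in zip(states[:-1], states[1:]):
        weight *= P[a, b]
    total += weight

# The same quantity via the Chapman-Kolmogorov equation: (n+1)-step matrix power.
direct = np.linalg.matrix_power(P, n + 1)[x, y]
print(total, direct)  # agree up to floating-point error
```

The path integral is (formally) what this enumeration becomes when the time grid refines and the state space becomes continuous; the measure-theoretic trouble flagged below is about what happens to `du` in that limit.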

(At this point a mathematical quibbler might well ask what \( du \) is, exactly, since we're now integrating over an infinite-dimensional space of continuous-time functions, and Lebesgue measure, for instance, doesn't properly extend to this setting. This is an important point I intend to ignore for the present.)

So what don't I understand?

Three big things stick out as especially irritating:
  1. Where the cumulant generating function fits into all this. (One problem here might be that while I can and do use cumulants, I have no intuition about them at all.)
  2. How to (in general) read off diagrammatic expansions from this.
  3. What to do for non-Markov processes. (Wio, below, suggests coming up with Markovian approximations.)

More broadly, I want to understand how much of this structure I learned as a physicist really has anything to do with physics, and how much is just a generality about stochastic processes.

