In these notes, the approach taken to stochastic calculus revolves around stochastic integration and the theory of semimartingales. An alternative starting point would be to consider Markov processes. Although I do not take the second approach, all of the special processes considered in the current section are Markov, so it seems like a good idea to introduce the basic definitions and properties now. In fact, all of the special processes considered (Brownian motion, Poisson processes, Lévy processes, Bessel processes) satisfy the much stronger property of being Feller processes, which I will define in the next post.
Intuitively speaking, a process X is Markov if, given its whole past up until some time s, the future behaviour depends only on its state at time s. To make this precise, let us suppose that X takes values in a measurable space $(E,\mathcal{E})$ and, to denote the past, let $\mathcal{F}_s=\sigma(X_u\colon u\le s)$ be the sigma-algebra generated by $\{X_u\colon u\le s\}$. The Markov property then says that, for any times $s\le t$ and bounded measurable function $f\colon E\to{\mathbb R}$, the expected value of $f(X_t)$ conditional on $\mathcal{F}_s$ is a function of $X_s$. Equivalently,

$$\displaystyle {\mathbb E}\left[f(X_t)\mid\mathcal{F}_s\right]={\mathbb E}\left[f(X_t)\mid X_s\right] \qquad (1)$$
(almost surely). More generally, this idea makes sense with respect to any filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})$. A process X is Markov with respect to the filtration $\{\mathcal{F}_t\}_{t\ge0}$ if it is adapted and (1) holds for all times $s\le t$.
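As a simple illustration of how (1) can be verified, consider a process with independent increments, such as Brownian motion or a Poisson process. For $s\le t$ the increment $X_t-X_s$ is independent of $\mathcal{F}_s$, so, writing $g(x)={\mathbb E}[f(x+X_t-X_s)]$ (a helper function introduced just for this computation),

$$\displaystyle {\mathbb E}\left[f(X_t)\mid\mathcal{F}_s\right]={\mathbb E}\left[f(X_s+(X_t-X_s))\mid\mathcal{F}_s\right]=g(X_s).$$

The same computation with conditioning on $X_s$ alone gives ${\mathbb E}[f(X_t)\mid X_s]=g(X_s)$, so both sides of (1) agree and X is Markov.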
