In this post I attempt to give a rigorous definition of integration with respect to Brownian motion (as introduced by Itô in 1944), while keeping it as concise as possible. The stochastic integral can also be defined for a much more general class of processes called semimartingales. However, as Brownian motion is such an important special case which can be handled directly, I start with this as the subject of this post. If $B$ is a standard Brownian motion defined on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ and $\alpha$ is a stochastic process, the aim is to define the integral

$$\int_0^t \alpha_s\,dB_s. \qquad (1)$$
In ordinary calculus, this can be approximated by Riemann sums, which converge for continuous integrands whenever the integrator $B$ is of finite variation. This leads to the Riemann-Stieltjes integral and, generalizing to measurable integrands, the Lebesgue-Stieltjes integral. Unfortunately this method does not work for Brownian motion which, as discussed in my previous post, has infinite variation over all nontrivial compact intervals.
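To see this numerically (a small illustration of my own, not part of the formal development), the following Python sketch simulates one Brownian path on $[0,1]$ and computes its total and quadratic variation along dyadic partitions: the total variation grows without bound as the mesh shrinks, while the quadratic variation settles near $t=1$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one Brownian path on [0, 1] at the finest dyadic resolution.
n_max = 18
N = 2 ** n_max
dB = rng.normal(0.0, np.sqrt(1.0 / N), size=N)
B = np.concatenate(([0.0], np.cumsum(dB)))  # B[k] approximates B_{k/N}

for n in (6, 10, 14, 18):
    step = 2 ** (n_max - n)        # subsample to the dyadic partition of mesh 2^-n
    incr = np.diff(B[::step])
    total_var = np.abs(incr).sum()   # grows roughly like 2^(n/2) as the mesh shrinks
    quad_var = (incr ** 2).sum()     # stays close to t = 1
    print(f"mesh 2^-{n:2d}: total variation ~ {total_var:7.1f}, quadratic variation ~ {quad_var:.3f}")
```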
The standard approach is to start by writing out the integral explicitly for piecewise constant integrands. If there are times $0 = t_0 \le t_1 \le \cdots \le t_n = t$ such that $\alpha_s = \alpha_{t_k}$ for each $s \in (t_{k-1}, t_k]$, then the integral is given by the summation,

$$\int_0^t \alpha_s\,dB_s = \sum_{k=1}^n \alpha_{t_k}\left(B_{t_k} - B_{t_{k-1}}\right). \qquad (2)$$
We could try to extend to more general integrands by approximating with piecewise constant processes but, as mentioned above, Brownian motion has paths of infinite variation, so the approximating sums will in general diverge.
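To make (2) concrete, here is a minimal Python sketch (the integrand and all parameters are chosen by me purely for illustration) which simulates a Brownian path on a grid and evaluates the sum in (2) for a piecewise constant, adapted integrand whose value on $(t_{k-1}, t_k]$ depends only on $B_{t_{k-1}}$.

```python
import numpy as np

def simple_integral(alpha_vals, B):
    """Evaluate the sum in (2): alpha_vals[k-1] is the constant value of the
    integrand on (t_{k-1}, t_k], and B holds the path B_{t_0}, ..., B_{t_n}."""
    return float(np.sum(alpha_vals * np.diff(B)))

rng = np.random.default_rng(1)

# Brownian path on a uniform grid 0 = t_0 < t_1 < ... < t_n = 1.
n, t = 1000, 1.0
dt = t / n
B = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), size=n))))

# An adapted, piecewise constant integrand: on (t_{k-1}, t_k] it takes the
# value sin(B_{t_{k-1}}), which depends only on the path up to time t_{k-1}.
alpha_vals = np.sin(B[:-1])

print("integral of the simple process:", simple_integral(alpha_vals, B))
```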
Fortunately, when working with random processes, there are a couple of observations which improve the chances of being able to consistently define the integral. They are
- The integral is not a single real number, but is instead a random variable defined on the probability space. It therefore only has to be defined up to a set of zero probability and not on every possible path of $B$.
- Rather than requiring limits of integrals to converge for each path of $B$ (e.g., dominated convergence), the much weaker convergence in probability can be used.
These observations are still not enough, and the main insight is to only look at integrands which are adapted. That is, the value of $\alpha_t$ can only depend on $B$ through its values at prior times. This condition is met in most situations where we need to use stochastic calculus, such as with (forward) stochastic differential equations. To make this rigorous, for each time $t \ge 0$ let $\mathcal{F}_t$ be the sigma-algebra generated by $B_s$ for all $s \le t$. This is a filtration ($\mathcal{F}_s \subseteq \mathcal{F}_t$ for $s \le t$), and $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \ge 0}, \mathbb{P})$ is referred to as a filtered probability space. Then, $\alpha$ is adapted if $\alpha_t$ is $\mathcal{F}_t$-measurable for all times $t$. Piecewise constant and left-continuous processes, such as $\alpha$ in (2), which are also adapted are commonly referred to as simple processes.
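Spelling out the shape of such a process in symbols (a restatement of the above, not an additional assumption), a piecewise constant and left-continuous $\alpha$ as in (2) can be written as

$$\alpha_s = \alpha_0 1_{\{s=0\}} + \sum_{k=1}^{n} Z_k 1_{\{t_{k-1} < s \le t_k\}}, \qquad Z_k := \alpha_{t_k},$$

so that (2) reads $\int_0^t \alpha_s\,dB_s = \sum_{k=1}^n Z_k (B_{t_k} - B_{t_{k-1}})$. Adaptedness says that each $Z_k = \alpha_{t_k}$ is $\mathcal{F}_{t_k}$-measurable, while left-continuity ensures that this value is already determined immediately after time $t_{k-1}$.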
However, as with standard Lebesgue integration, we must further impose a measurability property. A stochastic process $\alpha$ can be viewed as a map from the product space $\mathbb{R}_+ \times \Omega$ to the real numbers, given by $(t, \omega) \mapsto \alpha_t(\omega)$. It is said to be jointly measurable if it is measurable with respect to the product sigma-algebra $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$, where $\mathcal{B}(\mathbb{R}_+)$ refers to the Borel sigma-algebra. Finally, it is called progressively measurable, or just progressive, if its restriction to $[0, t] \times \Omega$ is $\mathcal{B}([0, t]) \otimes \mathcal{F}_t$-measurable for each positive time $t$. It is easily shown that progressively measurable processes are adapted, and the simple processes introduced above are progressive.
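For the first of these claims, the one-line justification (included here only for convenience) is that adaptedness follows by fixing a time $t$ and taking the section of the restricted map at that time,

$$\omega \mapsto \alpha_t(\omega) = \alpha\big|_{[0,t] \times \Omega}(t, \omega),$$

which, as a section of a $\mathcal{B}([0,t]) \otimes \mathcal{F}_t$-measurable map, is $\mathcal{F}_t$-measurable.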
With these definitions, the stochastic integral of a progressively measurable process $\alpha$ with respect to Brownian motion $B$ is defined whenever $\int_0^t \alpha_s^2\,ds < \infty$ almost surely (that is, with probability one). The integral (1) is a random variable, defined uniquely up to sets of zero probability by the following two properties.
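As a purely illustrative aside (my own numerical sketch, with an arbitrarily chosen integrand), left-endpoint Riemann sums of the adapted integrand $\alpha_s = B_s$ give a feel for the object being defined: along ever finer partitions of $[0,1]$ they settle down, and the limiting value agrees with the closed form $\int_0^1 B_s\,dB_s = (B_1^2 - 1)/2$, a standard identity quoted here without proof.

```python
import numpy as np

rng = np.random.default_rng(2)

# One Brownian path on [0, 1], sampled on a fine dyadic grid.
t, n_max = 1.0, 2 ** 16
dB = rng.normal(0.0, np.sqrt(t / n_max), size=n_max)
B = np.concatenate(([0.0], np.cumsum(dB)))

exact = 0.5 * (B[-1] ** 2 - t)  # known value of int_0^1 B dB (quoted, not derived here)

for n in (2 ** 4, 2 ** 8, 2 ** 12, 2 ** 16):
    step = n_max // n
    Bn = B[::step]                          # the path on a partition with n intervals
    approx = np.sum(Bn[:-1] * np.diff(Bn))  # sum of B_{t_{k-1}} (B_{t_k} - B_{t_{k-1}})
    print(f"n = {n:5d}: approximation = {approx:+.4f}, closed form = {exact:+.4f}")
```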