The martingale representation theorem states that any martingale adapted to the filtration generated by a Brownian motion can be expressed as a stochastic integral with respect to the same Brownian motion.
Theorem 1 Let B be a standard Brownian motion defined on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ and $\{\mathcal{F}_t\}_{t\ge0}$ be its natural filtration.
Then, every $\{\mathcal{F}_t\}$-local martingale M can be written as
$$M_t = M_0+\int_0^t\xi\,dB$$
for a predictable, B-integrable, process $\xi$.
As stochastic integration preserves the local martingale property for continuous processes, this result characterizes the space of all local martingales starting from 0 defined with respect to the filtration generated by a Brownian motion as being precisely the set of stochastic integrals with respect to that Brownian motion. Equivalently, Brownian motion has the predictable representation property. This result is often used in mathematical finance as the statement that the Black-Scholes model is complete, meaning that any contingent claim can be exactly replicated by trading in the underlying stock. This does involve some rather large and somewhat unrealistic assumptions on the behaviour of financial markets and the ability to trade continuously without incurring additional costs. However, in this post, I will be concerned only with the mathematical statement and proof of the representation theorem.
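As a concrete, if highly idealized, illustration of the completeness statement, the following Python sketch delta-hedges a call option along a single simulated path of the Black-Scholes model with zero interest rate. This is my own illustration, not part of the proof; all parameter values and function names are chosen for the example. With frequent rebalancing, the self-financing portfolio ends close to the option payoff.

```python
import math

import numpy as np

def bs_call(s, k, sigma, tau):
    """Black-Scholes call price and delta with zero interest rate."""
    if tau <= 0.0:
        return max(s - k, 0.0), 1.0 if s > k else 0.0
    sqrt_tau = math.sqrt(tau)
    d1 = (math.log(s / k) + 0.5 * sigma**2 * tau) / (sigma * sqrt_tau)
    d2 = d1 - sigma * sqrt_tau
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * cdf(d1) - k * cdf(d2), cdf(d1)

rng = np.random.default_rng(0)
s0, k, sigma, T, n = 1.0, 1.0, 0.2, 1.0, 20_000
dt = T / n

s = s0
price, delta = bs_call(s, k, sigma, T)
portfolio = price                     # start with the option premium
for j in range(n):
    ds = s * sigma * math.sqrt(dt) * rng.standard_normal()
    portfolio += delta * ds           # gains from holding `delta` shares
    s += ds
    _, delta = bs_call(s, k, sigma, T - (j + 1) * dt)

hedge_error = abs(portfolio - max(s - k, 0.0))
print(hedge_error)
```

Refining the time grid shrinks the hedging error, in line with the claim that continuous trading replicates the claim exactly.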
In more generality, the martingale representation theorem can be stated for a d-dimensional Brownian motion as follows.
Theorem 2 Let $B=(B^1,B^2,\ldots,B^d)$ be a d-dimensional Brownian motion defined on the filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},\mathbb{P})$, and suppose that $\{\mathcal{F}_t\}$ is the natural filtration generated by B and $\mathcal{F}_0$.
Then, every $\{\mathcal{F}_t\}$-local martingale M can be expressed as
$$M_t = M_0+\sum_{i=1}^d\int_0^t\xi^i\,dB^i\qquad(1)$$
for predictable processes $\xi^1,\ldots,\xi^d$ satisfying $\int_0^t(\xi^i_s)^2\,ds<\infty$, almost surely, for each $t\ge0$ and each $i=1,\ldots,d$.
By Lebesgue's differentiation theorem applied to $[M,B^i]_t=\int_0^t\xi^i_s\,ds$, this shows that $\xi^i$ is given by the derivative
$$\xi^i_t=\frac{d[M,B^i]_t}{dt}$$
for almost every t.
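This derivative formula can be checked numerically on a simple example. Taking $M_t=B_t^2-t$, a martingale with representation $M_t=\int_0^t2B\,dB$, we should find $[M,B]_t=\int_0^t2B_s\,ds$, i.e. $\xi_t=2B_t$. The sketch below is my own illustration: it compares the realized covariation of a discretized path with the predicted integral.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 1.0, 100_000
dt = T / n
dB = np.sqrt(dt) * rng.standard_normal(n)
B = np.concatenate(([0.0], np.cumsum(dB)))

# the martingale M_t = B_t^2 - t, with dM = 2B dB in the limit
M = B**2 - dt * np.arange(n + 1)
dM = np.diff(M)

cov_MB = np.sum(dM * dB)               # realized covariation [M, B]_T
predicted = np.sum(2.0 * B[:-1] * dt)  # int_0^T xi_t dt with xi_t = 2 B_t
print(cov_MB, predicted)
```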
Proving the representation theorem involves showing that, for all integrable random variables Z, there exist predictable and $B^i$-integrable processes $\xi^i$ satisfying
$$\mathbb{E}[Z\mid\mathcal{F}_t] = \mathbb{E}[Z\mid\mathcal{F}_0]+\sum_{i=1}^d\int_0^t\xi^i\,dB^i\qquad(2)$$
(almost surely), for each time $t\ge0$. As long as this can be shown to hold for a suitably large set of random variables, the representation theorem will follow.
One possible approach is to first establish (2) for random variables of the form
$$Z = U\exp\left(i\sum_{j=1}^d\int_0^\infty a^j\,dB^j\right)$$
for deterministic processes $a=(a^1,\ldots,a^d)$ satisfying $\int_0^\infty\Vert a_t\Vert^2\,dt<\infty$ (for now, i denotes the square root of -1) and $\mathcal{F}_0$-measurable variable U. In particular, for a sequence of times $0=t_0\le t_1\le\cdots\le t_n$ and $\lambda_1,\ldots,\lambda_n\in\mathbb{R}^d$, taking $a_t=\sum_{j=1}^n\lambda_j1_{\{t_{j-1}<t\le t_j\}}$ gives
$$Z = U\exp\left(i\sum_{j=1}^n\lambda_j\cdot(B_{t_j}-B_{t_{j-1}})\right).$$
As the increments $B_{t_j}-B_{t_{j-1}}$ are normal with zero mean and variance $t_j-t_{j-1}$ independently of $\mathcal{F}_{t_{j-1}}$, the expectation can be written out as
$$M_t\equiv\mathbb{E}[Z\mid\mathcal{F}_t] = U\exp\left(i\sum_{j=1}^d\int_0^ta^j\,dB^j-\frac12\int_t^\infty\Vert a_s\Vert^2\,ds\right).$$
This is continuous and, setting $\xi^i_t=ia^i_tM_t$, Ito's lemma gives
$$M_t = M_0+\sum_{i=1}^d\int_0^t\xi^i\,dB^i,$$
so Z satisfies the required representation (2).
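To make this calculation concrete, the following sketch (an illustration of mine, with d = 1, U = 1 and $a_t=\lambda1_{\{t\le T\}}$) simulates a path, computes $M_t=\exp(i\lambda B_t-\lambda^2(T-t)/2)$, and checks that $M_T$ is recovered by the discretized stochastic integral $M_0+\sum_ki\lambda M_{t_k}\Delta B_k$.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, T, n = 1.0, 1.0, 100_000
dt = T / n
dB = np.sqrt(dt) * rng.standard_normal(n)
B = np.concatenate(([0.0], np.cumsum(dB)))
t = dt * np.arange(n + 1)

# M_t = E[exp(i lam B_T) | F_t] = exp(i lam B_t - lam^2 (T - t) / 2)
M = np.exp(1j * lam * B - 0.5 * lam**2 * (T - t))

# discretized version of M_T = M_0 + int_0^T i lam M dB
stoch_int = np.sum(1j * lam * M[:-1] * dB)
err = abs(M[-1] - (M[0] + stoch_int))
print(err)
```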
Alternatively, it can be shown that (2) holds for all random variables of the form
$$Z = f(B_T)\qquad(3)$$
for a bounded measurable function $f\colon\mathbb{R}^d\rightarrow\mathbb{R}$ and time T. For times $s\le t$, $B_t-B_s$ is, by definition, joint normal with covariance matrix $(t-s)I$ independently of $\mathcal{F}_s$. This allows us to write out $\mathbb{E}[Z\mid\mathcal{F}_t]$ for all $t\le T$ as a function of $t,B_t$,
$$\mathbb{E}[Z\mid\mathcal{F}_t] = f(t,B_t),\qquad f(t,x)\equiv(2\pi(T-t))^{-d/2}\int_{\mathbb{R}^d}f(y)e^{-\Vert y-x\Vert^2/2(T-t)}\,dy.$$
From this, it can be seen that f is twice continuously differentiable on $[0,T)\times\mathbb{R}^d$, so Ito's formula can be applied,
$$f(t,B_t) = f(0,B_0)+\sum_{i=1}^d\int_0^tf_i(s,B_s)\,dB^i_s+\int_0^t\left(f_t+\frac12\sum_{i=1}^df_{ii}\right)(s,B_s)\,ds.\qquad(4)$$
Here the subscripts t and i represent the partial derivatives of $f(t,x)$ with respect to time and $x_i$ respectively. The final term on the right-hand-side of (4) is a continuous FV process, all the other terms being local martingales. As continuous FV local martingales are constant, this allows us to break (4) up into the following two equations,
$$f_t+\frac12\sum_{i=1}^df_{ii} = 0,$$
$$f(t,B_t) = f(0,B_0)+\sum_{i=1}^d\int_0^tf_i(s,B_s)\,dB^i_s.$$
The first of these is the Kolmogorov backward equation, and the second shows that $M_t=\mathbb{E}[Z\mid\mathcal{F}_t]=f(t,B_t)$ does indeed satisfy the representation property (2) with integrands $\xi^i_s=f_i(s,B_s)$. It should be clear that this argument is quite general, and applies to any continuous martingale which is Markov with twice continuously differentiable transition densities. In fact, it is not difficult to extend the argument to variables of the form
$$Z = Uf(B_{t_1},B_{t_2},\ldots,B_{t_n})$$
for times $t_1\le t_2\le\cdots\le t_n$, $\mathcal{F}_0$-measurable variable U and bounded measurable function $f\colon\mathbb{R}^{nd}\rightarrow\mathbb{R}$. However, I will not do this here, as Z of the form (3) is already enough to prove the martingale representation theorem for Brownian motion.
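The simplest nontrivial instance of this construction can be checked by simulation. Take $Z=B_T^2$ in one dimension (strictly, this f is unbounded, but the formulas still apply). Then $f(t,x)=x^2+(T-t)$ solves the backward equation, and the integrand is $f_x(t,B_t)=2B_t$; the sketch below (my own illustration) verifies the resulting identity $B_T^2=T+\int_0^T2B\,dB$ on a discretized path.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 1.0, 100_000
dt = T / n
dB = np.sqrt(dt) * rng.standard_normal(n)
B = np.concatenate(([0.0], np.cumsum(dB)))

# f(t, x) = x^2 + (T - t) satisfies f_t + f_xx / 2 = 0 and
# E[B_T^2 | F_t] = f(t, B_t), with integrand f_x(t, B_t) = 2 B_t.
Z = B[-1] ** 2
M0 = T                                 # f(0, 0) = E[Z]
stoch_int = np.sum(2.0 * B[:-1] * dB)  # discretized int_0^T 2 B dB
err = abs(Z - (M0 + stoch_int))
print(err)
```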
From here, it is possible to extend (2) to all bounded measurable random variables, by taking limits and applying the monotone class theorem. Ito's isometry can be used to show that $L^2$-convergence of the random variables implies convergence of the integrands $\xi^i$. However, in the proof given here, I will instead make use of the following more general result, applying to all continuous local martingales. Given a continuous local martingale N, every other local martingale can be expressed as an integral with respect to N plus an 'orthogonal' term. This is a useful result in its own right. In finance, it has applications to models of incomplete markets in which it is not possible to exactly replicate all contingent claims by trading in the underlying stock. Instead, it is only possible to replicate them up to an orthogonal (i.e., unhedgeable) term.
Lemma 3 Let N be a continuous local martingale and M be any other local martingale. Then, there exists an N-integrable process $\xi$ and a local martingale L with $[L,N]=0$ such that
$$M = M_0+\int\xi\,dN+L.$$
Proof: The lemma reduces to the statement that there is an N-integrable process $\xi$ satisfying $[M,N]=\int\xi\,d[N]$. Then, defining $L=M-M_0-\int\xi\,dN$ gives
$$[L,N] = [M,N]-\int\xi\,d[N] = 0.$$
First, the FV process [M,N] is absolutely continuous with respect to [N]. That is, if $\alpha$ is a nonnegative bounded predictable process satisfying $\int\alpha\,d[N]=0$, then $\int\alpha\,d[M,N]=0$ (almost surely). This follows from the Kunita-Watanabe inequality,
$$\left|\int_0^t\alpha\,d[M,N]\right|\le\sqrt{\int_0^t\alpha\,d[M]}\sqrt{\int_0^t\alpha\,d[N]} = 0.$$
This implies that there exists a predictable process $\xi$ with $[M,N]=\int\xi\,d[N]$. It needs to be shown that $\xi$ is N-integrable. That is, $\int_0^t\xi^2\,d[N]$ is almost-surely finite for each time t. This is, again, a consequence of the Kunita-Watanabe inequality. For any constant K,
$$\int_0^t(|\xi|\wedge K)^2\,d[N]\le\int_0^t(|\xi|\wedge K)\,{\rm sgn}(\xi)\,\xi\,d[N] = \int_0^t(|\xi|\wedge K)\,{\rm sgn}(\xi)\,d[M,N]\le\sqrt{\int_0^t(|\xi|\wedge K)^2\,d[N]}\sqrt{[M]_t}.$$
Cancelling the term $\sqrt{\int_0^t(|\xi|\wedge K)^2\,d[N]}$ from the right hand side, squaring, and taking the limit $K\rightarrow\infty$ gives
$$\int_0^t\xi^2\,d[N]\le[M]_t,$$
which is finite, as required. ⬜
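The content of Lemma 3 can also be seen numerically: $\xi$ behaves like a pathwise regression coefficient of the increments of M against those of N, with L the uncorrelated residual. In the sketch below (entirely my own construction), M is built by hand as $\int\xi\,dN+L$ with $N=B^1$, $\xi_t=\cos(t)$ and $L=B^2$, and then $\xi$ is recovered from increments as the density $d[M,N]/d[N]$ while the realized $[L,N]$ stays near zero.

```python
import numpy as np

rng = np.random.default_rng(4)
T, n = 1.0, 100_000
dt = T / n
t = dt * np.arange(n)
dB1 = np.sqrt(dt) * rng.standard_normal(n)   # increments of N = B^1
dB2 = np.sqrt(dt) * rng.standard_normal(n)   # increments of L = B^2

xi = np.cos(t)                # a known integrand
dM = xi * dB1 + dB2           # increments of M = int xi dN + L
dL = dM - xi * dB1            # orthogonal part (here just dB2)

cov_LN = np.sum(dL * dB1)     # realized [L, N]_T, should be near 0

# recover xi near t = 0.5 as the density d[M,N]/d[N] over a window
w = slice(50_000, 51_000)
xi_hat = np.sum(dM[w] * dB1[w]) / np.sum(dB1[w] ** 2)
print(cov_LN, xi_hat)
```

The window estimate `xi_hat` is noisy, but it concentrates around cos(0.5) as the window contains more increments.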
This has the following d-dimensional generalization.
Lemma 4 Let $N^1,N^2,\ldots,N^n$ be continuous local martingales with covariations $[N^i,N^j]=0$ for $i\not=j$. Then, every local martingale M can be expressed as
$$M = M_0+\sum_{i=1}^n\int\xi^i\,dN^i+L$$
for $N^i$-integrable processes $\xi^i$ and a local martingale L satisfying $[L,N^i]=0$ for each i.
Proof: By Lemma 3, there are $N^i$-integrable processes $\xi^i$ such that $L^i\equiv M-M_0-\int\xi^i\,dN^i$ satisfy $[L^i,N^i]=0$. Setting $L=M-M_0-\sum_i\int\xi^i\,dN^i$ and using the condition that $[N^i,N^j]=0$ for $i\not=j$ gives
$$[L,N^i] = [L^i,N^i]-\sum_{j\not=i}\int\xi^j\,d[N^j,N^i] = 0.$$ ⬜
The martingale representation theorem now follows immediately by combining Lemma 4, applied with $N^i=B^i$, with the following result, which shows that the orthogonal term L is constant.
Lemma 5 Let $B=(B^1,B^2,\ldots,B^d)$ be a d-dimensional Brownian motion defined on the filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\},\mathbb{P})$, and suppose that $\{\mathcal{F}_t\}$ is the natural filtration generated by B and $\mathcal{F}_0$.
Then, any local martingale L satisfying $[L,B^i]=0$ for each i is constant.
Proof: By localization, it is enough to consider the case where L is a cadlag martingale and, replacing L by $L-L_0$ if necessary, we can assume that $L_0=0$. Consider a bounded random variable Z satisfying (2). Then, $M_t\equiv\mathbb{E}[Z\mid\mathcal{F}_t]$ is a bounded martingale with covariation
$$[L,M] = \sum_{i=1}^d\int\xi^i\,d[L,B^i] = 0.$$
So, LM is a local martingale. As M is bounded and L is a martingale, LM will be of class (DL), so is a proper martingale. Then,
$$\mathbb{E}[ZL_t] = \mathbb{E}[M_tL_t] = \mathbb{E}[M_0L_0] = 0.$$
For any fixed time t, consider the set S of bounded random variables Z satisfying $\mathbb{E}[ZL_t]=0$. As shown above, this includes all Z of the form (3), which generate the sigma-algebra $\sigma(B_s\colon s\ge0)$ given by B. Also, dominated convergence implies that S is closed under taking limits of uniformly bounded sequences of variables. So, by the monotone class theorem, S contains all bounded random variables measurable with respect to $\mathcal{F}_0$ and B. In particular, if the filtration is generated by $\mathcal{F}_0$ and B then ${\rm sgn}(L_t)$ is in S, giving,
$$\mathbb{E}[|L_t|] = \mathbb{E}[{\rm sgn}(L_t)L_t] = 0$$
and $L_t=0$ (almost surely), as required. ⬜