Continuous Local Martingales

Continuous local martingales are a particularly well-behaved subclass of the local martingales, and the results of the previous two posts become much simpler in this case. First, the continuous local martingale property is always preserved by stochastic integration.

Theorem 1 If X is a continuous local martingale and {\xi} is X-integrable, then {\int\xi\,dX} is a continuous local martingale.

Proof: As X is continuous, {Y\equiv\int\xi\,dX} will also be continuous and, therefore, locally bounded. Then, by preservation of the local martingale property, Y is a local martingale. ⬜

Next, the quadratic variation of a continuous local martingale X provides us with a necessary and sufficient condition for X-integrability.

Theorem 2 Let X be a continuous local martingale. Then, a predictable process {\xi} is X-integrable if and only if, almost surely,

\displaystyle  \int_0^t\xi^2\,d[X]<\infty

for all {t>0}.

Proof: If {\xi} is X-integrable then the quadratic variation {V_t\equiv\int_0^t\xi^2\,d[X]} is finite. Conversely, suppose that V is finite at all times. As X and, therefore, [X] are continuous, V will be continuous. So, it is locally bounded and as previously shown, {\xi} is X-integrable. ⬜

In particular, for a Brownian motion B, a predictable process {\xi} is B-integrable if and only if, almost surely,

\displaystyle  \int_0^t\xi^2_s\,ds<\infty

for all {t>0}. Then, {\int\xi\,dB} is a continuous local martingale.
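As a quick numerical illustration of this criterion (a simulation sketch only, not part of the theory; the integrand {\xi_s=s} and all parameters are arbitrary choices), the following approximates both the integrability condition and the stochastic integral by left-endpoint Riemann sums:

```python
import numpy as np

# Numerical sketch of the B-integrability criterion. For the
# deterministic, hence predictable, integrand xi_s = s we have
# int_0^t xi_s^2 ds = t^3/3 < infinity, so int_0^t xi dB exists;
# by the Ito isometry it has mean 0 and variance t^3/3.

rng = np.random.default_rng(0)
t, n, paths = 1.0, 1000, 20000
dt = t / n
s = np.linspace(0.0, t, n + 1)[:-1]    # left endpoints of the subintervals
xi = s                                 # integrand xi_s = s

# The integrability criterion int_0^t xi_s^2 ds, approximated by a sum.
criterion = float((xi**2).sum() * dt)  # close to t^3/3

# Left-endpoint Riemann sums against Brownian increments approximate
# the stochastic integral int_0^t xi dB for each simulated path.
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, n))
integrals = (xi * dB).sum(axis=1)

print(criterion)          # close to 1/3
print(integrals.var())    # close to 1/3, matching the Ito isometry
```

The sample variance of the simulated integrals should be close to {\int_0^1 s^2\,ds=1/3}, in line with the Ito isometry.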

Quadratic variations also provide us with information about the sample paths of continuous local martingales.

Theorem 3 Let X be a continuous local martingale. Then,

  • X is constant on the same intervals for which [X] is constant.
  • X has infinite variation over all intervals on which [X] is non-constant.

Proof: Consider a bounded interval (s,t) for any {s<t}, and set {t^n_k=s+(t-s)k/n} for k=0,1,…,n. By the definition of quadratic variation, using convergence in probability,

\displaystyle  [X]_t-[X]_s=\lim_{n\rightarrow\infty}\sum_{k=1}^n(X_{t^n_k}-X_{t^n_{k-1}})^2\le V\lim_{n\rightarrow\infty}\max_{k=1,\ldots,n}\vert X_{t^n_k}-X_{t^n_{k-1}}\vert

where V is the variation of X over the interval (s,t). By continuity, {\vert X_{t^n_k}-X_{t^n_{k-1}}\vert} tends uniformly to zero as n goes to infinity, so {[X]_t=[X]_s} and [X] is constant over (s,t) whenever the variation V is finite. This proves the second statement of the theorem, which also implies that [X] is constant on all intervals for which X is constant.

It only remains to show that {X_t=X_s} whenever {[X]_t=[X]_s}. Applying this with t replaced by each rational time u in (s,t) will then show that X is constant on the interval whenever [X] is.

The process {Y_u\equiv X_u-X_{u\wedge s}} is a local martingale constant up until s, with quadratic variation {[Y]_u=[X]_u-[X]_s} for {u\ge s}. Then {\tau=\inf\{u\colon[Y]_u>0\}} is a stopping time with respect to the right-continuous filtration {\mathcal{F}_{\cdot+}} and, by stopping, {Y^{\tau}} is a local martingale with zero quadratic variation {[Y^\tau]=[Y]^\tau=0}. Then, as previously shown, {(Y^\tau)^2=(Y^\tau)^2-[Y^\tau]} is a martingale and, therefore, {{\mathbb E}[(Y^\tau_t)^2]=0}. This shows that {X_{t\wedge\tau}=X_s} almost surely. Finally, on the set {\{[X]_t=[X]_s\}}, we have {\tau\ge t} and, hence, {X_t=X_{t\wedge\tau}=X_s}. ⬜

Theorem 3 has the following immediate consequence.

Corollary 4 Any continuous FV local martingale is constant.

Proof: By the second statement of Theorem 3, the quadratic variation [X] is constant. Then, by the first statement, X is constant. ⬜
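Theorem 3 can also be seen numerically for a simulated Brownian path (an illustrative sketch only; the partition sizes are arbitrary choices): along refining partitions, the sums of squared increments settle near the quadratic variation {[B]_1=1}, while the sums of absolute increments, which approximate the variation, keep growing like {\sqrt{k}}, consistent with infinite variation.

```python
import numpy as np

# Simulation sketch of Theorem 3 for Brownian motion on [0,1].
# Squared-increment sums converge to [B]_1 = 1 under refinement,
# while absolute-increment sums diverge: infinite variation.

rng = np.random.default_rng(1)
n = 2**18
dB = rng.normal(0.0, np.sqrt(1.0 / n), size=n)
B = np.concatenate(([0.0], np.cumsum(dB)))

results = {}
for k in (2**10, 2**14, 2**18):
    incr = np.diff(B[:: n // k])   # increments over a k-step partition
    results[k] = ((incr**2).sum(), np.abs(incr).sum())
    print(k, results[k])
# first entries stay near 1; second entries grow under refinement
```

Since the partitions are nested, the absolute-increment sums are non-decreasing under refinement by the triangle inequality, which the output reflects.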

The quadratic variation also tells us exactly when X converges at infinity.

Theorem 5 Let X be a continuous local martingale. Then, with probability one, the following both hold.

  • {X_\infty=\lim_{t\rightarrow\infty}X_t} exists and is finite whenever {[X]_\infty<\infty}.
  • {\limsup_{t\rightarrow\infty}X_t=\infty} and {\liminf_{t\rightarrow\infty}X_t=-\infty} whenever {[X]_\infty=\infty}.

Proof: By martingale convergence, with probability one, either {X_\infty} exists and is finite or {\limsup_{t\rightarrow\infty}X_t} and {\limsup_{t\rightarrow\infty}(-X_t)} are both infinite. It just remains to be shown that, with probability one, {X_\infty} exists if and only if {[X]_\infty} is finite.

Let {\tau_n=\inf\{t\colon [X]_t\ge n\}}. Then, {X^{\tau_n}} is a local martingale with quadratic variation {[X^{\tau_n}]=[X]^{\tau_n}} bounded by n. So, {{\mathbb E}[(X^{\tau_n}_t)^2]\le n} and {X^{\tau_n}} is an {L^2}-bounded martingale which, therefore, almost surely converges at infinity. In particular, on the set

\displaystyle  \left\{[X]_\infty<n\right\}\subseteq\left\{\tau_n=\infty\right\}

we have {X_\infty=\lim_{t\rightarrow\infty}X_t=\lim_{t\rightarrow\infty}X^{\tau_n}_t} outside of a set of zero probability. Therefore, {X_\infty} almost surely exists on

\displaystyle  \left\{[X]_\infty<\infty\right\}=\bigcup_{n=1}^\infty\left\{[X]_\infty<n\right\}.

For the converse statement, set {\tau_n=\inf\{t\colon\vert X_t\vert\ge n\}}. Then, {X^{\tau_n}} is a local martingale bounded by n and {{\mathbb E}[[X]_{\tau_n}]={\mathbb E}[X_{\tau_n}^2]\le n^2}. Hence, {[X]_{\tau_n}} is almost surely finite and {[X]_\infty} is finite on the set

\displaystyle  \left\{\sup_t\vert X_t\vert<n\right\}\subseteq\left\{\tau_n=\infty\right\},

outside of a set of zero probability. Therefore, {[X]_\infty} is almost surely finite on the set

\displaystyle  \left\{X_\infty{\rm\ exists}\right\}\subseteq\left\{\sup_t\vert X_t\vert<\infty\right\}=\bigcup_n\left\{\sup_t\vert X_t\vert<n\right\}. ⬜

Theorems 3 and 5 are easily understood once it is known that all continuous local martingales are random time-changes of standard Brownian motion, as will be covered in a later post.
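A simple simulation hints at this picture (a sketch under arbitrary discretization choices, not a rigorous construction): time-changing Brownian motion by {A_t=t\wedge1} gives a continuous local martingale {X_t=B_{t\wedge1}} with {[X]_\infty=1<\infty}, whose paths converge, freezing at {B_1}, in contrast to B itself, for which {[B]_\infty=\infty}.

```python
import numpy as np

# Simulation sketch of Theorem 5 via the time change A_t = t /\ 1.
# X_t = B_{t /\ 1} has [X]_infty = 1, so its paths converge (they are
# constant after time 1), while B keeps oscillating.

rng = np.random.default_rng(2)
n_per_unit, horizon = 1000, 100
steps = horizon * n_per_unit
dB = rng.normal(0.0, np.sqrt(1.0 / n_per_unit), size=steps)
B = np.concatenate(([0.0], np.cumsum(dB)))

# Apply the time change A_t = t /\ 1 by capping the path index.
X = B[np.minimum(np.arange(len(B)), n_per_unit)]

print(bool(np.all(X[n_per_unit:] == X[n_per_unit])))  # True: frozen after t=1
print(np.abs(B).max())  # the running maximum of B grows with the horizon
```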

The topology of uniform convergence on compacts in probability (ucp convergence) was introduced in a previous post, along with the stronger semimartingale topology. On the space of continuous local martingales, these two topologies are actually equivalent, and can be expressed in terms of the quadratic variation. Recalling that semimartingale convergence implies ucp convergence and that quadratic variation is a continuous map under the semimartingale topology, it is immediate that the first and third statements below follow from the second. However, the other implications are specific to continuous local martingales.

Lemma 6 Let {\{M^n\}_{n\in{\mathbb N}}} and M be continuous local martingales. Then, as n goes to infinity, the following are equivalent.

  1. {M^n} converges ucp to M.
  2. {M^n} converges to M in the semimartingale topology.
  3. {(M^n_0-M_0)^2+[M^n-M]_t\rightarrow0} in probability, for each {t\ge0}.

Proof: As semimartingale convergence implies ucp convergence, the first statement follows immediately from the second. So, suppose that {M^n\xrightarrow{\rm ucp}M}. Write {N^n\equiv M^n-M} and let {\tau_n} be the first time at which {\vert N^n-N^n_0\vert\ge1}. Ucp convergence implies that {\tau_n} tends to infinity in probability, so to prove the third statement it is enough to show that {[N^n]_{\tau_n\wedge t}} tends to zero in probability. By continuity, the stopped process {(N^n-N^n_0)^{\tau_n}} is uniformly bounded by 1, so is a square integrable martingale, and Ito’s isometry gives

\displaystyle  \displaystyle{\mathbb E}\left[[N^n]_{\tau_n\wedge t}\right]={\mathbb E}\left[(N^n_{\tau_n\wedge t}-N^n_0)^2\right]\rightarrow0

as n goes to infinity. The limit here follows from the fact that {N^n_{\tau_n\wedge t}-N^n_0} is bounded by 1 and tends to zero in probability. So, {[N^n]_{\tau_n\wedge t}} tends to zero in the {L^1} norm and, hence, in probability, as required.

Now suppose that the third statement holds. This immediately gives {N^n_0\rightarrow0} in probability. Let {\tau_n} be the first time at which {[N^n]\ge1} and let {\xi^n} be elementary predictable processes with {\vert\xi^n\vert\le1}. Ito’s isometry gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\left(\int_0^{\tau_n\wedge t}\xi^n\,dN^n\right)^2\right] &\displaystyle\le{\mathbb E}\left[\int_0^{\tau_n\wedge t}(\xi^n)^2\,d[N^n]\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[[N^n]_{\tau_n\wedge t}\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[[N^n]_t\wedge1\right] \rightarrow0. \end{array}

So, in particular, {\int_0^{\tau_n\wedge t}\xi^n\,dN^n\rightarrow0} in probability. Finally, as {\tau_n > t} whenever {[N^n]_t < 1}, which has probability one in the limit {n\rightarrow\infty}, this shows that {\int_0^t\xi^n\,dN^n} tends to zero in probability and {N^n} tends to zero in the semimartingale topology. ⬜

Applying the previous result to stochastic integrals with respect to a continuous local martingale gives a particularly strong extension of the dominated convergence theorem in this case. Note that this reduces convergence of the stochastic integral to convergence in probability of Lebesgue-Stieltjes integrals with respect to {[X]}.

Theorem 7 Let X be a continuous local martingale and {\{\xi^n\}_{n\in{\mathbb N}}}, {\xi} be X-integrable processes. Then, the following are equivalent.

  1. {\int\xi^n\,dX} converges ucp to {\int\xi\,dX}.
  2. {\int\xi^n\,dX} converges to {\int\xi\,dX} in the semimartingale topology.
  3. {\int_0^t(\xi^n-\xi)^2\,d[X]\rightarrow0} in probability, for each {t\ge0}.

Proof: This follows from applying Lemma 6 to the continuous local martingales {M^n=\int\xi^n\,dX} and {M=\int\xi\,dX}. ⬜

Theorem 7 also provides an alternative route to constructing the stochastic integral with respect to continuous local martingales. Although, in these notes, we first proved that continuous local martingales are semimartingales and used this to imply the existence of the quadratic variation, it is possible to construct the quadratic variation more directly. Once this is done, the space {L^1(X)} of X-integrable processes can be defined to be the predictable processes {\xi} such that {\int_0^t\xi^2\,d[X]} is almost surely finite for all times t. Define the topology on {L^1(X)} so that {\xi^n\rightarrow\xi} if and only if {\int_0^t(\xi^n-\xi)^2\,d[X]\rightarrow0} in probability as {n\rightarrow\infty} for each t, and use ucp convergence for the topology on the integrals {\int\xi\,dX}. Then, Theorem 7 says that {\xi\mapsto\int\xi\,dX} is the unique continuous extension from the elementary integrands to all of {L^1(X)}.
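The convergence criterion of Theorem 7 is easy to test numerically. The following sketch (illustrative only; the integrand {\xi_s=s^{-1/4}} and the truncation levels are arbitrary choices) checks condition 3 for truncated integrands and compares the corresponding Riemann-sum approximations of the integrals along a single simulated Brownian path:

```python
import numpy as np

# Simulation sketch of Theorem 7's criterion. Take xi_s = s**(-1/4),
# which is B-integrable since int_0^1 s**(-1/2) ds = 2 < infinity,
# and truncations xi^n = xi /\ n. Condition 3 holds:
# int_0^1 (xi^n - xi)^2 ds -> 0 as n -> infinity, and the
# Riemann-sum integrals of xi^n approach that of xi.

rng = np.random.default_rng(3)
m = 200000
dt = 1.0 / m
s = (np.arange(m) + 1) * dt        # stay away from s = 0
xi = s**-0.25
dB = rng.normal(0.0, np.sqrt(dt), size=m)

full = float((xi * dB).sum())      # approximates int_0^1 xi dB
crits, errs = [], []
for n in (1, 2, 4, 8):
    xin = np.minimum(xi, n)
    crits.append(float(((xin - xi) ** 2).sum() * dt))
    errs.append(abs(float((xin * dB).sum()) - full))
    print(n, crits[-1], errs[-1])
# the criterion decreases to 0, and the integral error shrinks with it
```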

17 thoughts on “Continuous Local Martingales”

  1. Hi, nice blog!

    In Theorem 5, what do you mean by [X]_{\infty} < \infty? That the quadratic variation is uniformly bounded by a constant C?

    Otherwise your argument would probably not work: if [X] were only a.s. finite, then an "n" such that [X] \leq n a.s. need not exist (example: a normal random variable takes only finite values and is therefore a.s. finite, but there is no upper bound on the values it takes)…

    Is that the way to understand your statement?

    Cheers

    Chris

    1. No, all that matters is that [X]_\infty < n for some n. n is not fixed, so uniform boundedness is not needed. You don't even need [X]_\infty to be almost surely finite. Even if it is only finite on a set of probability p < 1, X will converge on that set (up to a zero probability set).

  2. Ok, still not sure whether I can follow you.

    When [X]_\infty < n for some n and this holds for all \omega in the sample space then it is uniformly bounded by a constant.

    When you mean [X]_\infty(\omega) < n(\omega), that is, for every element \omega in the sample space you find an n such that [X]_\infty is bounded by n on that given \omega only, then the subsequent estimate E[(X^{\tau_n}_t)^2] \leq n does not hold anymore. That estimate would only hold if you had on the right something like the ess sup of n(\omega), the essential supremum of n over all elements of the sample space. However, nothing guarantees that this supremum even exists…

  3. I think you’re still misunderstanding my argument. There is no need to be thinking about essential supremums. I’ll try modifying the argument and proof to make it clearer. I’m not able to do this right now as I’m away from my computer (just replying by mobile). Hopefully will have time later tonight.

    For now, the theorem could be stated more precisely as follows. Define
    \displaystyle A=\{\omega\in\Omega\colon [X]_\infty(\omega) < \infty \}.
    Then, outside of a set of probability zero, {X_\infty=\lim_{t\rightarrow\infty} X_t} exists and is finite on A, while {\limsup_{t\rightarrow\infty} X_t=\infty} and {\liminf_{t\rightarrow\infty} X_t=-\infty} outside of A.

  4. Update: I have added a couple of extra results to this post. Lemma 6 shows that ucp convergence, semimartingale convergence and convergence in probability of quadratic variations all coincide. Theorem 7 uses this to give a much stronger version of the dominated convergence theorem for continuous local martingales.

  5. Hi George,

    I have been looking for an answer to the following question on continuous local martingales:
    Can one construct a non-zero (in the sense of having P>0 of being nonzero) continuous local martingale which is identically equal to $0$ P-a.e. at a fixed time $T$? It has to do with an option hedging problem I am working on.

    Thanks in advance!
    Tigran

  6. Hi George, in the proof of Theorem 3, you use a stopping time with respect to the right continuous filtration and claim that Y^\tau is a local martingale (I assume with respect to the right continuous filtration), but why would a local martingale in the original filtration still be a local martingale in the right continuous filtration?

    1. A right-continuous martingale will still be a martingale under the right-continuous filtration. For times s < t < u, \mathbb E[X_u\vert\mathcal F_t]=X_t.
      Take conditional expectations of this and let t decrease to s to get \mathbb E[X_u\vert\mathcal F_{s+}]=X_s.

      Local martingales are right-continuous by definition (in any case, the theorem assumes that it is continuous). Any localizing sequence of stopping times will also be a localizing sequence in the larger filtration, so it remains a local martingale in the right-continuous filtration.
