# Martingale Convergence

The martingale property is strong enough to ensure that, under relatively weak conditions, we are guaranteed convergence of the processes as time goes to infinity. In a previous post, I used Doob’s upcrossing inequality to show that, with probability one, discrete-time martingales will converge at infinity under the extra condition of ${L^1}$-boundedness. Here, I consider continuous-time martingales. This is a more general situation, because it considers limits as time runs through the uncountably infinite set of positive reals instead of the countable set of positive integer times. Although these results can also be proven in a similar way by counting the upcrossings of a process, I instead show how they follow directly from the existence of cadlag modifications. We work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}$.

Recall that a stochastic process ${X}$ is ${L^1}$-bounded if the set ${\{X_t\colon t\in{\mathbb R}_+\}}$ is ${L^1}$-bounded. That is, ${{\mathbb E}|X_t|}$ is bounded above by some finite value as ${t}$ runs through the positive reals.

Theorem 1 Let ${X}$ be a cadlag and ${L^1}$-bounded martingale (or submartingale, or supermartingale). Then, the limit ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists and is finite, with probability one.

Proof: It suffices to consider submartingales, as the martingale and supermartingale cases follow from applying the result to ${-X}$. So, suppose that ${X}$ is a submartingale such that ${{\mathbb E}|X_t|}$ is bounded above by some real number ${K>0}$. Define the process ${Y}$ by

 $\displaystyle Y_t = \begin{cases} X_{t/(1-t)},&\textrm{if }t<1,\\ 0,&\textrm{if }t\ge 1, \end{cases}$

and the filtration ${\mathcal{G}_t=\mathcal{F}_{t/(1-t)}}$ for ${t<1}$ and ${\mathcal{G}_t=\mathcal{F}_\infty}$ otherwise. Clearly ${Y}$ is right-continuous and is a submartingale with respect to this filtration over the range ${t< 1}$. The idea is to use the results of the post on cadlag modifications to show that it has a cadlag version, in which case the limit ${\lim_{t\rightarrow\infty}X_t=\lim_{t\uparrow\uparrow 1}Y_t}$ exists almost surely.

Consider an elementary process of the form

 $\displaystyle \xi_t=Z_01_{\{t=0\}}+\sum_{k=1}^nZ_k1_{\{t_{k-1}<t\le t_k\}}$

for a finite sequence of times ${0=t_0\le t_1\le\cdots\le t_n}$, ${\mathcal{G}_{t_{k-1}}}$-measurable random variables ${Z_k}$ and a ${\mathcal{G}_0}$-measurable random variable ${Z_0}$. If ${0\le\xi\le 1}$ then ${0\le Z_k\le 1}$. Also, let ${t_m}$ be the first of the times satisfying ${t_m\ge 1}$. Then, using the submartingale property for ${Y}$,

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rcl} \displaystyle{\mathbb E}\left[\int_0^\infty\xi\,dY\right]&\displaystyle=&\displaystyle{\mathbb E}\left[\int_0^{t_{m-1}}\xi\,dY\right]-{\mathbb E}[Z_mY_{t_{m-1}}]\smallskip\\ &\displaystyle\le&\displaystyle{\mathbb E}[Y_{t_{m-1}}-Y_0]+{\mathbb E}|Y_{t_{m-1}}| \le 3K. \end{array}$

So the set of expectations of elementary integrals ${{\mathbb E}[\int\xi\,dY]}$ for ${0\le\xi\le 1}$ is bounded by ${3K}$, and it follows that ${Y}$ has a cadlag modification. ⬜
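The time change used in the proof is deterministic, so it can be sanity-checked numerically. Here is a minimal, purely illustrative sketch (my own, not part of the original argument): ${t\mapsto t/(1-t)}$ maps ${[0,1)}$ bijectively onto ${[0,\infty)}$ with inverse ${s\mapsto s/(1+s)}$, so letting ${t}$ increase to ${1}$ in ${Y}$ corresponds to letting the time parameter of ${X}$ increase to infinity.

```python
import numpy as np

# Sanity check on the time change in the proof of Theorem 1:
# t -> t/(1-t) maps [0, 1) onto [0, infinity), with inverse s -> s/(1+s).
t = np.linspace(0.0, 0.99, 100)
s = t / (1.0 - t)          # times for X corresponding to times t for Y
t_back = s / (1.0 + s)     # applying the inverse recovers t

print(np.allclose(t_back, t))  # True: the two maps are mutual inverses
```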

A particularly simple consequence of this is that nonnegative martingales and supermartingales always have well defined limits at infinity.

Corollary 2 If ${X}$ is a nonnegative cadlag martingale (or supermartingale) then the limit ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists and is finite, with probability one.

Proof: The supermartingale property gives ${{\mathbb E}[|X_t|]={\mathbb E}[X_t]\le{\mathbb E}[X_0]}$, showing that ${X}$ is ${L^1}$-bounded. So the previous result applies. ⬜
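As a discrete-time illustration of Corollary 2 (the example and all numerical choices are mine, not from the post), consider the nonnegative martingale ${M_n=U_1\cdots U_n}$ with i.i.d. factors ${U_i\in\{1/2,3/2\}}$, each with probability ${1/2}$, so ${{\mathbb E}[U_i]=1}$. Since ${{\mathbb E}[\log U_i]=\log(\sqrt{3}/2)<0}$, the paths converge almost surely to ${0}$, even though ${{\mathbb E}[M_n]=1}$ for every ${n}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative martingale M_n = U_1 * ... * U_n with i.i.d. factors in
# {1/2, 3/2}, each with probability 1/2, so E[M_{n+1} | F_n] = M_n.
# E[log U_i] = log(sqrt(3)/2) < 0, so M_n -> 0 almost surely.
n_steps = 2000
factors = rng.choice([0.5, 1.5], size=(100, n_steps))  # 100 sample paths
paths = np.cumprod(factors, axis=1)

print(paths[:, -1].max())  # every path is essentially at 0 by step 2000
```

This also shows that the almost-sure limit need not agree with the limit of the expectations: convergence here is not in ${L^1}$.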

For uniformly integrable martingales, the following stronger result is obtained. Recall that a process ${X}$ is uniformly integrable if the set of random variables ${\{X_t\colon t\in{\mathbb R}_+\}}$ is uniformly integrable.

Theorem 3 Let ${X}$ be a uniformly integrable cadlag martingale. Then, the limit ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists with probability one. Furthermore, this is the unique (up to a zero probability set) integrable and ${\mathcal{F}_\infty}$-measurable random variable such that ${X_t={\mathbb E}[X_\infty\mid\mathcal{F}_t]}$ for each ${t\in{\mathbb R}_+}$.

Proof: As uniformly integrable sets are ${L^1}$-bounded, Theorem 1 shows that the limit ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists and is almost surely finite. Uniform integrability allows us to take the limit ${s\rightarrow\infty}$ in the equality ${X_t={\mathbb E}[X_s\mid\mathcal{F}_t]}$ (for ${s\ge t}$) to obtain ${X_t={\mathbb E}[X_\infty\mid\mathcal{F}_t]}$. It only remains to show that this is the only integrable ${\mathcal{F}_\infty}$-measurable random variable with this property. So, suppose that ${X_t={\mathbb E}[Z\mid\mathcal{F}_t]}$ for all ${t}$ and some integrable and ${\mathcal{F}_\infty}$-measurable ${Z}$. Then

 $\displaystyle {\mathbb E}[1_A(X_\infty - Z)]={\mathbb E}[1_A({\mathbb E}[X_\infty\mid\mathcal{F}_t]-{\mathbb E}[Z\mid\mathcal{F}_t])]=0$

for all ${t\in{\mathbb R}_+}$ and ${A\in\mathcal{F}_t}$. The monotone class theorem extends this to all ${A\in\mathcal{F}_\infty}$ from which it follows that ${Z=X_\infty}$ almost surely. ⬜
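A simple discrete-time instance of a martingale closed by its limit, as in Theorem 3 (this construction is my own illustration, not taken from the post): with i.i.d. fair bits ${\epsilon_k}$, the random variable ${Z=\sum_k\epsilon_k2^{-k}}$ is uniform on ${[0,1]}$ and ${X_n={\mathbb E}[Z\mid\epsilon_1,\dots,\epsilon_n]=\sum_{k\le n}\epsilon_k2^{-k}+2^{-(n+1)}}$ is a bounded (hence uniformly integrable) martingale converging to ${Z}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Bounded martingale X_n = E[Z | eps_1, ..., eps_n], where
# Z = sum_k eps_k 2^{-k} is built from i.i.d. fair bits eps_k.
# Conditioning on the first n bits fixes the partial sum and replaces the
# tail by its mean 2^{-(n+1)}, so X_n = partial sum + 2^{-(n+1)}.
n = 50
bits = rng.integers(0, 2, size=n)
partial = np.cumsum(bits * 0.5 ** np.arange(1, n + 1))
X = partial + 0.5 ** np.arange(2, n + 2)  # X_1, ..., X_n
Z = partial[-1]                           # Z, up to floating-point precision

print(abs(X[-1] - Z))  # the gap 2^{-(n+1)} is negligible for n = 50
```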

#### Continuous martingales

Even stronger convergence results exist for continuous martingales.

Theorem 4 Let ${X}$ be a continuous martingale. Then, almost surely, one of the following is satisfied

• ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists and is finite.
• ${\limsup_{t\rightarrow\infty}X_t=\infty}$ and ${\liminf_{t\rightarrow\infty}X_t=-\infty}$. In this case, the process hits every value in ${{\mathbb R}}$ at arbitrarily large times.

In the second statement above, the property that the process hits every value at arbitrarily large times is nothing more than the intermediate value theorem applied to ${\limsup_{t\rightarrow\infty}X_t=\infty}$ and ${\liminf_{t\rightarrow\infty}X_t=-\infty}$.

Consider, for example, a standard Brownian motion ${B}$. The increments ${B_{n+1}-B_n}$ form a sequence of independent standard normal random variables, so ${B_t}$ cannot converge as ${t}$ goes to infinity. Consequently, the second statement of Theorem 4 holds, showing that, with probability one, Brownian motion hits every real value at arbitrarily large times.
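This oscillation is easy to see numerically. The sketch below uses a long simple random walk as a discrete approximation to Brownian motion (an approximation of my own choosing, not the continuous-time process itself); the simulated path strays far above and below zero rather than settling down:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-walk approximation to Brownian motion at integer times.
# The path does not converge: it repeatedly reaches large positive
# and large negative values, as in the second case of Theorem 4.
n = 10**6
B = np.cumsum(rng.standard_normal(n))

print(B.max(), B.min())  # far above zero and far below zero
```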

Theorem 4 follows from applying the following to both ${X}$ and ${-X}$.

Lemma 5 Let ${X}$ be a continuous submartingale. Then, almost surely, one of the following is satisfied.

• ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ exists and is finite.
• ${\limsup_{t\rightarrow\infty}X_t=\infty}$.

Proof: Let ${\tau_n}$ be the sequence of stopping times

 $\displaystyle \tau_n=\inf\left\{t\in{\mathbb R}_+\colon X_t\ge n\right\}.$

Then, ${n-1_{\{\tau_n > 0\}}X^{\tau_n}}$ is a nonnegative supermartingale (continuity ensures that the stopped process ${X^{\tau_n}}$ never exceeds ${n}$ when ${\tau_n>0}$), and Corollary 2 says that it must converge at infinity. On the event ${\{\tau_n=\infty\}}$ this means that ${X}$ itself converges, while on ${\{\tau_n<\infty\}}$ we have ${\sup_tX_t\ge n}$. Letting ${n}$ increase to infinity gives the result. ⬜

Although it is possible to generalize Theorem 4 to discontinuous processes, it would still be necessary to put restrictions on the size of the jumps ${\Delta X}$. The result does not hold for arbitrary cadlag processes. To see this, consider the following example.

Choose a sequence ${p_n\in(0,1]}$ such that ${\sum_np_n<\infty}$. For example, we could take ${p_n=2^{-n}}$. Then let ${\epsilon_n}$ be a sequence of independent random variables satisfying ${{\mathbb P}(\epsilon_n=1)=p_n}$ and ${{\mathbb P}(\epsilon_n=0)=1-p_n}$. By the Borel-Cantelli lemma, with probability one, ${\epsilon_n=0}$ for all large ${n}$. The process

 $\displaystyle X_t=\sum_{n\le t}(-1)^n(1-\epsilon_n/p_n)$

is a martingale with respect to its natural filtration. However, with probability one, it eventually just oscillates up and down by unit jumps forever, so it neither converges nor diverges to infinity.
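A short simulation of this counterexample (taking ${p_n=2^{-n}}$ as suggested above; the path length is an arbitrary choice of mine) makes the eventual unit oscillation visible. Each increment ${(-1)^n(1-\epsilon_n/p_n)}$ has mean zero, but once ${\epsilon_n=0}$ for all remaining ${n}$, the increments are exactly ${\pm1}$ forever:

```python
import numpy as np

rng = np.random.default_rng(3)

# The counterexample martingale X_t = sum_{n<=t} (-1)^n (1 - eps_n/p_n)
# with p_n = 2^{-n} and P(eps_n = 1) = p_n.  Each increment has mean zero,
# but eps_n = 0 for all large n, so the tail of the path just alternates
# by +-1: the process neither converges nor diverges to infinity.
N = 60
n = np.arange(1, N + 1)
p = 0.5 ** n
eps = (rng.random(N) < p).astype(float)
increments = (-1.0) ** n * (1.0 - eps / p)
X = np.cumsum(increments)

print(increments[-10:])  # alternating +1, -1 jumps
```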

## 9 thoughts on “Martingale Convergence”

1. Tigran says:

Dear George,

I’m curious as to why exactly the cadlag property is required for Theorem 1. The proof seems to hold without any use of it.

Thanks,
Tigran

1. The proof uses the fact that the constructed process Y is right-continuous and, as any two right-continuous modifications of the same process are equal (up to evanescence), Y must equal its cadlag modification and, hence, have left-limits.

So, you need right-continuity. For martingales, right-continuity is equivalent to being cadlag.

2. Anonymous says:

I am wondering how you get that $X_t={\mathbb E}[X_\infty\mid\mathcal{F}_t]$ follows from uniform convergence in the proof of theorem 3 since $(Y_t)_{t\geq 0}$ uniformly integrable and $Y_t\rightarrow_{t\rightarrow\infty} Y_\infty$ does not necessarily imply $E[Y_t\mid\mathcal{G}]\rightarrow E[Y_\infty\mid \mathcal{G}]$?

1. I think it does imply $E[Y_t\mid\mathcal{G}]\rightarrow E[Y_\infty\mid \mathcal{G}]$, with $L^1$ convergence.

3. Sid says:

Dear George,

For Theorem 4 and Lemma 5, we require the martingale to be continuous. You have given an example of a cadlag martingale for which the conclusion of Theorem 4 fails.

However, in the proof of Lemma 5, where have you used the fact that the submartingale is continuous? For the sequence of hitting times to be stopping times we just require right-continuity (debut theorem).

1. This is implicit in the statement that $n - X^{\tau_n}$ is nonnegative. If X were not continuous, it could jump right past the level n, resulting in $n - X^{\tau_n}$ going negative.

Actually, to be strict, I should have assumed (wlog) that X starts from 0, just to ensure that it does not start from above n [Update: I fixed the proof to address this].

1. Sid says:

Thank you for the response.