# Lévy’s Characterization of Brownian Motion

Standard Brownian motion, ${\{B_t\}_{t\ge 0}}$, is defined to be a real-valued process satisfying the following properties.

1. ${B_0=0}$.
2. ${B_t-B_s}$ is normally distributed with mean 0 and variance ${t-s}$ independently of ${\{B_u\colon u\le s\}}$, for any ${t>s\ge 0}$.
3. B has continuous sample paths.

As always, all that really matters is that these properties hold almost surely. Now, to apply the techniques of stochastic calculus, it is assumed that there is an underlying filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}$, which necessitates a further definition: a process B is a Brownian motion on a filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}$ if, in addition to the above properties, it is also adapted, so that ${B_t}$ is ${\mathcal{F}_t}$-measurable, and ${B_t-B_s}$ is independent of ${\mathcal{F}_s}$ for each ${t>s\ge 0}$. Note that the condition that ${B_t-B_s}$ is independent of ${\{B_u\colon u\le s\}}$ need not be stated explicitly, as it follows from the independence from the larger sigma-algebra ${\mathcal{F}_s}$. According to these definitions, a process is a Brownian motion if and only if it is a Brownian motion with respect to its natural filtration.

The property that ${B_t-B_s}$ has zero mean independently of ${\mathcal{F}_s}$ means that Brownian motion is a martingale. Furthermore, we previously calculated its quadratic variation as ${[B]_t=t}$. An incredibly useful result is that the converse statement holds. That is, Brownian motion is the only local martingale with this quadratic variation. This is known as Lévy’s characterization, and shows that Brownian motion is a particularly general stochastic process, justifying its ubiquitous influence on the study of continuous-time stochastic processes.

Theorem 1 (Lévy’s Characterization of Brownian Motion) Let X be a local martingale with ${X_0=0}$. Then, the following are equivalent.

1. X is standard Brownian motion on the underlying filtered probability space.
2. X is continuous and ${X^2_t-t}$ is a local martingale.
3. X has quadratic variation ${[X]_t=t}$.
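As a quick numerical illustration of the third property, the realized quadratic variation of a simulated Brownian path over a fine partition should be close to t. The sketch below is only a sanity check, not part of the theorem; the grid size and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a standard Brownian path on [0, 1] from i.i.d. N(0, dt) increments.
n = 100_000          # number of partition intervals (arbitrary choice)
t = 1.0
dt = t / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)
B = np.concatenate(([0.0], np.cumsum(increments)))

# Realized quadratic variation over the partition: sum of squared increments.
qv = float(np.sum(np.diff(B) ** 2))

print(f"realized [B]_1 over the partition: {qv:.4f}  (theoretical value: {t})")
```

Refining the partition (increasing `n`) makes the sum of squared increments converge in probability to ${[B]_t=t}$.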

# Continuous Local Martingales

Continuous local martingales are a particularly well behaved subset of the class of all local martingales, and the results of the previous two posts become much simpler in this case. First, the continuous local martingale property is always preserved by stochastic integration.

Theorem 1 If X is a continuous local martingale and ${\xi}$ is X-integrable, then ${\int\xi\,dX}$ is a continuous local martingale.

Proof: As X is continuous, ${Y\equiv\int\xi\,dX}$ will also be continuous and, therefore, locally bounded. Then, by preservation of the local martingale property, Y is a local martingale. ⬜

Next, the quadratic variation of a continuous local martingale X provides us with a necessary and sufficient condition for X-integrability.

Theorem 2 Let X be a continuous local martingale. Then, a predictable process ${\xi}$ is X-integrable if and only if

 $\displaystyle \int_0^t\xi^2\,d[X]<\infty$

almost surely, for all ${t>0}$.
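For example, when X is a Brownian motion we have ${[X]_t=t}$, so the condition reduces to ${\int_0^t\xi_s^2\,ds<\infty}$. The deterministic integrand ${\xi_s=s^{-1/4}}$ satisfies this (the integral tends to ${2\sqrt{t}}$), while ${\xi_s=s^{-1/2}}$ does not (the integral diverges logarithmically). A rough numerical sketch of this contrast, with arbitrary grid sizes and a hypothetical helper function:

```python
import numpy as np

# For X a Brownian motion, [X]_t = t, so Theorem 2's condition becomes
# finiteness of the ordinary integral of xi_s^2 over (0, t].
def integral_of_xi_squared(power, eps, t=1.0, n=1_000_000):
    """Riemann-sum approximation of the integral of (s**power)**2 over [eps, t]."""
    s, ds = np.linspace(eps, t, n, retstep=True)
    return float(np.sum((s ** power) ** 2) * ds)

# xi_s = s**-0.25 satisfies the condition (integral -> 2*sqrt(t) as eps -> 0),
# while xi_s = s**-0.5 fails it (integral grows like log(1/eps)).
for eps in (1e-2, 1e-4, 1e-6):
    ok = integral_of_xi_squared(-0.25, eps)
    bad = integral_of_xi_squared(-0.5, eps)
    print(f"eps={eps:.0e}:  xi=s^-1/4 -> {ok:.3f},  xi=s^-1/2 -> {bad:.3f}")
```

As the lower cutoff `eps` shrinks, the first integral stabilizes near 2 while the second keeps growing, matching the dichotomy in the theorem.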

# Preservation of the Local Martingale Property

Now that it has been shown that stochastic integration can be performed with respect to any local martingale, we can move on to the following important result. Stochastic integration preserves the local martingale property. At least, this is true under very mild hypotheses. That the martingale property is preserved under integration of bounded elementary processes is straightforward. The generalization to predictable integrands can be achieved using a limiting argument. It is necessary, however, to restrict to locally bounded integrands and, for the sake of generality, I start with local sub and supermartingales.

Theorem 1 Let X be a local submartingale (resp., local supermartingale) and ${\xi}$ be a nonnegative and locally bounded predictable process. Then, ${\int\xi\,dX}$ is a local submartingale (resp., local supermartingale).

Proof: We only need to consider the case where X is a local submartingale, as the result for supermartingales then follows by applying it to ${-X}$. By localization, we may suppose that ${\xi}$ is uniformly bounded and that X is a proper submartingale. So, ${\vert\xi\vert\le K}$ for some constant K. Then, as previously shown, there exists a sequence of elementary predictable processes ${\vert\xi^n\vert\le K}$ such that ${Y^n\equiv\int\xi^n\,dX}$ converges to ${Y\equiv\int\xi\,dX}$ in the semimartingale topology and, hence, converges ucp. We may replace ${\xi^n}$ by ${\xi^n\vee0}$ if necessary so that, being nonnegative elementary integrals of a submartingale, the ${Y^n}$ will be submartingales. Also, ${\vert\Delta Y^n\vert=\vert\xi^n\Delta X\vert\le K\vert\Delta X\vert}$. Recall that a cadlag adapted process X is locally integrable if and only if its jump process ${\Delta X}$ is locally integrable, and all local submartingales are locally integrable. So,

$\displaystyle \sup_n\vert\Delta Y^n_t\vert\le K\vert\Delta X_t\vert$

is locally integrable. Then, by ucp convergence for local submartingales, Y will satisfy the local submartingale property. ⬜

For local martingales, applying this result to ${\pm X}$ gives,

Theorem 2 Let X be a local martingale and ${\xi}$ be a locally bounded predictable process. Then, ${\int\xi\,dX}$ is a local martingale.

This result can immediately be extended to the class of local ${L^p}$-integrable martingales, denoted by ${\mathcal{M}^p_{\rm loc}}$.

Corollary 3 Let ${X\in\mathcal{M}^p_{\rm loc}}$ for some ${0< p\le\infty}$ and ${\xi}$ be a locally bounded predictable process. Then, ${\int\xi\,dX\in\mathcal{M}^p_{\rm loc}}$.
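A Monte Carlo sketch of Theorem 2: integrating the bounded predictable process ${\xi_t={\rm sign}(X_{t-})}$ against a simulated Brownian motion should again give a martingale started at zero. Predictability is approximated below by evaluating the integrand at the left endpoint of each grid interval; the sample sizes and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate many Brownian paths on [0, 1] and form the elementary integral
# Y_1 = sum_i xi_{t_i} (B_{t_{i+1}} - B_{t_i}) with xi = sign(B) sampled at
# the left endpoint of each interval, so that xi is predictable.
paths, steps, t = 20_000, 500, 1.0
dt = t / steps
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, steps))
B = np.cumsum(dB, axis=1)
B_left = np.concatenate([np.zeros((paths, 1)), B[:, :-1]], axis=1)  # B_{t_i}
xi = np.sign(B_left)                  # bounded predictable integrand
Y1 = np.sum(xi * dB, axis=1)          # elementary integral at time 1

print(f"sample mean of Y_1: {Y1.mean():+.4f}   (martingale: should be near 0)")
print(f"sample mean of Y_1^2: {(Y1**2).mean():.4f}  (near 1, up to discretization)")
```

The zero sample mean reflects the preserved martingale property; the second moment matches the Itô isometry value ${\int_0^1{\mathbb E}[\xi_s^2]\,ds\approx 1}$ for this integrand.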

# Local Martingales

Recall from the previous post that a cadlag adapted process ${X}$ is a local martingale if there is a sequence ${\tau_n}$ of stopping times increasing to infinity such that the stopped processes ${1_{\{\tau_n>0\}}X^{\tau_n}}$ are martingales. Local submartingales and local supermartingales are defined similarly.

An example of a local martingale which is not a martingale is given by the ‘double-loss’ gambling strategy. Interestingly, in 18th century France, such strategies were known as martingales, which is the origin of the mathematical term. Suppose that a gambler is betting sums of money, with even odds, on a simple win/lose game. For example, betting that a coin toss comes up heads. He could bet one dollar on the first toss and, if he loses, double his stake to two dollars for the second toss. If he loses again, then he is down three dollars and doubles the stake again to four dollars. If he keeps on doubling the stake after each loss in this way, then he is always gambling one more dollar than the total losses so far. He only needs to continue in this way until the coin eventually does come up heads, and he walks away with net winnings of one dollar. This therefore describes a fair game where, eventually, the gambler is guaranteed to win.

Of course, this is not an effective strategy in practice. The losses grow exponentially and, if he doesn’t win quickly, the gambler must hit his credit limit in which case he loses everything. All that the strategy achieves is to trade a large probability of winning a dollar against a small chance of losing everything. It does, however, give a simple example of a local martingale which is not a martingale.

The gambler’s winnings can be defined by a stochastic process ${\{Z_n\}_{n=1,\ldots}}$ representing his net gain (or loss) just before the n’th toss. Let ${\epsilon_1,\epsilon_2,\ldots}$ be a sequence of independent random variables with ${{\mathbb P}(\epsilon_n=1)={\mathbb P}(\epsilon_n=-1)=1/2}$. Here, ${\epsilon_n}$ represents the outcome of the n’th toss, with 1 referring to a head and -1 referring to a tail. Set ${Z_1=0}$ and

$\displaystyle Z_{n}=\begin{cases} 1,&\text{if }Z_{n-1}=1,\\ Z_{n-1}+\epsilon_n(1-Z_{n-1}),&\text{otherwise}. \end{cases}$

This is a martingale with respect to its natural filtration, starting at zero and, eventually, ending up equal to one. It can be converted into a local martingale by speeding up the time scale to fit infinitely many tosses into a unit time interval

$\displaystyle X_t=\begin{cases} Z_n,&\text{if }1-1/n\le t<1-1/(n+1),\\ 1,&\text{if }t\ge 1. \end{cases}$

This is a martingale with respect to its natural filtration on the time interval ${[0,1)}$. Letting ${\tau_n=\inf\{t\colon\vert X_t\vert\ge n\}}$, the optional stopping theorem shows that ${X^{\tau_n}_t}$ is a uniformly bounded martingale on ${t<1}$, continuous at ${t=1}$, and constant on ${t\ge 1}$. This is therefore a martingale, showing that ${X}$ is a local martingale. However, ${{\mathbb E}[X_1]=1\not={\mathbb E}[X_0]=0}$, so it is not a martingale.
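The doubling strategy is easy to simulate. The sketch below tracks the net gain after each toss (a slight shift from the text’s indexing, where ${Z_n}$ is the gain just before the n’th toss); sample size and toss count are arbitrary choices. The empirical mean after the first toss is close to the theoretical value of zero, yet essentially every simulated path has reached 1 after 30 tosses: the rare, enormous losses which balance the mean are too unlikely to appear in a finite sample, which is precisely the heavy-tailed behaviour that stops X from being a martingale at ${t=1}$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Doubling strategy: stake 1 - Z before each toss until the first head,
# after which the winnings are frozen at 1.
paths, tosses = 100_000, 30
Z = np.zeros(paths)
means = []
for _ in range(tosses):
    eps = rng.choice([-1.0, 1.0], size=paths)      # coin tosses
    active = Z != 1.0                              # still waiting for a head
    Z = np.where(active, Z + eps * (1.0 - Z), Z)   # win -> 1, lose -> 2Z - 1
    means.append(float(Z.mean()))

print(f"empirical mean after toss 1:  {means[0]:+.4f}  (theoretical mean: 0)")
print(f"empirical mean after toss 30: {means[-1]:+.4f} (rare huge losses unseen)")
print(f"fraction of paths at Z = 1 after {tosses} tosses: {(Z == 1.0).mean():.5f}")
```

A path still losing after n tosses sits at ${-(2^n-1)}$ with probability ${2^{-n}}$, so the exact mean is zero for every n even though almost all sampled paths end at 1.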

# Localization

Special classes of processes, such as martingales, are very important to the study of stochastic calculus. In many cases, however, processes under consideration ‘almost’ satisfy the martingale property, but are not actually martingales. This occurs, for example, when taking limits or stochastic integrals with respect to martingales. It is necessary to generalize the martingale concept to that of local martingales. More generally, localization is a method of extending a given property to a larger class of processes. In this post I mention a few definitions and simple results concerning localization, and look more closely at local martingales in the next post.

Definition 1 Let P be a class of stochastic processes. Then, a process X is locally in P if there exists a sequence of stopping times ${\tau_n\uparrow\infty}$ such that the stopped processes

$\displaystyle 1_{\{\tau_n>0\}}X^{\tau_n}$

are in P. The sequence ${\tau_n}$ is called a localizing sequence for X (w.r.t. P).

I write ${P_{\rm loc}}$ for the processes locally in P. Choosing the sequence ${\tau_n\equiv\infty}$ of stopping times shows that ${P\subseteq P_{\rm loc}}$. A class of processes is said to be stable if ${1_{\{\tau>0\}}X^\tau}$ is in P whenever X is, for all stopping times ${\tau}$. For example, the optional stopping theorem shows that the classes of cadlag martingales, cadlag submartingales and cadlag supermartingales are all stable.
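The standard choice of localizing sequence for a continuous process is the family of first exit times ${\tau_n=\inf\{t\colon\vert X_t\vert\ge n\}}$: these are stopping times increasing to infinity, and each stopped process is bounded by n. A minimal numerical sketch on a simulated Brownian path (the horizon and grid are arbitrary, and on a discrete grid the stopped value can slightly overshoot n):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate one Brownian path on [0, 100] and compute the first exit times
# tau_n = inf{t : |X_t| >= n}.  They are nondecreasing in n, and the path
# stopped at tau_n stays within [-n, n] up to a tiny grid overshoot.
steps, t_max = 1_000_000, 100.0
dt = t_max / steps
X = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), steps))))
times = np.linspace(0.0, t_max, steps + 1)

taus = []
for n in (1, 2, 4, 8):
    hit = np.abs(X) >= n
    tau = float(times[np.argmax(hit)]) if hit.any() else float("inf")
    taus.append(tau)
    print(f"tau_{n} = {tau:.4f}")
```

Levels the path never reaches within the sample window report `tau = inf`, consistent with ${\tau_n\uparrow\infty}$.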

Definition 2 A process is

1. a local martingale if it is locally in the class of cadlag martingales.
2. a local submartingale if it is locally in the class of cadlag submartingales.
3. a local supermartingale if it is locally in the class of cadlag supermartingales.