Predictable Stopping Times

Although this post is under the heading of ‘the general theory of semimartingales’, it is not, strictly speaking, about semimartingales at all. Instead, I will be concerned with a characterization of predictable stopping times. The reason for including this now is twofold. First, the results are too advanced to have been proven in the earlier post on predictable stopping times, and reasonably efficient self-contained proofs can only be given now that we have already built up a certain amount of stochastic calculus theory. Secondly, the results stated here are indispensable to the further study of semimartingales. In particular, standard semimartingale decompositions require some knowledge of predictable processes and predictable stopping times.

Recall that a stopping time ${\tau}$ is said to be predictable if there exists a sequence of stopping times ${\tau_n\le\tau}$ increasing to ${\tau}$ and such that ${\tau_n < \tau}$ whenever ${\tau > 0}$. Also, the predictable sigma-algebra ${\mathcal{P}}$ is defined as the sigma-algebra generated by the left-continuous and adapted processes. Stated like this, these two concepts can appear quite different. However, as was previously shown, stochastic intervals of the form ${[\tau,\infty)}$ for predictable times ${\tau}$ are all in ${\mathcal{P}}$ and, in fact, generate the predictable sigma-algebra.

The main result (Theorem 1) of this post is to show that a converse statement holds, so that ${[\tau,\infty)}$ is in ${\mathcal{P}}$ if and only if the stopping time ${\tau}$ is predictable. This rather simple-sounding result does have many far-reaching consequences. We can use it to show that all cadlag predictable processes are locally bounded, that local martingales are predictable if and only if they are continuous, and also to give a characterization of cadlag predictable processes in terms of their jumps. Some very strong statements about stopping times also follow without much difficulty for certain special stochastic processes. For example, if the underlying filtration is generated by a Brownian motion then every stopping time is predictable. Actually, this is true whenever the filtration is generated by a continuous Feller process. It is also possible to give a surprisingly simple characterization of stopping times for filtrations generated by arbitrary non-continuous Feller processes. Precisely, a stopping time ${\tau}$ is predictable if the process is almost surely continuous at time ${\tau}$ and is totally inaccessible if the underlying Feller process is almost surely discontinuous at ${\tau}$.

As usual, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}$. I now give a statement and proof of the main result of this post. Note that the equivalence of the four conditions below means that any of them can be used as alternative definitions of predictable stopping times. Often, the first condition below is used instead. Stopping times satisfying the definition used in these notes are sometimes called announceable, with the sequence ${\tau_n\uparrow\tau}$ said to announce ${\tau}$ (this terminology is used by, e.g., Rogers & Williams). Stopping times satisfying property 3 below, which is easily seen to be equivalent to 2, are sometimes called fair. Then, the following theorem says that the sets of predictable, fair and announceable stopping times all coincide.

Theorem 1 Let ${\tau}$ be a stopping time. Then, the following are equivalent.

1. ${[\tau]\in\mathcal{P}}$.
2. ${\Delta M_\tau1_{[\tau,\infty)}}$ is a local martingale for all local martingales M.
3. ${{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0}$ for all cadlag bounded martingales M.
4. ${\tau}$ is predictable.

Lévy Processes

Continuous-time stochastic processes with stationary independent increments are known as Lévy processes. In the previous post, it was seen that processes with independent increments are described by three terms — the covariance structure of the Brownian motion component, a drift term, and a measure describing the rate at which jumps occur. Being a special case of independent increments processes, the situation with Lévy processes is similar. However, stationarity of the increments does simplify things a bit. We start with the definition.

Definition 1 (Lévy process) A d-dimensional Lévy process X is a stochastic process taking values in ${{\mathbb R}^d}$ such that

• independent increments: ${X_t-X_s}$ is independent of ${\{X_u\colon u\le s\}}$ for any ${s < t}$.
• stationary increments: ${X_{s+t}-X_s}$ has the same distribution as ${X_t-X_0}$ for any ${s,t>0}$.
• continuity in probability: ${X_s\rightarrow X_t}$ in probability as s tends to t.

More generally, it is possible to define the notion of a Lévy process with respect to a given filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$. In that case, we also require that X is adapted to the filtration and that ${X_t-X_s}$ is independent of ${\mathcal{F}_s}$ for all ${s < t}$. In particular, if X is a Lévy process according to Definition 1 then it is also a Lévy process with respect to its natural filtration ${\mathcal{F}_t=\sigma(X_s\colon s\le t)}$. Note that slightly different definitions are sometimes used by different authors. It is often required that ${X_0}$ is zero and that X has cadlag sample paths. These are minor points and, as will be shown, any process satisfying the definition above will admit a cadlag modification.

The most common example of a Lévy process is Brownian motion, where ${X_t-X_s}$ is normally distributed with zero mean and variance ${t-s}$ independently of ${\mathcal{F}_s}$. Other examples include Poisson processes, compound Poisson processes, the Cauchy process, gamma processes and the variance gamma process.

For example, the symmetric Cauchy distribution on the real numbers with scale parameter ${\gamma > 0}$ has probability density function p and characteristic function ${\phi}$ given by,

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle p(x)=\frac{\gamma}{\pi(\gamma^2+x^2)},\smallskip\\ &\displaystyle\phi(a)\equiv{\mathbb E}\left[e^{iaX}\right]=e^{-\gamma\vert a\vert}. \end{array}$ (1)

From the characteristic function it can be seen that if X and Y are independent Cauchy random variables with scale parameters ${\gamma_1}$ and ${\gamma_2}$ respectively then ${X+Y}$ is Cauchy with parameter ${\gamma_1+\gamma_2}$. We can therefore consistently define a stochastic process ${X_t}$ such that ${X_t-X_s}$ has the symmetric Cauchy distribution with parameter ${t-s}$ independent of ${\{X_u\colon u\le s\}}$, for any ${s < t}$. This is called a Cauchy process, which is a purely discontinuous Lévy process. See Figure 1.
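The additivity of the scale parameter makes the Cauchy process easy to simulate on a grid: sum independent Cauchy increments, each with scale equal to the step length. The following is a quick sketch in NumPy (the function name, grid size and seed are my own choices, not from the text); the empirical characteristic function of the simulated ${X_1}$ can be compared with ${e^{-\vert a\vert}}$.

```python
import numpy as np

rng = np.random.default_rng(1)

def cauchy_process(T=1.0, n_steps=1000, rng=rng):
    """Simulate a symmetric Cauchy process on [0, T] over a uniform grid.

    An increment over a step of length dt is symmetric Cauchy with scale
    dt; scale parameters add across steps, so X_T - X_0 has scale T.
    """
    dt = T / n_steps
    # standard_cauchy has scale 1, and multiplying by dt rescales to dt
    increments = dt * rng.standard_cauchy(n_steps)
    return np.concatenate(([0.0], np.cumsum(increments)))

path = cauchy_process()
```

Note that, unlike a Brownian increment, a Cauchy increment over a step of length dt has size of order dt rather than ${\sqrt{dt}}$ in a typical step, with the path's large moves coming from occasional huge increments, reflecting the pure-jump nature of the process.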

Lévy processes are determined by the triple ${(\Sigma,b,\nu)}$, where ${\Sigma}$ describes the covariance structure of the Brownian motion component, b is the drift component, and ${\nu}$ describes the rate at which jumps occur. The distribution of the process is given by the Lévy-Khintchine formula, equation (3) below.

Theorem 2 (Lévy-Khintchine) Let X be a d-dimensional Lévy process. Then, there is a unique function ${\psi\colon{\mathbb R}^d\rightarrow{\mathbb C}}$ such that

 $\displaystyle {\mathbb E}\left[e^{ia\cdot (X_t-X_0)}\right]=e^{t\psi(a)}$ (2)

for all ${a\in{\mathbb R}^d}$ and ${t\ge0}$. Also, ${\psi(a)}$ can be written as

 $\displaystyle \psi(a)=ia\cdot b-\frac{1}{2}a^{\rm T}\Sigma a+\int _{{\mathbb R}^d}\left(e^{ia\cdot x}-1-\frac{ia\cdot x}{1+\Vert x\Vert}\right)\,d\nu(x)$ (3)

where ${\Sigma}$, b and ${\nu}$ are uniquely determined and satisfy the following,

1. ${\Sigma\in{\mathbb R}^{d^2}}$ is a positive semidefinite matrix.
2. ${b\in{\mathbb R}^d}$.
3. ${\nu}$ is a Borel measure on ${{\mathbb R}^d}$ with ${\nu(\{0\})=0}$ and,
 $\displaystyle \int_{{\mathbb R}^d}\Vert x\Vert^2\wedge 1\,d\nu(x)<\infty.$ (4)

Furthermore, ${(\Sigma,b,\nu)}$ uniquely determine all finite distributions of the process ${X-X_0}$.

Conversely, if ${(\Sigma,b,\nu)}$ is any triple satisfying the three conditions above, then there exists a Lévy process satisfying (2,3).
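As a concrete check of Theorem 2, consider a Poisson process of rate ${\lambda}$, which is a Lévy process with ${\Sigma=0}$ and Lévy measure ${\nu=\lambda\delta_1}$. With the truncation ${ia\cdot x/(1+\Vert x\Vert)}$ used in (3), taking drift ${b=\lambda/2}$ makes the compensation term cancel, giving ${\psi(a)=\lambda(e^{ia}-1)}$. The sketch below (NumPy; the rate, time and test point are arbitrary choices of mine) evaluates ${\psi}$ from the triple and compares ${e^{t\psi(a)}}$ with an empirical characteristic function.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rate-lam Poisson process as a Lévy process: Sigma = 0, nu = lam * delta_1
# and, under the truncation x/(1+|x|) in (3), drift b = lam * 1/(1+1).
lam, t, a = 3.0, 0.7, 1.3

def psi(a, lam):
    b = lam / 2.0                                      # lam * 1/(1+1)
    jump = lam * (np.exp(1j * a) - 1 - 1j * a / 2.0)   # integral against nu
    return 1j * a * b + jump                           # = lam * (e^{ia} - 1)

lk = np.exp(t * psi(a, lam))      # Lévy-Khintchine prediction for E[e^{ia X_t}]

N = rng.poisson(lam * t, size=200000)   # simulate X_t = N_t
mc = np.mean(np.exp(1j * a * N))        # empirical characteristic function
```

The drift and compensation terms cancel exactly here, which is the point: the truncation term in (3) only matters when the jumps are not integrable against ${\nu}$ near infinity or accumulate near zero.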

Processes with Independent Increments

In a previous post, it was seen that all continuous processes with independent increments are Gaussian. We move on now to look at a much more general class of independent increments processes which need not have continuous sample paths. Such processes can be completely described by their jump intensities, a Brownian term, and a deterministic drift component. However, this class of processes is large enough to capture the kinds of behaviour that occur for more general jump-diffusion processes. An important subclass is that of Lévy processes, which have independent and stationary increments. Lévy processes will be looked at in more detail in the following post, and include as special cases the Cauchy process, gamma processes, the variance gamma process, Poisson processes, compound Poisson processes and Brownian motion.

Recall that a process ${\{X_t\}_{t\ge0}}$ has the independent increments property if ${X_t-X_s}$ is independent of ${\{X_u\colon u\le s\}}$ for all times ${0\le s\le t}$. More generally, we say that X has the independent increments property with respect to an underlying filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$ if it is adapted and ${X_t-X_s}$ is independent of ${\mathcal{F}_s}$ for all ${s < t}$. In particular, every process with independent increments also satisfies the independent increments property with respect to its natural filtration. Throughout this post, I will assume the existence of such a filtered probability space, and the independent increments property will be understood to be with regard to this space.

The process X is said to be continuous in probability if ${X_s\rightarrow X_t}$ in probability as s tends to t. As we now state, a d-dimensional independent increments process X is uniquely specified by a triple ${(\Sigma,b,\mu)}$ where ${\mu}$ is a measure describing the jumps of X, ${\Sigma}$ determines the covariance structure of the Brownian motion component of X, and b is an additional deterministic drift term.

Theorem 1 Let X be an ${{\mathbb R}^d}$-valued process with independent increments and continuous in probability. Then, there is a unique continuous function ${{\mathbb R}^d\times{\mathbb R}_+\rightarrow{\mathbb C}}$, ${(a,t)\mapsto\psi_t(a)}$ such that ${\psi_0(a)=0}$ and

 $\displaystyle {\mathbb E}\left[e^{ia\cdot (X_t-X_0)}\right]=e^{\psi_t(a)}$ (1)

for all ${a\in{\mathbb R}^d}$ and ${t\ge0}$. Also, ${\psi_t(a)}$ can be written as

 $\displaystyle \psi_t(a)=ia\cdot b_t-\frac{1}{2}a^{\rm T}\Sigma_t a+\int _{{\mathbb R}^d\times[0,t]}\left(e^{ia\cdot x}-1-\frac{ia\cdot x}{1+\Vert x\Vert}\right)\,d\mu(x,s)$ (2)

where ${\Sigma_t}$, ${b_t}$ and ${\mu}$ are uniquely determined and satisfy the following,

1. ${t\mapsto\Sigma_t}$ is a continuous function from ${{\mathbb R}_+}$ to ${{\mathbb R}^{d^2}}$ such that ${\Sigma_0=0}$ and ${\Sigma_t-\Sigma_s}$ is positive semidefinite for all ${t\ge s}$.
2. ${t\mapsto b_t}$ is a continuous function from ${{\mathbb R}_+}$ to ${{\mathbb R}^d}$, with ${b_0=0}$.
3. ${\mu}$ is a Borel measure on ${{\mathbb R}^d\times{\mathbb R}_+}$ with ${\mu(\{0\}\times{\mathbb R}_+)=0}$, ${\mu({\mathbb R}^d\times\{t\})=0}$ for all ${t\ge 0}$ and,
 $\displaystyle \int_{{\mathbb R}^d\times[0,t]}\Vert x\Vert^2\wedge 1\,d\mu(x,s)<\infty.$ (3)

Furthermore, ${(\Sigma,b,\mu)}$ uniquely determine all finite distributions of the process ${X-X_0}$.

Conversely, if ${(\Sigma,b,\mu)}$ is any triple satisfying the three conditions above, then there exists a process with independent increments satisfying (1,2).

Bessel Processes

A random variable ${N=(N^1,\ldots,N^n)}$ has the standard n-dimensional normal distribution if its components ${N^i}$ are independent normal with zero mean and unit variance. A well-known fact about such distributions is that they are invariant under rotations, which has the following consequence. The distribution of ${Z\equiv\Vert N+\boldsymbol{\mu}\Vert^2}$ is invariant under rotations of ${\boldsymbol{\mu}\in{\mathbb R}^n}$ and, hence, is fully determined by the values of ${n\in{\mathbb N}}$ and ${\mu=\Vert\boldsymbol{\mu}\Vert^2\in{\mathbb R}_+}$. This is known as the noncentral chi-square distribution with n degrees of freedom and noncentrality parameter ${\mu}$, and denoted by ${\chi^2_n(\mu)}$. The moment generating function can be computed,

 $\displaystyle M_Z(\lambda)\equiv{\mathbb E}\left[e^{\lambda Z}\right]=\left(1-2\lambda\right)^{-\frac{n}{2}}\exp\left(\frac{\lambda\mu}{1-2\lambda}\right),$ (1)

which holds for all ${\lambda\in{\mathbb C}}$ with real part strictly less than 1/2.
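Both the rotation invariance and formula (1) are easy to check by simulation. The sketch below (NumPy; the parameter values and seed are my own) samples ${Z=\Vert N+\boldsymbol{\mu}\Vert^2}$ with ${\boldsymbol{\mu}}$ placed along the first coordinate axis, which by rotation invariance loses no generality, and compares the empirical moment generating function at ${\lambda=-1/2}$ with (1).

```python
import numpy as np

rng = np.random.default_rng(3)

n, mu = 3, 2.0                                # degrees of freedom, noncentrality
shift = np.zeros(n)
shift[0] = np.sqrt(mu)                        # any vector with ||.||^2 = mu works

N = rng.standard_normal((500000, n))
Z = np.sum((N + shift) ** 2, axis=1)          # samples from chi^2_n(mu)

lam = -0.5                                    # any Re(lam) < 1/2 is valid
mgf_exact = (1 - 2 * lam) ** (-n / 2) * np.exp(lam * mu / (1 - 2 * lam))
mgf_mc = np.mean(np.exp(lam * Z))
```

Differentiating (1) at ${\lambda=0}$ gives ${{\mathbb E}[Z]=n+\mu}$, which the same samples confirm.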

A consequence of this is that the norm ${\Vert B_t\Vert}$ of an n-dimensional Brownian motion B is Markov. More precisely, letting ${\mathcal{F}_t=\sigma(B_s\colon s\le t)}$ be its natural filtration, then ${X\equiv\Vert B\Vert^2}$ has the following property. For times ${s < t}$, conditional on ${\mathcal{F}_s}$, ${X_t/(t-s)}$ is distributed as ${\chi^2_n(X_s/(t-s))}$. This is known as the n-dimensional squared Bessel process, and denoted by ${{\rm BES}^2_n}$.

Alternatively, the process X can be described by a stochastic differential equation (SDE). Applying integration by parts,

 $\displaystyle dX = 2\sum_iB^i\,dB^i+\sum_id[B^i].$ (2)

As the standard Brownian motions have quadratic variation ${[B^i]_t=t}$, the final term on the right-hand side is equal to ${n\,dt}$. Also, the covariations ${[B^i,B^j]}$ are zero for ${i\not=j}$ from which it can be seen that

 $\displaystyle W_t = \sum_i\int_0^t1_{\{B\not=0\}}\frac{B^i}{\Vert B\Vert}\,dB^i$

is a continuous local martingale with ${[W]_t=t}$. By Lévy’s characterization, W is a Brownian motion and, substituting this back into (2), the squared Bessel process X solves the SDE

 $\displaystyle dX=2\sqrt{X}\,dW+n\,dt.$ (3)

The standard existence and uniqueness results for stochastic differential equations do not apply here, since ${x\mapsto2\sqrt{x}}$ is not Lipschitz continuous. It is known that (3) does in fact have a unique solution, by the Yamada-Watanabe uniqueness theorem for 1-dimensional SDEs. However, I do not need and will not make use of this fact here. Actually, uniqueness in law follows from the explicit computation of the moment generating function in Theorem 5 below.
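Even without a uniqueness theory, the SDE (3) is straightforward to discretize. Below is a naive Euler-Maruyama sketch (NumPy; the clamping at zero, step counts and seed are my own choices, and this is an illustration rather than a convergence-guaranteed scheme). Since the ${dW}$ term has zero expectation, the drift gives ${{\mathbb E}[X_t]=X_0+nt}$, which the simulation can be checked against.

```python
import numpy as np

rng = np.random.default_rng(4)

def squared_bessel_euler(x0, n, T=1.0, steps=500, paths=20000, rng=rng):
    """Euler-Maruyama scheme for dX = 2 sqrt(X) dW + n dt.

    Paths are clamped at zero after each step, since the diffusion
    coefficient 2 sqrt(x) is only defined for x >= 0.
    """
    dt = T / steps
    X = np.full(paths, float(x0))
    for _ in range(steps):
        dW = rng.standard_normal(paths) * np.sqrt(dt)
        X = X + 2.0 * np.sqrt(np.maximum(X, 0.0)) * dW + n * dt
        X = np.maximum(X, 0.0)
    return X

XT = squared_bessel_euler(x0=1.0, n=2.5, T=1.0)
```

The clamping introduces a small positive bias near zero, but for ${n\ge2}$ the process almost surely never hits zero, so its effect here is negligible.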

Although it is nonsensical to talk of an n-dimensional Brownian motion for non-integer n, Bessel processes can be extended to any real ${n\ge0}$. This can be done either by specifying their distributions in terms of chi-square distributions or by the SDE (3). In this post I take the first approach, and then show that the two are equivalent. Such processes appear in many situations in the theory of stochastic processes, and not just as the norm of Brownian motion. The squared Bessel process also provides one of the relatively few interesting examples of stochastic differential equations whose distributions can be explicitly computed.

The ${\chi^2_n(\mu)}$ distribution generalizes to all real ${n\ge0}$, and can be defined as the unique distribution on ${{\mathbb R}_+}$ with moment generating function given by equation (1). If ${Z_1\sim\chi^2_m(\mu)}$ and ${Z_2\sim\chi^2_n(\nu)}$ are independent, then ${Z_1+Z_2}$ has moment generating function ${M_{Z_1}(\lambda)M_{Z_2}(\lambda)}$ and, therefore, has the ${\chi^2_{m+n}(\mu+\nu)}$ distribution. That such distributions do indeed exist can be seen by constructing them. The ${\chi^2_n(0)}$ distribution is a special case of the Gamma distribution and has probability density proportional to ${x^{n/2-1}e^{-x/2}}$. If ${Z_1,Z_2,\ldots}$ is a sequence of independent random variables with the standard normal distribution and T independently has the Poisson distribution of rate ${\mu/2}$, then ${\sum_{i=1}^{2T}Z_i^2\sim\chi_0^2(\mu)}$, which can be seen by computing its moment generating function. Adding an independent ${\chi^2_n(0)}$ random variable Y to this produces the ${\chi^2_n(\mu)}$ variable ${Z\equiv Y+\sum_{i=1}^{2T}Z_i^2}$.
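The Poisson-mixture construction just described translates directly into a sampler valid for any real ${n\ge0}$. The sketch below (NumPy; the function name and parameters are mine) uses the fact that, given ${T}$, the sum ${\sum_{i=1}^{2T}Z_i^2}$ is ${\chi^2_{2T}(0)}$, i.e. Gamma with shape T and scale 2.

```python
import numpy as np

rng = np.random.default_rng(5)

def ncx2_sample(n, mu, size, rng=rng):
    """Sample chi^2_n(mu) via the Poisson-mixture construction."""
    T = rng.poisson(mu / 2.0, size=size)
    # given T, the sum of 2T squared standard normals is chi^2_{2T}(0),
    # i.e. Gamma(T, scale 2); a T = 0 draw contributes zero
    mixture = np.where(T > 0, rng.gamma(np.maximum(T, 1), scale=2.0), 0.0)
    # independent chi^2_n(0) = Gamma(n/2, scale 2) component
    central = rng.gamma(n / 2.0, 2.0, size=size) if n > 0 else np.zeros(size)
    return central + mixture

Z = ncx2_sample(n=2.5, mu=3.0, size=400000)
```

Note the non-integer dimension ${n=2.5}$ in the example, which is exactly the extension the construction is meant to provide.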

The definition of squared Bessel processes of any real dimension ${n\ge0}$ is as follows. We work with respect to a filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$.

Definition 1 A process X is a squared Bessel process of dimension ${n\ge0}$ if it is continuous, adapted and, for any ${s < t}$, conditional on ${\mathcal{F}_s}$, ${X_t/(t-s)}$ has the ${\chi^2_n\left(X_s/(t-s)\right)}$ distribution.

Properties of Feller Processes

In the previous post, the concept of Feller processes was introduced. These are Markov processes whose transition function ${\{P_t\}_{t\ge0}}$ satisfies certain continuity conditions. Many of the standard processes we study satisfy the Feller property, such as standard Brownian motion, Poisson processes, Bessel processes and Lévy processes as well as solutions to many stochastic differential equations. It was shown that all Feller processes admit a cadlag modification. In this post I state and prove some of the other useful properties satisfied by such processes, including the strong Markov property, quasi-left-continuity and right-continuity of the filtration. I also describe the basic properties of the infinitesimal generators. The results in this post are all fairly standard and can be found, for example, in Revuz and Yor (Continuous Martingales and Brownian Motion).

As always, we work with respect to a filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$. Throughout this post we consider Feller processes X and transition functions ${\{P_t\}_{t\ge0}}$ defined on the lccb (locally compact with a countable base) space E which, taken together with its Borel sigma-algebra, defines a measurable space ${(E,\mathcal{E})}$.

Recall that the law of a homogeneous Markov process X is described by a transition function ${\{P_t\}_{t\ge0}}$ on some measurable space ${(E,\mathcal{E})}$. This specifies that the distribution of ${X_t}$ conditional on the history up until an earlier time ${s < t}$ is given by the measure ${P_{t-s}(X_s,\cdot)}$. Equivalently,

$\displaystyle {\mathbb E}[f(X_t)\mid\mathcal{F}_s]=P_{t-s}f(X_s)$

for any bounded and measurable function ${f\colon E\rightarrow{\mathbb R}}$. The strong Markov property generalizes this idea to arbitrary stopping times.

Definition 1 Let X be an adapted process and ${\{P_t\}_{t\ge 0}}$ be a transition function.

Then, X satisfies the strong Markov property if, for each stopping time ${\tau}$, conditioned on ${\tau<\infty}$ the process ${\{X_{\tau+t}\}_{t\ge0}}$ is Markov with the given transition function and with respect to the filtration ${\{\mathcal{F}_{\tau+t}\}_{t\ge0}}$.

As we see in a moment, Feller processes satisfy the strong Markov property. First, as an example, consider a standard Brownian motion B, and let ${\tau}$ be the first time at which it hits a fixed level ${K>0}$. The reflection principle states that the process ${\tilde B}$ defined to be equal to B up until time ${\tau}$ and reflected about K afterwards, is also a standard Brownian motion. More precisely,

$\displaystyle \tilde B_t=\begin{cases} B_t,&\textrm{if }t\le\tau,\\ 2K-B_t,&\textrm{if }t\ge\tau, \end{cases}$

is a Brownian motion. This useful idea can be used to determine the distribution of the maximum ${B^*_t=\max_{s\le t}B_s}$. If ${B^*_t\ge K}$ then either the process itself ends up above K or it hits K and then drops below this level by time t, in which case ${\tilde B_t>K}$. So, by the reflection principle,

$\displaystyle {\mathbb P}(B^*_t\ge K)={\mathbb P}(B_t\ge K)+{\mathbb P}(\tilde B_t> K)=2{\mathbb P}(B_t\ge K).$
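This identity is easy to verify by simulation. The following sketch (NumPy; the discretization and seed are my choices) approximates a Brownian path by a fine random walk, tracks its running maximum, and compares the empirical ${{\mathbb P}(B^*_T\ge K)}$ with the reflection-principle value ${2(1-\Phi(K/\sqrt{T}))}$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(6)

T, K = 1.0, 1.0
steps, paths = 800, 50000
dt = T / steps

B = np.zeros(paths)          # Brownian paths sampled on a grid
M = np.zeros(paths)          # running maximum of each path
for _ in range(steps):
    B += rng.standard_normal(paths) * sqrt(dt)
    M = np.maximum(M, B)

p_mc = np.mean(M >= K)                           # Monte Carlo P(B*_T >= K)
Phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))     # standard normal cdf
p_refl = 2 * (1 - Phi(K / sqrt(T)))              # reflection-principle value
```

The grid maximum slightly underestimates the continuous-time maximum (excursions between grid points are missed), so the Monte Carlo estimate sits marginally below the exact value, with the gap shrinking as the step size decreases.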

Feller Processes

The definition of Markov processes, as given in the previous post, is much too general for many applications. However, many of the processes which we study also satisfy the much stronger Feller property. This includes Brownian motion, Poisson processes, Lévy processes and Bessel processes, all of which are considered in these notes. Once it is known that a process is Feller, many useful properties follow, such as the existence of cadlag modifications, the strong Markov property, quasi-left-continuity and right-continuity of the filtration. In this post I give the definition of Feller processes and prove the existence of cadlag modifications, leaving the further properties until the next post.

The definition of Feller processes involves putting continuity constraints on the transition function, for which it is necessary to restrict attention to processes lying in a topological space ${(E,\mathcal{T}_E)}$. It will be assumed that E is locally compact, Hausdorff, and has a countable base (lccb, for short). Such spaces always possess a countable collection of nonvanishing continuous functions ${f\colon E\rightarrow{\mathbb R}}$ which separate the points of E and which, by Lemma 6 below, help us construct cadlag modifications. Lccb spaces include many of the topological spaces which we may want to consider, such as ${{\mathbb R}^n}$, topological manifolds and, indeed, any open or closed subset of another lccb space. Such spaces are always Polish spaces, although the converse does not hold (a Polish space need not be locally compact).

Given a topological space E, ${C_0(E)}$ denotes the continuous real-valued functions vanishing at infinity. That is, ${f\colon E\rightarrow{\mathbb R}}$ is in ${C_0(E)}$ if it is continuous and, for any ${\epsilon>0}$, the set ${\{x\colon \vert f(x)\vert\ge\epsilon\}}$ is compact. Equivalently, its extension to the one-point compactification ${E^*=E\cup\{\infty\}}$ of E given by ${f(\infty)=0}$ is continuous. The set ${C_0(E)}$ is a Banach space under the uniform norm,

$\displaystyle \Vert f\Vert\equiv\sup_{x\in E}\vert f(x)\vert.$

We can now state the general definition of Feller transition functions and processes. A topological space ${(E,\mathcal{T}_E)}$ is also regarded as a measurable space by equipping it with its Borel sigma algebra ${\mathcal{B}(E)=\sigma(\mathcal{T}_E)}$, so it makes sense to talk of transition probabilities and functions on E.

Definition 1 Let E be an lccb space. Then, a transition function ${\{P_t\}_{t\ge 0}}$ is Feller if, for all ${f\in C_0(E)}$,

1. ${P_tf\in C_0(E)}$.
2. ${t\mapsto P_tf}$ is continuous with respect to the norm topology on ${C_0(E)}$.
3. ${P_0f=f}$.

A Markov process X whose transition function is Feller is a Feller process.
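For the Brownian transition function on ${{\mathbb R}}$, ${P_tf(x)={\mathbb E}[f(x+\sqrt{t}Z)]}$ with Z standard normal, these conditions can be checked numerically. The sketch below (NumPy; the quadrature degree and test points are my own choices) implements ${P_t}$ by Gauss-Hermite quadrature and verifies the semigroup identity ${P_sP_t=P_{s+t}}$ together with ${P_tf}$ remaining small at infinity for ${f\in C_0({\mathbb R})}$.

```python
import numpy as np

# quadrature nodes/weights for the probabilists' weight e^{-x^2/2};
# normalizing the weights turns the rule into a standard normal expectation
nodes, weights = np.polynomial.hermite_e.hermegauss(80)
weights = weights / weights.sum()

def P(t, f):
    """Heat semigroup: P_t f(x) = E[f(x + sqrt(t) Z)], Z standard normal."""
    if t == 0:
        return f
    def Ptf(x):
        pts = np.asarray(x, dtype=float)[..., None] + np.sqrt(t) * nodes
        return np.sum(weights * f(pts), axis=-1)
    return Ptf

f = lambda x: np.exp(-x ** 2)    # an element of C_0(R)

lhs = P(0.3, P(0.5, f))(0.7)     # P_s P_t f evaluated at x = 0.7
rhs = P(0.8, f)(0.7)             # P_{s+t} f at the same point
```

For this particular f a closed form is available, ${P_tf(x)=(1+2t)^{-1/2}e^{-x^2/(1+2t)}}$, which makes the quadrature easy to validate; strong continuity in t (condition 2) then follows from this explicit dependence on t.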