Predictable Stopping Times

Although this post is under the heading of `the general theory of semimartingales’ it is not, strictly speaking, about semimartingales at all. Instead, I will be concerned with a characterization of predictable stopping times. The reason for including this now is twofold. First, the results are too advanced to have been proven in the earlier post on predictable stopping times, and reasonably efficient self-contained proofs can only be given now that we have already built up a certain amount of stochastic calculus theory. Secondly, the results stated here are indispensable to the further study of semimartingales. In particular, standard semimartingale decompositions require some knowledge of predictable processes and predictable stopping times.

Recall that a stopping time {\tau} is said to be predictable if there exists a sequence of stopping times {\tau_n\le\tau} increasing to {\tau} and such that {\tau_n < \tau} whenever {\tau > 0}. Also, the predictable sigma-algebra {\mathcal{P}} is defined as the sigma-algebra generated by the left-continuous and adapted processes. Stated like this, these two concepts can appear quite different. However, as was previously shown, stochastic intervals of the form {[\tau,\infty)} for predictable times {\tau} are all in {\mathcal{P}} and, in fact, generate the predictable sigma-algebra.
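To make the announcing sequence concrete: for the first time {\tau} at which a continuous process reaches a level K, the hitting times of the lower levels {K-1/n} announce {\tau}. The following rough Python sketch (my own illustration, using a discretized Brownian path, so the strict inequalities of the continuous-time picture only hold up to grid effects) checks the defining properties numerically.

```python
import numpy as np

rng = np.random.default_rng(0)

def hitting_time(path, level, dt):
    """First grid time at which the path reaches the level (np.inf if never)."""
    idx = np.nonzero(path >= level)[0]
    return idx[0] * dt if idx.size else np.inf

# Discretized Brownian path on [0, 10].
n_steps, dt = 100_000, 1e-4
path = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])

K = 1.0
tau = hitting_time(path, K, dt)
# Announcing sequence: the path hits the lower levels K - 1/n first.
taus = [hitting_time(path, K - 1.0 / n, dt) for n in range(2, 7)]

assert all(t <= tau for t in taus)                  # tau_n <= tau
assert all(a <= b for a, b in zip(taus, taus[1:]))  # increasing in n
```

In the continuous-time limit the inequalities {\tau_n < \tau} are strict whenever {\tau > 0}, which is exactly the announceability property.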

The main result (Theorem 1) of this post is that the converse statement also holds, so that {[\tau,\infty)} is in {\mathcal{P}} if and only if the stopping time {\tau} is predictable. This rather simple sounding result has many far-reaching consequences. We can use it to show that all cadlag predictable processes are locally bounded, that local martingales are predictable if and only if they are continuous, and to give a characterization of cadlag predictable processes in terms of their jumps. Some very strong statements about stopping times also follow without much difficulty for certain special stochastic processes. For example, if the underlying filtration is generated by a Brownian motion then every stopping time is predictable. In fact, this is true whenever the filtration is generated by a continuous Feller process. It is also possible to give a surprisingly simple characterization of stopping times for filtrations generated by arbitrary, possibly discontinuous, Feller processes. Precisely, a stopping time {\tau} is predictable if the underlying Feller process is almost surely continuous at time {\tau}, and is totally inaccessible if the process is almost surely discontinuous at {\tau}.

As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}. I now give a statement and proof of the main result of this post. Note that the equivalence of the four conditions below means that any of them can be used as an alternative definition of predictable stopping times. Often, the first condition below is used instead. Stopping times satisfying the definition used in these notes are sometimes called announceable, with the sequence {\tau_n\uparrow\tau} said to announce {\tau} (this terminology is used by, e.g., Rogers & Williams). Stopping times satisfying property 3 below, which is easily seen to be equivalent to 2, are sometimes called fair. The theorem then says that the sets of predictable, fair and announceable stopping times all coincide.

Theorem 1 Let {\tau} be a stopping time. Then, the following are equivalent.

  1. {[\tau]\in\mathcal{P}}.
  2. {\Delta M_\tau1_{[\tau,\infty)}} is a local martingale for all local martingales M.
  3. {{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0} for all cadlag bounded martingales M.
  4. {\tau} is predictable.

Before moving on to the proof of the theorem note that, for any stopping time {\tau}, {1_{(\tau,\infty)}} is adapted and left-continuous, hence predictable. Since {[\tau]=[\tau,\infty)\setminus(\tau,\infty)}, the first statement of the theorem can equivalently be written as {[\tau,\infty)\in\mathcal{P}}.

Proof: 1 implies 2: As {1_{[\tau]}} is a bounded predictable process, {N\equiv\int1_{[\tau]}\,dM} is a local martingale. We need to show that {N=\Delta M_\tau1_{[\tau,\infty)}}. Although this might seem intuitively obvious, it does require some rather non-trivial properties of the stochastic integral (see the notes below). First,

\displaystyle  N^\tau=\int 1_{(0,\tau]}1_{[\tau]}\,dM=\int1_{[\tau]}\,dM = N,

so N is constant on {[\tau,\infty)}. Secondly, {\Delta N_t=1_{\{t=\tau\}}\Delta M_\tau}, so {N_t=N_{\tau-} + \Delta M_\tau} for all {t\ge\tau}. For any time {t\ge0}, note that {1_{[\tau]}} vanishes over the interval {[0,t]} whenever {\tau > t}. As previously shown, this means that {N_t=\int_0^t1_{[\tau]}\,dM=0} whenever {\tau > t}. So, {N=0} on {[0,\tau)} and, hence, {N_t=1_{\{t\ge\tau\}}\Delta M_\tau}.

2 implies 3: If M is bounded, or just dominated in {L^1}, then {N\equiv \Delta M_\tau1_{[\tau,\infty)}} is an {L^1}-dominated local martingale and, hence, a uniformly integrable martingale. So,

\displaystyle  {\mathbb E}\left[1_{\{\tau < \infty\}}\Delta M_\tau\right]={\mathbb E}\left[N_\infty\right]={\mathbb E}[N_0]=0.

3 implies 4: To show that {\tau} is predictable, we need to construct a sequence of stopping times {\tau_n} increasing to {\tau} such that {\tau_n < \tau} whenever {\tau > 0}. The idea is simple enough. First, define a right-continuous process giving, roughly, the expected time remaining before {\tau}, and use the debut theorem to construct {\tau_n}. To do this, start by choosing a continuous, bounded and strictly increasing function {f\colon{\mathbb R}_+\rightarrow{\mathbb R}_+} (for example, {f(t)=1-e^{-t}}). Then define the martingale

\displaystyle  M_t={\mathbb E}\left[f(\tau)\;\vert\mathcal{F}_t\right].

Then, {M_t-f(t)={\mathbb E}[f(\tau)-f(t)\;\vert\mathcal{F}_t]} tells us (roughly) how much longer we have to wait until time {\tau}. In order to be able to choose a cadlag version of M, it is necessary to assume that the filtration is right-continuous, so that {\mathcal{F}_{t+}\equiv\bigcap_{s > t}\mathcal{F}_{s}} is equal to {\mathcal{F}_t}. The result does still hold without the assumption of right-continuity and, to be complete, the extension to the general case is given further below. Assuming that M is cadlag, define the following increasing sequence of stopping times,

\displaystyle  \tau_n=\inf\left\{t\ge0\colon M_t-f(t)\le1/n\right\}.

As {M_\tau=f(\tau)}, we have that {\tau_n\le\tau} for all n. Also, by optional sampling,

\displaystyle  {\mathbb E}[f(\tau)]={\mathbb E}[M_{\tau_n}]\le{\mathbb E}[f(\tau_n)+1/n]

So, {{\mathbb E}[f(\tau)-f(\tau_n)]\le1/n} decreases to zero, showing that {f(\tau)-f(\tau_n)} tends to zero almost surely. As f was taken to be strictly increasing, this shows that {\tau_n} increases to {\tau}. It only remains to show that {\tau_n} is strictly less than {\tau} whenever {\tau > 0}. The definition of M gives {M_t\ge f(t)} for all times {t\le\tau}. So, by the continuity of f,

\displaystyle  \Delta M_\tau=f(\tau)-M_{\tau-}\le f(\tau)-f(\tau-)=0.

Together with condition 3, which says that {{\mathbb E}[\Delta M_\tau]=0}, this implies that {\Delta M_\tau} is almost surely zero. So, {M_{\tau-}=f(\tau)}. However, by construction,

\displaystyle  M_{\tau_n-}\ge f(\tau_n)+1/n\not=f(\tau_n)

whenever {\tau_n > 0}, so {\tau_n\not=\tau} as required.

4 implies 1: As previously shown, {[\tau,\infty)} is predictable for all predictable times {\tau} and, therefore, so is {[\tau]=[\tau,\infty)\setminus(\tau,\infty)}. ⬜

An immediate consequence of this result is that the conclusion of the debut theorem can be strengthened in the case of right-continuous and predictable processes. This gives a significant generalization of the much simpler result that hitting times of continuous adapted processes are predictable.

Corollary 2 Let X be a right-continuous and predictable process. Then, for each constant K, the stopping time

\displaystyle  \tau=\inf\left\{t\ge0\colon X_t\ge K\right\}

is predictable.

Proof: The left-continuous and adapted process {1_{[0,\tau]}} is predictable. So,

\displaystyle  [\tau]=[0,\tau]\cap X^{-1}([K,\infty))

is predictable and, by Theorem 1, {\tau} is predictable. ⬜

A process X is locally bounded if there exists a sequence of stopping times {\tau_n} increasing to infinity such that the stopped processes {1_{\{\tau_n > 0\}}X^{\tau_n}} are each uniformly bounded. Continuous processes are easily seen to be locally bounded simply by stopping them as soon as they hit a level {K > 0}. This generalizes to cadlag predictable processes.

Lemma 3 All cadlag predictable processes are locally bounded.

Proof: Supposing that X is cadlag, define the sequence of stopping times

\displaystyle  \tau_m=\inf\left\{t\ge 0\colon\vert X_t\vert\ge m\right\}.

These are increasing to infinity and, by Corollary 2, are predictable. So, for each m, there is a sequence {\{\tau_{mn}\}_{n=1,2,\ldots}} of stopping times increasing to {\tau_m} such that {\tau_{mn} < \tau_m} whenever {\tau_m > 0}. Then, {\sigma_n\equiv\tau_{1n}\vee\tau_{2n}\vee\cdots\vee\tau_{nn}} are stopping times increasing to infinity. Also, {\sigma_n < \tau_n} whenever {\tau_n > 0}. So, {1_{\{\sigma_n > 0\}}X^{\sigma_n}} is bounded by n and, hence, X is locally bounded. ⬜

Theorem 1 enables us to state several equivalent conditions for a cadlag adapted process X to be predictable. Note that the process {X_-} of left-limits is automatically predictable, being left-continuous and adapted. For brevity, I write {X_\tau} for the value of a process at a random time, even though this is not well defined when {\tau=\infty}. In that case, I take {X_\tau} to be zero, so {X_\tau\equiv1_{\{\tau < \infty\}}X_\tau} (setting it to any {\mathcal{F}_\infty}-measurable value would not change any of the statements below). I also use {\{\Delta X\not=0\}} as shorthand for the (progressively measurable and optional) set of times at which X is discontinuous which, more precisely, consists of the {(t,\omega)\in{\mathbb R}_+\times\Omega} for which {\Delta X_t(\omega)} is nonzero. The following notation will be used in the proofs. Given a stopping time {\tau} and a set {A\in\mathcal{F}_\tau}, the random time {\tau_A\colon\Omega\rightarrow\bar{\mathbb R}_+} is defined by

\displaystyle  \tau_A(\omega)\equiv\begin{cases} \tau(\omega),&\textrm{if }\omega\in A,\\ \infty,&\textrm{if }\omega\not\in A. \end{cases}
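On a toy finite sample space (my own illustration), this is just the pointwise construction:

```python
import math

# Toy sample space {0, 1, 2, 3}; tau given as a map omega -> time.
tau = {0: 1.0, 1: 2.5, 2: 0.5, 3: 4.0}
A = {1, 3}  # an event in F_tau

def restrict(tau, A):
    """The random time tau_A: equal to tau on A and +infinity off A."""
    return {w: (t if w in A else math.inf) for w, t in tau.items()}

tau_A = restrict(tau, A)
assert tau_A == {0: math.inf, 1: 2.5, 2: math.inf, 3: 4.0}
```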

It is clear that this defines a stopping time. The statement and proof of the equivalent conditions for a cadlag process to be predictable can now be given.

Lemma 4 If X is a cadlag adapted process then the following are equivalent.

  1. X is predictable.
  2. {\Delta X} is predictable.
  3. {X_\tau} is {\mathcal{F}_{\tau-}}-measurable for all predictable stopping times {\tau}, and {\Delta X_\tau=0} (almost surely) whenever {\tau} is totally inaccessible.
  4. there exists a sequence of predictable stopping times {\{\tau_n\}_{n=1,2,\ldots}} such that {\{\Delta X\not=0\}\subseteq\bigcup_n[\tau_n]} and {X_{\tau_n}} is {\mathcal{F}_{\tau_n-}}-measurable for each n.
  5. there exists a sequence of predictable stopping times {\{\tau_n\}_{n=1,2,\ldots}} with disjoint graphs ({[\tau_m]\cap[\tau_n]=\emptyset} for {m\not=n}) such that {\{\Delta X\not=0\}=\bigcup_n[\tau_n]} and {X_{\tau_n}} is {\mathcal{F}_{\tau_n-}}-measurable for each n.

Proof: 1 implies 4: Let {(s_n,\epsilon_n)} be a sequence running through all pairs of positive rational numbers, and define the stopping times

\displaystyle  \tau_n=\inf\left\{t\ge s_n\colon\vert X_t-X_{s_n}\vert\ge\epsilon_n\right\}. (1)

Corollary 2 implies that these are predictable. Looking at the individual sample paths of the process, consider a time t for which {\Delta X_t\not=0}. Then, {\tau_n=t} whenever {\epsilon_n < \vert\Delta X_t\vert} and {s_n} approximates t closely enough from below. It follows that t is contained in the set of times {\tau_n}, so {\{\Delta X\not=0\}\subseteq\bigcup_n[\tau_n]} as required. Also, as X is predictable, {X_{\tau_n}} will be {\mathcal{F}_{\tau_n-}}-measurable.

4 implies 5: Let {\tau_n} be stopping times satisfying condition 4. As these are predictable and {\Delta X_{\tau_n}} is {\mathcal{F}_{\tau_n-}}-measurable, the processes

\displaystyle  Y^n\equiv\Delta X_{\tau_n}1_{[\tau_n]\setminus\bigcup_{m < n}[\tau_m]}

are predictable. So, the set {A_n\equiv\{Y^n_{\tau_n}\not=0\}} is {\mathcal{F}_{\tau_n-}}-measurable. We can therefore define new predictable stopping times {\sigma_n=(\tau_n)_{A_n}}. By construction, the graphs of {\sigma_n} are disjoint and

\displaystyle  \bigcup_n[\sigma_n]=\bigcup_n[\tau_n]\cap\{\Delta X\not=0\}=\{\Delta X\not=0\}.

5 implies 2: If 5 holds then {\Delta X_{\tau_n}1_{[\tau_n]}} is predictable. So, the same is true of {\Delta X=\sum_n\Delta X_{\tau_n}1_{[\tau_n]}}.

2 implies 1: Simply write X as the sum {X_-+\Delta X} of two predictable processes.

5 implies 3: We have already shown that 5 implies that X is predictable, so {X_{\tau}} is {\mathcal{F}_{\tau-}}-measurable for any predictable stopping time {\tau}. Also, if {\tau} is totally inaccessible then, by definition, {{\mathbb P}(\tau_n=\tau < \infty)=0} for each n. So, almost surely, {\tau} is not in {\bigcup_n[\tau_n]=\{\Delta X\not=0\}} and, therefore, {\Delta X_\tau=0}.

3 implies 4: Defining the sequence of stopping times {\tau_n} by (1), we again have {\{\Delta X\not=0\}\subseteq\bigcup_n[\tau_n]}. By the decomposition of stopping times, there exist sets {A_n\in\mathcal{F}_{\tau_n}} such that {(\tau_n)_{A_n}} is accessible and {(\tau_n)_{A_n^c}} is totally inaccessible. By condition 3, we have {\Delta X_{(\tau_n)_{A_n^c}}=0} and, therefore, {\{\Delta X\not=0\}} is almost surely contained in the union of the graphs of the {(\tau_n)_{A_n}}. By the definition of accessible stopping times, this is contained in a union {\bigcup_{n,m}[\tau_{nm}]} of graphs of predictable stopping times {\tau_{nm}}. Finally, {X_{\tau_{nm}}} is {\mathcal{F}_{\tau_{nm}-}}-measurable by condition 3. ⬜
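Incidentally, the covering property of the times (1) is easy to see numerically. The Python sketch below (my own; a two-jump step function on a grid stands in for X) enumerates a few pairs {(s_n,\epsilon_n)} and checks that both jump times of the path appear among the resulting {\tau_n}.

```python
import numpy as np

# Step-function path on a time grid: jumps at t = 2 and t = 5 (sizes 1.5, -0.7).
dt = 0.01
t = np.arange(0.0, 8.0, dt)
X = np.zeros_like(t)
X[t >= 2.0] += 1.5
X[t >= 5.0] -= 0.7

def tau(s, eps):
    """First grid time u >= s with |X_u - X_s| >= eps (np.inf if none), as in (1)."""
    i = np.searchsorted(t, s)
    hits = np.nonzero(np.abs(X[i:] - X[i]) >= eps)[0]
    return t[i + hits[0]] if hits.size else np.inf

# A few rational pairs (s_n, eps_n); every jump time shows up as some tau_n.
pairs = [(s, e) for s in np.arange(0.0, 8.0, 0.25) for e in (0.5, 0.25)]
covered = {tau(s, e) for s, e in pairs}

assert any(abs(c - 2.0) < 1e-6 for c in covered)  # jump time 2 is covered
assert any(abs(c - 5.0) < 1e-6 for c in covered)  # jump time 5 is covered
```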

Applying this characterization to local martingales shows that, for such processes, predictability and continuity are equivalent. In the following, and throughout these notes, statements about the paths of processes are only intended in the almost sure sense. We do not care about what the sample paths look like on zero probability sets.

Lemma 5 A local martingale is predictable if and only if it is continuous.

Proof: As continuous processes are predictable by definition, only the converse needs to be shown. Suppose that M is a predictable local martingale and {\tau} is a predictable stopping time. Then, for any {A\in\mathcal{F}_{\tau-}}, the stopping time {\tau_A} is predictable so, by Theorem 1, the process {Y\equiv1_A1_{[\tau,\infty)}\Delta M_\tau=1_{[\tau_A,\infty)}\Delta M_\tau} is a local martingale. Taking {A=\{\Delta M_\tau\ge0\}}, so that Y is nonnegative and hence a supermartingale, we can take expectations to get

\displaystyle  {\mathbb E}[(\Delta M_\tau)_+]={\mathbb E}[Y_\tau]\le{\mathbb E}[Y_0]=0.

So, {\Delta M_\tau\le0} almost surely. Also applying this to {-M} shows that {\Delta M_\tau=0} for all predictable stopping times {\tau}. However, condition 4 of Lemma 4 shows that the jumps of M are contained in the graphs of a countable set of predictable stopping times, so M is almost surely continuous. ⬜

Note that, for a filtration generated by a standard Brownian motion B, the martingale representation theorem implies that all local martingales are continuous. So, the third condition of Theorem 1 is trivially satisfied, giving the remarkable consequence that all stopping times are predictable. More generally, the property that all local martingales are continuous is equivalent to all stopping times being predictable.

Lemma 6 With respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}, the following are equivalent.

  1. All stopping times are predictable.
  2. All cadlag adapted processes are predictable.
  3. All local martingales are continuous.

Proof: 1 implies 2: Let X be a cadlag adapted process. We can use the characterization of cadlag predictable processes given in the third condition of Lemma 4. It is immediate that {\Delta X_\tau=0} (almost surely) for all totally inaccessible {\tau} since, by assumption, all stopping times are predictable. Now suppose that {\tau} is any stopping time. Then, for {A\in\mathcal{F}_\tau}, consider the stopping time {\tau_A}. By assumption, this is predictable, so the process {Y\equiv1_A1_{[\tau,\infty)}=1_{[\tau_A,\infty)}} is predictable. Consequently, {Y_\tau=1_{A\cap\{\tau < \infty\}}} is {\mathcal{F}_{\tau-}}-measurable. So, {A\in\mathcal{F}_{\tau-}} and, hence, {\mathcal{F}_{\tau-}=\mathcal{F}_\tau}. Therefore, {X_\tau} is {\mathcal{F}_{\tau-}}-measurable for all stopping times {\tau}, and X is predictable.

2 implies 3: Any local martingale M is cadlag and adapted, hence is predictable. So, M is continuous by Lemma 5.

3 implies 1: If {\tau} is a stopping time then any cadlag bounded martingale M is continuous, so {{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0}. Theorem 1 then says that {\tau} is predictable. ⬜

As noted above, for the filtration generated by a Brownian motion, the martingale representation theorem has the consequence that all stopping times are predictable.

Corollary 7 Let {\{\mathcal{F}_t\}_{t\ge0}} be the complete filtration generated by a d-dimensional Brownian motion B. Then, every {\mathcal{F}_\cdot}-stopping time is predictable.

Rather than using the martingale representation theorem, there is an alternative way to approach Corollary 7. Brownian motion is an example of a Feller process and, in fact, it can be shown that Corollary 7 extends to all continuous Feller processes.

Stopping Times of Feller Processes

For Feller processes, it is possible to give a precise characterization of predictable and totally inaccessible stopping times. This follows from the description below of the times at which a local martingale can be discontinuous.

Lemma 8 Let X be a cadlag Feller process, let {\{\mathcal{F}_t\}_{t\ge0}} be its completed natural filtration, and suppose that M is a cadlag {\mathcal{F}_\cdot}-local martingale.

Then, with probability one, {\Delta M_t=0} for all times t at which X is continuous.

Proof: By localization, it is enough to prove this result for all cadlag martingales of the form

\displaystyle  M_t={\mathbb E}[U\vert\mathcal{F}_t] (2)

where {U} is an {\mathcal{F}_\infty}-measurable random variable. Let us use {\mathcal{U}\subseteq L^1(\Omega,\mathcal{F}_\infty,{\mathbb P})} to denote the set of random variables U such that the cadlag martingale defined by (2) is almost-surely continuous wherever X is continuous. We need to show that {\mathcal{U}} is equal to the whole of {L^1(\mathcal{F}_\infty)}. Clearly, {\mathcal{U}} is closed under linear combinations. Furthermore, if {U_n} is a sequence in {\mathcal{U}} converging in {L^1} to a limit U then the cadlag martingales {M^n_t\equiv{\mathbb E}[U_n\vert\mathcal{F}_t]} converge in the ucp topology to a martingale M which satisfies (2) and, by ucp convergence, is continuous at all times at which the {M^n} are all continuous. So, U is in {\mathcal{U}}. Therefore, {\mathcal{U}} is a closed subspace of {L^1(\mathcal{F}_\infty)}.

Supposing that the Feller process X is defined by the transition function {P_t} on the lccb (locally compact with a countable base) space E, consider U of the form {U=f(X_s)} for some {f\in C_0(E)}. Then, by the definition of Feller transition functions, {(t,x)\mapsto P_tf(x)} defines a continuous real-valued function on {{\mathbb R}_+\times E}. So,

\displaystyle  M_t\equiv P_{(s-t)_+}f(X_{t\wedge s})={\mathbb E}[U\vert\mathcal{F}_t]

is a martingale which is continuous at all times at which X is continuous. So, {U\in\mathcal{U}}. Next, consider {U=f_1(X_{s_1})f_2(X_{s_2})\cdots f_n(X_{s_n})} for a sequence of times {0=s_0\le s_1\le\cdots\le s_n} and {f_k\in C_0(E)}, and let M be defined by (2). Then, for each {k=1,\ldots,n}, there exists a {g_k\in C_0(E)} with

\displaystyle  M_t=f_1(X_{s_1})\cdots f_{k-1}(X_{s_{k-1}}){\mathbb E}\left[g_k(X_{s_k})\;\vert\mathcal{F}_t\right]

for all {s_{k-1}\le t\le s_k} (simply take {g_n=f_n} and {g_k=f_kP_{s_{k+1}-s_k}g_{k+1}} for {k < n}). As shown above, {g_k(X_{s_k})\in\mathcal{U}}, so M has a cadlag modification on {(s_{k-1},s_k]} which is continuous wherever X is continuous. Therefore, {U\in\mathcal{U}}. Finally, by the monotone class theorem, the set of linear combinations of U of this form is dense in {L^1(\mathcal{F}_\infty)}, so {\mathcal{U}=L^1(\mathcal{F}_\infty)} as required. ⬜
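As a sanity check on the key step, take X to be Brownian motion (a Feller process) and {f=\cos}, for which the semigroup is explicit: {P_tf(x)=e^{-t/2}\cos x}. The Monte Carlo sketch below (my own illustration, not part of the proof) verifies that {M_t=P_{(s-t)_+}f(X_t)} has the constant expectation {e^{-s/2}} that the martingale in (2) must have.

```python
import numpy as np

rng = np.random.default_rng(1)

# Brownian motion with f = cos: P_t f(x) = exp(-t/2) cos(x).
# Check E[M_t] is constant in t for M_t = P_{(s-t)+} f(B_t), here with s = 1.
s, n_paths = 1.0, 400_000
B_half = np.sqrt(0.5) * rng.normal(size=n_paths)        # B at t = 1/2
B_s = B_half + np.sqrt(0.5) * rng.normal(size=n_paths)  # B at t = s = 1

M_half = np.exp(-(s - 0.5) / 2) * np.cos(B_half)  # P_{1/2} f (B_{1/2})
M_s = np.cos(B_s)                                 # f(B_s) = M_s

assert abs(M_half.mean() - np.exp(-s / 2)) < 0.01
assert abs(M_s.mean() - np.exp(-s / 2)) < 0.01
```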

Applying Theorem 1 to this result, we obtain the promised characterization of predictable and totally inaccessible stopping times of a Feller process.

Theorem 9 Let X be a cadlag Feller process and {\{\mathcal{F}_t\}_{t\ge0}} be its completed natural filtration. If {\tau} is an {\mathcal{F}_\cdot}-stopping time then,

  • {\tau} is predictable if and only if {X_{\tau-}=X_\tau} almost surely, whenever {\tau < \infty}.
  • {\tau} is totally inaccessible if and only if {X_{\tau-}\not=X_{\tau}} almost surely, whenever {\tau < \infty}.

Proof: As cadlag Feller processes are quasi-left-continuous, if {\tau} is predictable then {X_{\tau-}=X_\tau} almost surely. Conversely, if {X_{\tau-}=X_\tau} almost surely then, by Lemma 8, {\Delta M_\tau=0} for any cadlag bounded martingale M. Then, {{\mathbb E}[\Delta M_\tau]=0} and Theorem 1 says that {\tau} is predictable.

For the second statement, suppose that {X_{\tau-}\not=X_\tau} whenever {\tau < \infty}, almost surely. Then, as shown above, {X_{\sigma-}=X_\sigma} (almost surely) for any predictable stopping time {\sigma} and, consequently, {{\mathbb P}(\sigma=\tau < \infty)=0}. So, {\tau} is totally inaccessible. Conversely, suppose that {\tau} is totally inaccessible and set {A=\{X_{\tau-}=X_\tau\}\in\mathcal{F}_\tau}. Then, {\tau_A} is a stopping time for which {X_{\tau_A-}=X_{\tau_A}} whenever {\tau_A < \infty} and, hence, is predictable. So, {{\mathbb P}(A\cap\{\tau < \infty\})={\mathbb P}(\tau_A=\tau < \infty)=0}. Therefore, {X_{\tau-}\not=X_\tau} whenever {\tau < \infty} (almost surely). ⬜

One simple but surprising consequence of Theorem 9 is that, for Feller processes, the concepts of predictable and accessible stopping times actually coincide.

Corollary 10 Let {\{\mathcal{F}_t\}_{t\ge0}} be the complete filtration generated by a Feller process. Then, an {\mathcal{F}_\cdot}-stopping time is predictable if and only if it is accessible.

Proof: Let X be a cadlag version of the Feller process. If {\tau} is an accessible stopping time then the second statement of Theorem 9 says that {X_{\tau-}=X_\tau} whenever {\tau < \infty}. So, by the first statement of Theorem 9, {\tau} is predictable. ⬜

For continuous Feller processes, Theorem 9 simply states that, if X is a continuous process, then every stopping time is predictable. This gives the promised extension of Corollary 7 above to all continuous Feller processes.

Corollary 11 Let {\{\mathcal{F}_t\}_{t\ge0}} be the complete filtration generated by a continuous Feller process. Then, every {\mathcal{F}_\cdot}-stopping time is predictable.

Non-Right-Continuous Filtrations

In Theorem 1 given above, the right-continuity of the filtration was required in the proof that the third condition implies the fourth. However, right-continuity is not required for the result to hold, and I will give an extension to the non-right-continuous case here. The idea is that we can apply Theorem 1 under the right-continuous filtration {\mathcal{F}_{t+}\equiv\bigcap_{s > t}\mathcal{F}_s} and show that both the third and fourth statements of the theorem are unchanged under replacing {\mathcal{F}_{t+}} by {\mathcal{F}_t}. First, we re-state the following simple lemma which was proven in the post on the Bichteler-Dellacherie theorem.

Lemma 12 Suppose that M is a cadlag and square-integrable {\mathcal{F}_{t+}}-martingale. Then, there exists a countable subset {S\subset{\mathbb R}_+} such that {{\mathbb P}(\Delta M_t\not=0)=0} for all {t\in{\mathbb R}_+\setminus S}. Furthermore, {\int1_{{\mathbb R}_+\setminus S}\,dM} is an {\mathcal{F}_t}-martingale.

We can now give the proof that 3 implies 4 in Theorem 1 without the assumption that the filtration is right-continuous. So, suppose that 3 holds. Then, for any cadlag bounded {\mathcal{F}_{t+}}-martingale M, let S be the countable set of times at which {{\mathbb P}(\Delta M_t\not=0) > 0}. Letting {N=\int1_{{\mathbb R}_+\setminus S}\,dM} then, as this is an {\mathcal{F}_t}-martingale, we have {{\mathbb E}[\Delta N_\tau]=0}. Next, for any fixed time t, if {U} is an {\mathcal{F}_t}-measurable and bounded random variable, then {\tilde N\equiv (U-{\mathbb E}[U\vert\mathcal{F}_{t-}])1_{[t,\infty)}} is easily seen to be a martingale. So, by assumption,

\displaystyle  {\mathbb E}\left[1_{\{\tau=t\}}(U-{\mathbb E}[U\vert\mathcal{F}_{t-}])\right]={\mathbb E}[\Delta\tilde N_\tau]=0.

This implies that {\{\tau=t\}} is in {\mathcal{F}_{t-}}. Hence, {{\mathbb E}[1_{\{\tau=t\}}\Delta M_t]=0}. Putting this together gives

\displaystyle  {\mathbb E}[\Delta M_\tau]={\mathbb E}[\Delta N_\tau]+\sum_{t\in S}{\mathbb E}[1_{\{\tau=t\}}\Delta M_t] =0.

So, {\tau} satisfies condition 3 with respect to the right-continuous filtration {\mathcal{F}_{t+}}. Applying Theorem 1 in this case shows that {\tau} is predictable with respect to {\mathcal{F}_{t+}}. As shown previously, this implies that it is predictable with respect to the original filtration {\mathcal{F}_t}.

Notes on the Proof of Theorem 1

It is worth pausing here to consider the technical difficulties which had to be overcome in the proof of Theorem 1 above. The equivalence of statements 2 and 3 is easy to show without applying any advanced techniques, as is the fact that 4 implies 1. The proof that the third statement implies the fourth (fair stopping times are announceable) was a bit trickier to show but, still, constructing the sequence of stopping times announcing {\tau} was achieved without too much difficulty, although extending the result to non-right-continuous filtrations gets a bit messy.

The proof that the first statement implies the second (predictable times are fair) is perhaps the most technically demanding part of the proof of Theorem 1, so I will discuss some of the various approaches in this section. We managed to deal with this very efficiently in this post by making use of the identity

\displaystyle  \int1_{[\tau]}\,dM=\Delta M_\tau1_{[\tau,\infty)} (3)

and using the fact that stochastic integration preserves the local martingale property. Although (3) seems intuitively obvious by thinking about the integral in a pathwise sense, and is easy to prove for Riemann-Stieltjes integrals, it is much harder to show that the stochastic integral satisfies this identity. It does not follow easily from the defining properties of the stochastic integral — namely the bounded or dominated convergence theorem and the explicit expression for elementary integrands. Instead, we had to make use of the result that stochastic integrals coincide on any event for which the integrands coincide. This does seem like a simple enough statement, which we would expect to hold. However, the proof of this result required showing that semimartingales remain as semimartingales when the filtration is enlarged by adding a set to {\mathcal{F}_0}. This, in turn, required the characterization of semimartingales in terms of boundedness in probability of elementary integrals, which was rather demanding to prove, and was restated as part of the Bichteler-Dellacherie theorem. So, the seemingly simple statement that 1 implies 2 in Theorem 1 actually required some rather advanced stochastic calculus, and we would have been hard-pressed to give a short proof in these notes before the stochastic integral had been developed.

It is interesting to compare with the proof given in Rogers & Williams (Diffusions, Markov Processes, and Martingales, Volume 2, §VI.16.4-13), where the implication is denoted by P ⇒ F (predictable implies fair). They open with the following paragraph.

“Proof that P ⇒ F. If you wish to understand the subject properly, you need to understand the proof of the section theorems, and this proof that P ⇒ F, much of which is based on §IV.76 of Dellacherie and Meyer, is a good introduction to the methods required. We did try for some time to find a quick proof that P ⇒ F, but though many `proofs’ would immediately spring to the mind of anyone familiar with stochastic-integral theory, they all presuppose that P ⇒ A (although it sometimes takes a little thought to spot exactly where!).”

It seems likely that the proofs mentioned that would immediately spring to mind are those based on identity (3). If it is assumed that the stopping time {\tau} is announceable, then this identity is easy to prove (hence, the presupposition that P ⇒ A). Fortunately, the approach that has been taken in these notes means that we were able to prove (3) without any such presupposition, so we could give a quick stochastic integration based proof that {[\tau]\in\mathcal{P}} implies that {\tau} is fair.

An alternative proof of the result is to apply the section theorems, as suggested by Rogers & Williams in the quote above. The section theorems are very powerful results on which much of the historical development of stochastic calculus depended, although their proofs are rather demanding and are based on descriptive set theory and analytic sets. The predictable section theorem in particular implies that if {[\tau]} is a predictable set then there exist predictable stopping times {\tau_n} with {[\tau_n]\subseteq[\tau]} and {{\mathbb P}(\tau_n < \infty)} tending to {{\mathbb P}(\tau < \infty)}. This implies that {\tau} is predictable (i.e., that it is announceable), giving a proof that statement 1 implies 4 in Theorem 1. The proof given by Rogers & Williams does not use the section theorems themselves, but does involve ideas from their proofs, and is essentially based on the Choquet capacity theorem. However, in these notes I have taken a different approach to stochastic calculus, attempting to develop the stochastic integral in a more direct and slightly more intuitive way which avoids any use of the section theorems.

There do exist other methods of proving that stopping times satisfying {[\tau]\in\mathcal{P}} are fair which avoid the use both of identity (3) and of the section theorems. For example, Metivier & Pellaumail (Stochastic Integration, 1980) give a relatively direct proof of the Doob-Meyer decomposition theorem without invoking any major machinery. In particular, this implies that every integrable increasing predictable process A and bounded cadlag martingale M with {M_0=0} satisfy the identity

\displaystyle  {\mathbb E}\left[\int_0^t M_{s-}\,dA_s\right]={\mathbb E}[M_t A_t].

Applying this to the predictable process {A=1_{[\tau,\infty)}} implies that {\tau} is fair. More details on this will be mentioned in a later post on compensators.
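Spelling out that application (my own expansion of the hint above): with {A=1_{[\tau,\infty)}}, which is predictable exactly when {[\tau,\infty)\in\mathcal{P}}, the integral picks out the left limit of M at {\tau}, so the identity reads

\displaystyle  {\mathbb E}\left[M_{\tau-}1_{\{\tau\le t\}}\right]={\mathbb E}[M_tA_t]={\mathbb E}\left[M_\tau1_{\{\tau\le t\}}\right],

using optional sampling for the final equality. So, {{\mathbb E}[\Delta M_\tau1_{\{\tau\le t\}}]=0} and, letting t increase to infinity, bounded convergence gives {{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0}. That is, {\tau} is fair.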

12 thoughts on “Predictable Stopping Times”

  1. Hi,

    So first,

    In your proof of Theorem 1, in the “2 implies 3” part, you mention the fact that if a local martingale M is L^1-dominated then it is a true martingale. As I couldn’t find this precise statement in your notes, would it be right to say the following?

    As L^1-domination entails that M is locally integrable (by hypothesis \exists X \in L^1 such that \sup_{s \ge 0} |M_s| \le X), which in turn implies that M is of class DL (see Lemma 9 in your post “Localization”), and so the local martingale M is a true martingale (by Theorem 1 in your post on “Local Martingales”). Moreover, here N is uniformly integrable (or U.I.), as an L^1-dominated family of random variables is U.I. So you can say that N_\infty exists almost surely and use it in the equality following the statement about the martingale property of N.

    Maybe a more elementary proof exists but I missed it; otherwise, if the result exists on your blog, you could consider hyperlinking it with the claim.

    Best regards

    1. I think you’ve more or less got the argument, except for the locally integrable part. Local integrability is too weak to imply that it is a true martingale. In fact, all local martingales are locally integrable anyway. Lemma 9 in the post mentioned only implies that it is locally of class (DL), not actually of class (DL) itself.

      Instead, if M is L1-dominated (by X), then the set of random variables {M_T: T is a finite stopping time} must also be dominated by X, so uniformly integrable. Hence, M is of class (D) (so also of class (DL)). Now use Theorem 1 from the post on local martingales. And, class (D) implies that it is UI, so N_\infty exists by martingale convergence, and the martingale property E[N_t|\mathcal{F}_s] = N_s also holds for t = ∞.
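      In symbols, the uniform integrability step is just the following: since |M_T| \le X for every finite stopping time T, the event \{|M_T|>K\} is contained in \{X>K\}, so

      \displaystyle  \sup_T{\mathbb E}\left[|M_T|1_{\{|M_T|>K\}}\right]\le{\mathbb E}\left[X1_{\{X>K\}}\right]\rightarrow0

      as K tends to infinity, uniformly in T. Hence the family {M_T} is uniformly integrable, which is exactly the class (D) property.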

      I tend not to spell out all the small steps once it gets down to just what is relatively standard manipulations that occur all over the place. Maybe in some places these steps are a bit big, depending on how familiar or not you are with this stuff. If I can make it easier to follow without making arguments which I think should be short and direct seem long and complicated, then I will (when I have time to go back and clean up posts).

      1. Hi, first thank you for responding so fast,

        I got your point.

        Regarding spelling out small steps, well I think it depends on the size of one’s legs, and some kind of human factor is involved in the process.

        I know there is always a trade-off between a synthesized, polished exposition of a mathematical proof and an elementary, exhaustive but tedious rigorous demonstration, as “elegance” is a determining factor in the process.

        At this game, you are particularly gifted in my opinion, but as it is generally hard for highly skilled people to see what is hard to get for not-as-smart guys, I only point out from time to time those steps which are sometimes too high for me.

        Best regards

  2. Hi,


    I think there is a typo in “3 implies 4”. You wrote :

    \mathbb{E}[f(\tau_n)]=\mathbb{E}[M_{\tau_n}]\le \mathbb{E}[f(\tau)+1/n]

    Where I think it is :

    \mathbb{E}[f(\tau)]=\mathbb{E}[M_{\tau_n}] \le \mathbb{E}[f(\tau_n)+1/n]

    Shortly after that, using M_t \ge f(t) you show that :

    \Delta M_\tau  = f(\tau) - M_{\tau-} \le f(\tau) - f(\tau-)=0 (by continuity of f)

    Besides, I don’t see why condition 3 is then required to ensure that \Delta M_\tau is almost surely 0.

    Best regards

    1. Yes, regarding the typo, you’re correct. I fixed this, thanks.

      Condition 3 is certainly required to ensure that ΔM is almost surely zero, otherwise it would prove that every stopping time is predictable!

      What happens if you try the same argument for a non-predictable stopping time is that ΔM can be strictly negative. Then, with positive probability, you have M_\tau < M_{\tau-} - 1/n for some n, and then \tau_m=\tau for m ≥ n, so the sequence \tau_n does not announce \tau strictly from below.

  3. Hi,

    For the proof of Lemma 3, unless I missed something, I think that the argument is a little too fast: \sigma_n is non-decreasing, but an inappropriate choice of \tau_{m,n} can force \sigma_n to be “constant” (meaning that it is the same random variable over and over). So a proper choice of the index n (depending on m and \omega) has to be made, or alternatively some condition on the construction of \tau_{m,n} has to be imposed for every m, for example \tau_{m,1}>\tau_{m-1} almost surely (I think this is possible with no harm).

    In the proof of Lemma 4, for the sake of completeness, you might also consider hyperlinking, at the end of the “1 implies 4” argument, the sentence
    “Also, as X is predictable, \Delta X_{\tau_n} will be \mathcal{F}_{\tau_{n-}}-measurable,” to Lemma 1 of the post “Sigma Algebras at a Stopping Time” (it appears also in the “5 implies 3” argument).

    Hope this is not too much, but as I am almost at the end of my reading of your notes, the rhythm of these comments should soon slow down dramatically.

    Best regards

    1. Regarding Lemma 3. Yes, there was a mistake, which I’ve fixed. The correct choice of \sigma_n is actually much easier than you might expect when you first think about it (I think this is what I had in my head when I wrote this post, but it came out wrong).

      I added the links you suggested.

      And, no, it’s not too much at all! I like having feedback on these notes. Any comments which help me improve them are much appreciated. Besides, I never normally know if anyone has actually read through the details of the more in-depth proofs. You’ve almost read through all of my notes? That’s pretty good going!

      1. Hi,

        Yes, I must admit I did read almost all of them, and I had great pleasure doing so. As you don’t feel annoyed by my comments, I will keep making them when I feel I have to.

        Best regards

  4. Hi,
    I’ve got a question. I take a continuous process and its natural filtration. Is it possible to show that all optional processes are predictable with respect to my filtration?
    Best regards

  5. Hi, I am asking what the application of the indicator function of a stopping time is, and also why the indicator function is used in writing jump processes.

    1. I am not sure precisely what your question is, but 1_{[\tau,\infty)} is the process equal to zero before the stopping time \tau and equal to 1 at and after the stopping time, so it is a very basic jump process.
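If it helps to see this concretely, here is a minimal discrete-time sketch (the function name and parameters are made up for illustration) of the process 1_{[\tau,\infty)}, where \tau is taken to be a hitting time of a simple random walk:

```python
import random

def indicator_jump_path(n_steps, level, seed):
    """Simulate a simple random walk, let tau be the first time it reaches
    `level` (a hitting time, hence a stopping time), and return the sample
    path of 1_{[tau, infinity)}: 0 strictly before tau, 1 at and after it."""
    rng = random.Random(seed)
    walk, tau, path = 0, None, []
    for t in range(n_steps):
        if tau is None and walk >= level:
            tau = t  # first hitting time of the level
        path.append(0 if tau is None else 1)
        walk += rng.choice([-1, 1])
    return tau, path

tau, path = indicator_jump_path(n_steps=500, level=5, seed=1)
```

The resulting path is non-decreasing, takes only the values 0 and 1, and has at most one jump, of size 1, occurring exactly at time \tau — which is why this indicator is the basic building block of jump processes.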
