Properties of Optional and Predictable Projections

Having defined optional and predictable projections in an earlier post, I now look at their basic properties. The first nontrivial property is that they are well-defined in the first place. Recall that existence of the projections made use of the existence of cadlag modifications of martingales, and uniqueness relied on the section theorems. By contrast, once we accept that optional and predictable projections are well-defined, everything in this post follows easily. Nothing here requires any further advanced results of stochastic process theory.

Optional and predictable projections are similar in nature to conditional expectations. Given a probability space {(\Omega,\mathcal F,{\mathbb P})} and a sub-sigma-algebra {\mathcal G\subseteq\mathcal F}, the conditional expectation of an ({\mathcal F}-measurable) random variable X is a {\mathcal G}-measurable random variable {Y={\mathbb E}[X\,\vert\mathcal G]}. This is defined whenever the integrability condition {{\mathbb E}[\lvert X\rvert\,\vert\mathcal G] < \infty} (a.s.) is satisfied, depends on X only up to almost-sure equivalence, and determines Y up to almost-sure equivalence. That is, a random variable {X^\prime} almost surely equal to X has the same conditional expectation as X and, similarly, a random variable {Y^\prime} almost surely equal to Y is also a version of the conditional expectation {{\mathbb E}[X\,\vert\mathcal G]}.
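
To make this concrete, here is a minimal sketch (in Python) of conditional expectation on a finite probability space, where a sub-sigma-algebra {\mathcal G} corresponds to a partition of {\Omega} and {{\mathbb E}[X\,\vert\mathcal G]} is computed by averaging X over each cell. The space, partition and values are all made up purely for illustration.

```python
import numpy as np

# Finite probability space: Omega = {0,...,5} with the given probabilities.
probs = np.array([0.1, 0.2, 0.1, 0.3, 0.2, 0.1])

# A sub-sigma-algebra G corresponds to a partition of Omega into cells.
partition = [[0, 1], [2, 3], [4, 5]]

# An F-measurable random variable X, given by its values on Omega.
X = np.array([1.0, 3.0, -2.0, 4.0, 0.5, 2.5])

def cond_exp(x, partition, probs):
    """E[x | G]: constant on each cell, equal to the cell's weighted average."""
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

print(cond_exp(X, partition, probs))  # G-measurable: constant on each cell

# Changing X on a null set (a point of zero probability) would leave the
# result unchanged, illustrating dependence on X only up to a.s. equivalence.
```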

The setup with projections of stochastic processes is similar. We start with a filtered probability space {(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})}, and a (real-valued) stochastic process is a map

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle X\colon{\mathbb R}^+\times\Omega\rightarrow{\mathbb R},\smallskip\\ &\displaystyle (t,\omega)\mapsto X_t(\omega) \end{array}

which we assume to be jointly measurable. That is, it is measurable with respect to the Borel sigma-algebra {\mathcal B({\mathbb R})} on the image, and the product sigma-algebra {\mathcal B({\mathbb R}^+)\otimes\mathcal F} on the domain. The optional and predictable sigma-algebras are contained in the product,

\displaystyle  \mathcal P\subseteq\mathcal O\subseteq \mathcal B({\mathbb R}^+)\otimes\mathcal F.

We do not have a reference measure on {({\mathbb R}^+\times\Omega,\mathcal B({\mathbb R}^+)\otimes\mathcal F)} in order to define conditional expectations with respect to {\mathcal O} and {\mathcal P}. However, the optional projection {{}^{\rm o}\!X} and predictable projection {{}^{\rm p}\!X} play similar roles. Assuming that the necessary integrability properties are satisfied, the projections exist. Furthermore, the projection only depends on the process X up to evanescence (i.e., up to a set whose projection onto {\Omega} has zero probability), and {{}^{\rm o}\!X} and {{}^{\rm p}\!X} are uniquely defined up to evanescence.

In what follows, we work with respect to a complete filtered probability space. Processes are always only considered up to evanescence, so statements involving equalities, inequalities, and limits of processes are only required to hold outside of a zero probability set. When we say that the optional projection of a process exists, we mean that the integrability condition in the definition of the projection is satisfied. Specifically, that {{\mathbb E}[1_{\{\tau < \infty\}}\lvert X_\tau\rvert\,\vert\mathcal F_\tau]} is almost surely finite for each stopping time {\tau}. Similarly for the predictable projection, with predictable stopping times {\tau} and {\mathcal F_{\tau-}} in place of {\mathcal F_\tau}.
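
As a toy illustration of the role played by the projections, the following Python sketch computes the discrete-time analogue {{}^{\rm o}\!X_n={\mathbb E}[X_n\,\vert\mathcal F_n]} of the optional projection on a finite probability space, representing the filtration by refining partitions. All values here are made up, and nothing in this discrete setting captures the continuous-time subtleties (cadlag modifications, section theorems) mentioned above.

```python
import numpy as np

# Four equally likely sample points.
probs = np.array([0.25, 0.25, 0.25, 0.25])

# Discrete filtration F_0, F_1, F_2, each given by a partition of Omega.
filtration = [
    [[0, 1, 2, 3]],        # F_0: trivial
    [[0, 1], [2, 3]],      # F_1
    [[0], [1], [2], [3]],  # F_2: full information
]

# A jointly measurable, not necessarily adapted, process: rows are times.
X = np.array([
    [1.0, -1.0, 2.0, 0.0],  # X_0 varies over Omega, so is not F_0-measurable
    [0.5, 0.5, -1.0, 3.0],
    [2.0, 1.0, 0.0, -2.0],
])

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

# Discrete-time analogue of the optional projection: oX_n = E[X_n | F_n].
oX = np.array([cond_exp(X[n], filtration[n], probs) for n in range(len(X))])
print(oX)  # row n is F_n-measurable
```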

The following lemma gives a list of initial properties of the optional projection. Other than the statement involving stopping times, they all correspond to properties of conditional expectations.

Lemma 1

  1. X is optional if and only if {{}^{\rm o}\!X} exists and is equal to X.
  2. If the optional projection of X exists then,
    \displaystyle  {}^{\rm o}({}^{\rm o}\!X)={}^{\rm o}\!X. (1)
  3. If the optional projections of X and Y exist, and {\lambda,\mu} are {\mathcal{F}_0}-measurable random variables, then,
    \displaystyle  {}^{\rm o}(\lambda X+\mu Y) = \lambda\,^{\rm o}\!X + \mu\,^{\rm o}Y. (2)
  4. If the optional projection of X exists and U is an optional process then,
    \displaystyle  {}^{\rm o}(UX) = U\,^{\rm o}\!X. (3)
  5. If the optional projection of X exists and {\tau} is a stopping time then, the optional projection of the stopped process {X^\tau} exists and,
    \displaystyle  1_{[0,\tau]}{}^{\rm o}(X^\tau)=1_{[0,\tau]}{}^{\rm o}\!X. (4)
  6. If {X\le Y} and the optional projections of X and Y exist then, {{}^{\rm o}\!X\le{}^{\rm o}Y}.

Before proceeding with the proof of the lemma, I briefly note that, throughout this post, stochastic processes will be assumed to take the value 0 at time {\infty}. This merely saves a bit of writing, as we can write {X_\tau} in place of {1_{\{\tau < \infty\}}X_\tau}.

Proof: For the first statement, {{}^{\rm o}\!X} is optional by definition, so X is optional if {X={}^{\rm o}\!X}. Conversely, if X is optional then {X_\tau} is {\mathcal{F}_\tau}-measurable for each stopping time {\tau}. So, {{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal{F}_\tau]=\lvert X_\tau\rvert} is finite, and the optional projection of X exists. Similarly, {{\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]=X_\tau}, so the optional projection of X is equal to itself.

For the second statement, {{}^{\rm o}\!X} is optional, so (1) follows from the first statement with {{}^{\rm o}\!X} in place of X.

Moving on to the third statement, linearity of conditional expectations gives

\displaystyle  {\mathbb E}[\lvert\lambda X_\tau+\mu Y_\tau\rvert\,\vert\mathcal{F}_\tau] \le\lvert\lambda\rvert{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal{F}_\tau]+\lvert\mu\rvert{\mathbb E}[ \lvert Y_\tau\rvert\,\vert\mathcal{F}_\tau]

which is almost surely finite if the optional projections of X and Y exist. So, the optional projection of {\lambda X+\mu Y} exists. Similarly,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[\lambda X_\tau+\mu Y_\tau\,\vert\mathcal{F}_\tau] &\displaystyle =\lambda{\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]+\mu{\mathbb E}[Y_\tau\,\vert\mathcal{F}_\tau]\smallskip\\ &\displaystyle =\lambda{}^{\rm o}\!X_\tau+\mu{}^{\rm o}Y_\tau. \end{array}

As {\lambda{}^{\rm o}\!X+\mu{}^{\rm o}Y} is optional, this shows that it is the optional projection of {\lambda X + \mu Y}.

For the fourth statement, as U is optional, {U_\tau} is {\mathcal{F}_\tau}-measurable. So,

\displaystyle  {\mathbb E}[\lvert U_\tau X_\tau\rvert\,\vert\mathcal{F}_\tau]=\lvert U_\tau\rvert{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal{F}_\tau]

is almost surely finite, and the optional projection of {UX} exists. Also,

\displaystyle  {\mathbb E}[U_\tau X_\tau\,\vert\mathcal{F}_\tau]=U_\tau{\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]=U_\tau{}^{\rm o}\!X_\tau.

As {U\,{}^{\rm o}\!X} is optional, this shows that it is the optional projection of {UX}.

For the fifth statement, consider a stopping time {\sigma}. As the optional projection of X exists,

\displaystyle  {\mathbb E}[{\mathbb E}[\lvert X^\tau_\sigma\rvert\,\vert\mathcal{F}_\sigma]\,\vert\mathcal{F}_{\tau\wedge\sigma}] ={\mathbb E}[\lvert X_{\tau\wedge\sigma}\rvert\,\vert\mathcal{F}_{\tau\wedge\sigma}]

is almost surely finite. So, {{\mathbb E}[\lvert X^\tau_\sigma\rvert\,\vert\mathcal{F}_\sigma]} is almost surely finite, and the optional projection of {X^\tau} exists. As {1_{[0,\tau]}} is left-continuous and adapted, it is predictable and, in particular, optional. Applying (3),

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle 1_{[0,\tau]}{}^{\rm o}(X^\tau) &\displaystyle ={}^{\rm o}(1_{[0,\tau]}X^\tau)={}^{\rm o}(1_{[0,\tau]}X)\smallskip\\ &\displaystyle =1_{[0,\tau]}{}^{\rm o}\!X. \end{array}

This chain of equalities is precisely (4).

For the final statement, if {X \le Y} then, by monotonicity of conditional expectations,

\displaystyle  {}^{\rm o}\!X_\tau={\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]\le{\mathbb E}[Y_\tau\,\vert\mathcal{F}_\tau]={}^{\rm o}Y_\tau

almost surely. By optional section, {{}^{\rm o}\!X \le {}^{\rm o}Y}. ⬜
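
As a quick numerical sanity check of property (3), the following sketch verifies the slice-wise identity {{\mathbb E}[U_\tau X_\tau\,\vert\mathcal F_\tau]=U_\tau{\mathbb E}[X_\tau\,\vert\mathcal F_\tau]} on a finite probability space, where {\mathcal F_\tau}-measurability of {U_\tau} corresponds to being constant on the cells of a partition. The values are arbitrary.

```python
import numpy as np

probs = np.array([0.25, 0.25, 0.25, 0.25])
F_tau = [[0, 1], [2, 3]]                  # F_tau, as a partition of Omega

X_tau = np.array([1.0, -2.0, 3.0, 0.5])   # the time-tau slice of X
U_tau = np.array([2.0, 2.0, -1.0, -1.0])  # F_tau-measurable: constant on cells

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

# E[U X | F_tau] = U E[X | F_tau] when U is F_tau-measurable: property (3).
lhs = cond_exp(U_tau * X_tau, F_tau, probs)
rhs = U_tau * cond_exp(X_tau, F_tau, probs)
assert np.allclose(lhs, rhs)
print(lhs)
```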

Unsurprisingly, the properties listed above also apply to predictable projections.

Lemma 2

  1. X is predictable if and only if {{}^{\rm p}\!X} exists and is equal to X.
  2. If the predictable projection of X exists then,

    \displaystyle  {}^{\rm p}({}^{\rm p}\!X)={}^{\rm p}\!X.

  3. If the predictable projections of X and Y exist, and {\lambda,\mu} are {\mathcal{F}_0}-measurable random variables, then,

    \displaystyle  {}^{\rm p}(\lambda X+\mu Y) = \lambda\,^{\rm p}\!X + \mu\,^{\rm p}Y.

  4. If the predictable projection of X exists and U is a predictable process then,

    \displaystyle  {}^{\rm p}(UX) = U\,^{\rm p}\!X.

  5. If the predictable projection of X exists and {\tau} is a stopping time then, the predictable projection of {X^\tau} exists and,

    \displaystyle  1_{[0,\tau]}{}^{\rm p}(X^\tau)=1_{[0,\tau]}{}^{\rm p}\!X.

  6. If {X\le Y} and the predictable projections of X and Y exist then, {{}^{\rm p}\!X\le{}^{\rm p}Y}.

Proof: Each statement follows in exactly the same way as for Lemma 1, replacing `optional’ by `predictable’, `stopping time’ by `predictable stopping time’, and the sigma-algebras {\mathcal{F}_\tau} by {\mathcal{F}_{\tau-}}. ⬜

Next, for sigma-algebras {\mathcal G\subseteq\mathcal H\subseteq\mathcal F}, the tower rule for conditional expectations states that

\displaystyle  {\mathbb E}[X\,\vert\mathcal G] ={\mathbb E}[{\mathbb E}[X\,\vert\mathcal H]\,\vert\mathcal G] ={\mathbb E}[{\mathbb E}[X\,\vert\mathcal G]\,\vert\mathcal H].

Similarly for the projections of a stochastic process, we have the following.

Lemma 3 If the predictable and optional projections of X exist then,

\displaystyle  {}^{\rm p}\!X = {}^{\rm p}({}^{\rm o}\!X) = {}^{\rm o}({}^{\rm p}\!X).

Proof: As {{}^{\rm p}\!X} is predictable, it is optional, so the equality {{}^{\rm p}\!X={}^{\rm o}({}^{\rm p}\!X)} is immediate. For any predictable stopping time {\tau},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[\lvert{}^{\rm o}\!X_\tau\rvert\,\vert\mathcal{F}_{\tau-}] &\displaystyle = {\mathbb E}\left[\left\lvert{\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]\right\rvert\,\vert\mathcal{F}_{\tau-}\right]\smallskip\\ &\displaystyle \le{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal{F}_{\tau-}]. \end{array}

As the predictable projection exists, this is almost surely finite. So, the predictable projection of {{}^{\rm o}\!X} exists and,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[{}^{\rm o}\!X_\tau\,\vert\mathcal{F}_{\tau-}] &\displaystyle = {\mathbb E}\left[{\mathbb E}[X_\tau\,\vert\mathcal{F}_\tau]\,\vert\mathcal{F}_{\tau-}\right]\smallskip\\ &\displaystyle ={\mathbb E}[ X_\tau\,\vert\mathcal{F}_{\tau-}]. \end{array}

So, {{}^{\rm p}({}^{\rm o}\!X)={}^{\rm p}\!X}. ⬜
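
The computation at the heart of lemma 3 is just the tower rule applied at each predictable stopping time. As an illustration, the following sketch checks it numerically on a finite probability space, with a coarse partition standing in for {\mathcal F_{\tau-}} and a finer one for {\mathcal F_\tau}; all values are made up.

```python
import numpy as np

probs = np.array([0.25, 0.25, 0.25, 0.25])
F_minus = [[0, 1], [2, 3]]     # coarse partition, standing in for F_{tau-}
F_tau = [[0], [1], [2, 3]]     # finer partition, standing in for F_tau

X_tau = np.array([1.0, -2.0, 3.0, 0.5])  # the time-tau slice of the process

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

# Tower rule: E[E[X|F_tau] | F_{tau-}] = E[X | F_{tau-}], i.e. p(oX) = pX.
lhs = cond_exp(cond_exp(X_tau, F_tau, probs), F_minus, probs)
rhs = cond_exp(X_tau, F_minus, probs)
assert np.allclose(lhs, rhs)
print(lhs)
```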

As with expectations and conditional expectations, the projections satisfy dominated convergence.

Lemma 4 (Dominated Convergence) Let {\{X^n\}_{n=1,2,\ldots}} be a sequence of processes such that {X^n\rightarrow X} pointwise as n goes to infinity, and Y be a process with {\lvert X^n\rvert\le Y} for all n.

  • If the optional projection of Y exists, then {{}^{\rm o}\!X^n\rightarrow{}^{\rm o}\!X}.
  • If the predictable projection of Y exists, then {{}^{\rm p}\!X^n\rightarrow{}^{\rm p}\!X}.

Proof: It is standard that the set on which a sequence of random variables converges to a given limit is measurable. Specifically, a sequence {X^n} converges to a limit X if and only if, for each {\epsilon > 0}, there exists a positive integer m such that {\lvert X^n-X\rvert < \epsilon} for all {n\ge m}. It is enough to consider {\epsilon = 1/r}, so that,

\displaystyle  \left\{X^n\rightarrow X\right\} =\bigcap_{r=1}^\infty\bigcup_{m=1}^\infty\bigcap_{n=m}^\infty\left\{\lvert X^n-X\rvert < 1/r\right\}.

Applying this to the optional projections, {S\equiv\{{}^{\rm o}\!X^n\rightarrow{}^{\rm o}\!X\}} is optional. For any stopping time {\tau}, applying dominated convergence to the conditional expectations,

\displaystyle  {}^{\rm o}\!X^n_\tau={\mathbb E}[X^n_\tau\,\vert\mathcal F_\tau]\rightarrow{\mathbb E}[X_\tau\,\vert\mathcal F_\tau]={}^{\rm o}\!X_\tau

almost surely. So {\tau\in S} whenever {\tau < \infty} (a.s.). This means that {1_S{}^{\rm o}\!X^n} and {1_S{}^{\rm o}\!X} are optional projections of {X^n} and X respectively. Hence, by uniqueness of optional projections,

\displaystyle  {}^{\rm o}\!X^n=1_S{}^{\rm o}\!X^n\rightarrow1_S{}^{\rm o}\!X={}^{\rm o}\!X

up to evanescence.

The same argument applies to predictable projections, replacing `optional’ by `predictable’, `stopping time’ by `predictable stopping time’, and {\mathcal F_\tau} by {\mathcal F_{\tau-}}. ⬜
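
For a numerical illustration of the conditional form of dominated convergence used in the proof, the sketch below takes a dominated sequence {X^n\rightarrow X} on a finite probability space and shows the conditional expectations converging. The particular sequence is made up for illustration.

```python
import numpy as np

probs = np.array([0.5, 0.3, 0.2])
partition = [[0, 1], [2]]              # the conditioning sigma-algebra G

X = np.array([1.0, -1.0, 2.0])         # the limit variable
Y = np.array([2.0, 2.0, 3.0])          # dominating variable: |X_n| <= Y below

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

target = cond_exp(X, partition, probs)
for n in [1, 10, 100, 1000]:
    X_n = X + np.sin(n * X) / n        # X_n -> X pointwise, and |X_n| <= Y
    err = np.max(np.abs(cond_exp(X_n, partition, probs) - target))
    print(n, err)                      # errors shrink as n grows
```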

We now look at special cases where the projections can be described explicitly. We use {M_-} to denote the left-limits of a process, {M_{t-}=\lim_{s\uparrow\uparrow t}M_s}.

Lemma 5 Suppose that X is the constant process {X_t=U}, for some integrable random variable U. Then, the optional and predictable projections of X exist and,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle {}^{\rm o}\!X=M,\smallskip\\ &\displaystyle {}^{\rm p}\!X=M_-. \end{array}

Here, M is the martingale defined by {M_t={\mathbb E}[U\,\vert\mathcal F_t]}. In the case where the underlying filtration is right-continuous, we choose the cadlag version of M. More generally, if a cadlag version does not exist, we can take M to be the version with left and right limits everywhere, and right-continuous outside a countable subset of {{\mathbb R}^+}.

Proof: This is just a restatement of lemma 7 of the post on the projection theorems, where it was used to prove the existence of optional and predictable projections. ⬜
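
To see lemma 5 in action, the following sketch constructs the discrete-time analogue on a finite coin-flip space: U is a function of the full path, and {M_n={\mathbb E}[U\,\vert\mathcal F_n]} is computed by averaging over paths sharing the first n flips. The choice of U (the number of heads) is arbitrary.

```python
import itertools
import numpy as np

# Omega: all length-3 coin-flip paths, with the uniform probability measure.
paths = list(itertools.product([0, 1], repeat=3))
probs = np.full(len(paths), 1.0 / len(paths))

# An integrable random variable U: here, the number of heads on the path.
U = np.array([float(sum(p)) for p in paths])

def partition_at(n):
    """F_n is generated by the first n flips: group paths by their prefix."""
    cells = {}
    for i, p in enumerate(paths):
        cells.setdefault(p[:n], []).append(i)
    return list(cells.values())

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

# M_n = E[U | F_n]: the projection of the constant process X_n = U.
M = np.array([cond_exp(U, partition_at(n), probs) for n in range(4)])
print(M)  # M_0 is constant 1.5, while M_3 equals U itself
```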

The predictable projection of a local martingale is given by its left limits.

Lemma 6 If M is a local martingale then its predictable projection exists and,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle {}^{\rm p}\!M=M_-,\smallskip\\ &\displaystyle {}^{\rm p}\!\Delta M=0. \end{array}

Proof: We proved this previously in the post on constructing martingales with prescribed jumps. However, the result was established there by an application of a much stronger statement. As the result is really quite elementary, I will give a more direct proof here.

First, as {M_-} is left-continuous and adapted, it is predictable and, so, {{}^{\rm p}\!M_-=M_-}. Next, as M is locally integrable, the conditional expectation {{\mathbb E}[\lvert M_\tau\rvert\,\vert\mathcal F_{\tau-}]} is almost surely finite for each predictable stopping time {\tau}. So, the predictable projections of M and {\Delta M=M-M_-} exist.

Suppose now that M is a proper martingale, {\tau} is a bounded predictable stopping time, and {\tau_n} are stopping times announcing {\tau}. For any {m\ge1} and {A\in\mathcal F_{\tau_m}},

\displaystyle  {\mathbb E}[1_A\Delta M_\tau]=\lim_{n\rightarrow\infty}{\mathbb E}[1_A(M_\tau-M_{\tau_n})]=0. (5)

The first equality uses the fact that the sequence {M_{\tau_n}} is uniformly integrable (so that {1_A(M_\tau-M_{\tau_n})\rightarrow1_A\Delta M_\tau} in {L^1}), and the second equality is an application of optional sampling, which gives {{\mathbb E}[1_A(M_\tau-M_{\tau_n})]=0} for all {n\ge m} (so that {A\in\mathcal F_{\tau_n}}) and, hence, in the limit {n\rightarrow\infty}. As (5) holds for all A in {\bigcup_m\mathcal F_{\tau_m}}, which generates {\mathcal F_{\tau-}} as a sigma-algebra, the monotone class theorem implies that (5) holds for all {A\in\mathcal F_{\tau-}}. By definition, this means that {{\mathbb E}[\Delta M_\tau\,\vert\mathcal F_{\tau-}]=0} almost surely.

If M is a local martingale, let {\sigma_n\uparrow\infty} be a localizing sequence of stopping times, so that the stopped processes {M^{\sigma_n}} are proper martingales. Then, using dominated convergence for the conditional expectations, for a bounded predictable stopping time {\tau},

\displaystyle  {\mathbb E}\left[\Delta M_{\tau}\,\vert\mathcal F_{\tau-}\right] =\lim_{n\rightarrow\infty} {\mathbb E}\left[\Delta M^{\sigma_n}_{\tau}\,\vert\mathcal F_{\tau-}\right]=0.

Here, dominated convergence applies since {\Delta M^{\sigma_n}_\tau=1_{\{\tau\le\sigma_n\}}\Delta M_\tau} tends to {\Delta M_\tau} as n goes to infinity and is dominated by {\lvert\Delta M_\tau\rvert}, which has almost surely finite conditional expectation given {\mathcal F_{\tau-}}. Hence, {{}^{\rm p}\Delta M=0}. Finally,

\displaystyle  {}^{\rm p}\!M={}^{\rm p}(M_-+\Delta M)=M_-.

⬜
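
A discrete-time shadow of lemma 6 is the elementary martingale property {{\mathbb E}[M_n-M_{n-1}\,\vert\mathcal F_{n-1}]=0}, the analogue of {{}^{\rm p}\Delta M=0}. The sketch below checks this on the same toy coin-flip space as in the previous code snippet; the terminal value is again arbitrary.

```python
import itertools
import numpy as np

paths = list(itertools.product([0, 1], repeat=3))
probs = np.full(len(paths), 1.0 / len(paths))
U = np.array([float(sum(p)) for p in paths])  # arbitrary terminal value

def partition_at(n):
    cells = {}
    for i, p in enumerate(paths):
        cells.setdefault(p[:n], []).append(i)
    return list(cells.values())

def cond_exp(x, partition, probs):
    y = np.empty_like(x)
    for cell in partition:
        w = probs[cell]
        y[cell] = np.dot(w, x[cell]) / w.sum()
    return y

M = [cond_exp(U, partition_at(n), probs) for n in range(4)]  # a martingale

# Discrete analogue of pDelta M = 0: the jump M_n - M_{n-1} has zero
# conditional expectation given F_{n-1}.
for n in range(1, 4):
    jump = M[n] - M[n - 1]
    assert np.allclose(cond_exp(jump, partition_at(n - 1), probs), 0.0)
print("all jump projections vanish")
```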

Optional projections of progressively measurable processes are particularly simple, and always exist, without requiring any integrability conditions.

Lemma 7 If X is progressive then its optional projection exists and is the unique optional process satisfying {{}^{\rm o}\!X_\tau=X_\tau}, almost surely whenever {\tau < \infty}, for every stopping time {\tau}.

Proof: For any stopping time {\tau}, {X_\tau} is {\mathcal F_\tau}-measurable. Then, {{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal F_\tau]=\lvert X_\tau\rvert} is almost surely finite. So, from the definition, the optional projection exists and is the unique optional process satisfying

\displaystyle  {}^{\rm o}\!X_\tau = {\mathbb E}[X_\tau\,\vert\mathcal F_\tau]=X_\tau

almost surely, for all stopping times {\tau}. ⬜

Thin Processes

Recall that a subset of {{\mathbb R}^+\times\Omega} is thin if it is equal to the union of the graphs of a countable sequence of stopping times, and that a process X is thin if it is optional and {\{X\not=0\}} is a thin set. For example, for a cadlag adapted process X, its jump process {\Delta X} is thin. Lemma 6 above looked at one special case of the predictable projection of a thin process, specifically that the jump process of a local martingale has predictable projection equal to zero.

For the theory of optional and predictable projections, we are concerned with processes which are not necessarily adapted to the underlying filtration. So, we generalise the concept a bit. A subset S of {{\mathbb R}^+\times\Omega} will be called a raw thin set if it is the union of the graphs of a sequence of {\mathcal F}-measurable times {\tau_n\colon\Omega\rightarrow\bar{\mathbb R}^+},

\displaystyle  S=\bigcup_{n=1}^\infty[\tau_n]. (6)

A raw thin process X will refer to any jointly measurable process such that {\{X\not=0\}} is a raw thin set. We note that, in the definition of raw thin sets, it is enough for (6) to hold with the inclusion {\subseteq} in place of equality, and for S to be jointly measurable. In that case, we can replace the random times {\tau_n} by {\tilde\tau_n} defined such that {\tilde\tau_n(\omega)=\tau_n(\omega)} whenever {(\tau_n(\omega),\omega)\in S} and equal to {\infty} elsewhere. These are measurable, since S is jointly measurable, and we recover equality in (6).
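
The replacement of the {\tau_n} by the {\tilde\tau_n} is straightforward to carry out explicitly. In the toy Python sketch below, random times on a three-point sample space are stored as arrays of (possibly infinite) values, S is given by a membership test, and {\tilde\tau_n} is obtained by sending the time to infinity off S; the particular times and set are made up for illustration.

```python
import numpy as np

INF = np.inf

# Random times on a three-point sample space, as arrays indexed by omega;
# the value INF means the time is infinite at that sample point.
taus = [np.array([1.0, 2.0, INF]),
        np.array([1.0, 1.0, 3.0])]

# A jointly measurable set S, encoded here by a membership test on (t, omega).
S = {(1.0, 0), (3.0, 2)}

def in_S(t, omega):
    return np.isfinite(t) and (t, omega) in S

# tilde_tau_n agrees with tau_n where the graph point lies in S and is
# infinite elsewhere, so the union of the graphs of the tilde_tau_n is
# exactly S whenever S is contained in the union of the original graphs.
tilde_taus = [np.array([t if in_S(t, omega) else INF
                        for omega, t in enumerate(tau)])
              for tau in taus]
print(tilde_taus)  # graphs: {(1.0, 0)} and {(1.0, 0), (3.0, 2)}
```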

In the current situation, the relevant property is that optional and predictable projection preserves thinness.

Theorem 8 The optional and predictable projections of a raw thin process are thin, whenever they exist.

I do not prove this immediately. Instead, constructions of the projections of a raw thin process are given below, which demonstrate that they are thin. For now, we note the following corollary.

Corollary 9 If the optional and predictable projections of X exist then {{}^{\rm o}\!X-{}^{\rm p}\!X} is thin.

Proof: As {{}^{\rm o}\!X} is optional, it can be expressed as a sum {Y+H} of a predictable process Y and a thin process H. Then, the predictable process Y has predictable projection equal to itself and, by linearity, the predictable projection of {H={}^{\rm o}\!X-Y} exists. Using the identity {{}^{\rm p}\!X={}^{\rm p}({}^{\rm o}\!X)},

\displaystyle  {}^{\rm o}\!X-{}^{\rm p}\!X={}^{\rm o}\!X-Y-{}^{\rm p}\!H=H-{}^{\rm p}\!H.

Theorem 8 says that this is thin. ⬜

The first step in showing that the optional projection of a raw thin process X is thin is to find a sequence of stopping times {\tau_n} such that {{\mathbb P}(X_\tau\not=0,\tau\not=\tau_n\,\forall n)=0} for all stopping times {\tau}. It would then follow that {\{{}^{\rm o}\!X\not=0\}} is contained in {\bigcup_n[\tau_n]} and that {{}^{\rm o}\!X} is thin. For the predictable projection, we need to do the same thing but with `predictable stopping time’ replacing `stopping time’. Consider a collection {\mathcal T} of measurable times {\tau\colon\Omega\rightarrow\bar{\mathbb R}^+}. We will say that a jointly measurable subset S of {{\mathbb R}^+\times\Omega} is inaccessible by {\mathcal T} if {{\mathbb P}(\tau\in S)=0} for all {\tau\in\mathcal T}. The following simple result will be applied in the cases where {\mathcal T} is the collection of all stopping times, and where it is the collection of all predictable stopping times.

Lemma 10 Let {S\subseteq{\mathbb R}^+\times\Omega} be a raw thin set and {\mathcal T} be any collection of measurable times {\tau\colon\Omega\rightarrow\bar{\mathbb R}^+}. Then, there exists a sequence {\tau_n\in\mathcal T} such that {S\setminus\bigcup_n[\tau_n]} is inaccessible by {\mathcal T}.

Proof: As S is equal to {\bigcup_n[\sigma_n]} for a sequence {\sigma_n} of random times, we can define a finite measure {\mu} on {({\mathbb R}^+\times\Omega,\mathcal B({\mathbb R}^+)\otimes\mathcal F)} by

\displaystyle  \mu(A)=\sum_{n=1}^\infty2^{-n}{\mathbb P}(\sigma_n\in A).

Define

\displaystyle  \alpha = \sup\left\{\mu\left(\bigcup\nolimits_n[\tau_n]\right)\colon\tau_1,\tau_2,\ldots\in\mathcal T\right\}.

This supremum is attained, as follows. Choose sequences {\tau_{mn}\in\mathcal T} with {\mu(\bigcup_n[\tau_{mn}])} approaching {\alpha} as m goes to infinity. It follows that {\mu(\bigcup_{m,n}[\tau_{mn}])=\alpha}, as this is at least {\mu(\bigcup_n[\tau_{mn}])} for each m and, being over a countable subcollection of {\mathcal T}, is at most {\alpha}. Rearranging the doubly indexed sequence {\tau_{mn}} into a singly indexed one, {\tau_{m_1n_1},\tau_{m_2n_2},\ldots}, we can write {\tau_r=\tau_{m_rn_r}}. Then, {\tau_r} is in {\mathcal T} and,

\displaystyle  \mu\left(\bigcup\nolimits_r[\tau_r]\right)=\alpha.

Writing {T=\bigcup_r[\tau_r]}, it remains to be shown that {S\setminus T} is inaccessible by {\mathcal T}. So, consider any {\tau\in\mathcal T}. By construction, we have {\mu(T\cup[\tau])=\alpha=\mu(T)} and, hence,

\displaystyle  {\mathbb P}(\sigma_n\in[\tau]\setminus T)={\mathbb P}(\sigma_n\in T\cup[\tau])-{\mathbb P}(\sigma_n\in T)=0

for all n. Here, the final equality holds because each difference {{\mathbb P}(\sigma_n\in T\cup[\tau])-{\mathbb P}(\sigma_n\in T)} is nonnegative and, weighted by {2^{-n}} and summed over n, these give {\mu(T\cup[\tau])-\mu(T)=0}. Therefore,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb P}(\tau\in S\setminus T)&\displaystyle\le\sum_{n=1}^\infty{\mathbb P}(\tau\in[\sigma_n]\setminus T)\smallskip\\ &\displaystyle=\sum_{n=1}^\infty {\mathbb P}(\sigma_n\in[\tau]\setminus T)=0 \end{array}

as required. ⬜

We now compute the optional projection of a raw thin process. Note that the existence of the sequence {\tau_n} of stopping times is guaranteed by lemma 10.

Lemma 11 Let X be a raw thin process and {\tau_n} be a sequence of stopping times such that {\{X\not=0\}\setminus\bigcup_n[\tau_n]} is inaccessible by stopping times.

The optional projection of X exists if and only if {{\mathbb E}[\lvert X_{\tau_n}\rvert\,\vert\mathcal F_{\tau_n}]} is almost surely finite for each n. In that case, {{}^{\rm o}\!X} is the unique process with

  • {\left\{{}^{\rm o}\!X\not=0\right\}\subseteq\bigcup_{n=1}^\infty[\tau_n]}.
  • {{}^{\rm o}\!X_{\tau_n}={\mathbb E}[X_{\tau_n}\,\vert\mathcal F_{\tau_n}]} almost surely, for each n.

Proof: First, from the definition, the condition that {{\mathbb E}[\lvert X_{\tau_n}\rvert\,\vert\mathcal F_{\tau_n}]} is almost surely finite is necessary for the optional projection to exist. We show that it is also sufficient. Consider any stopping time {\tau}. Using the fact that {\{\tau=\tau_n\}} is in both {\mathcal F_\tau} and {\mathcal F_{\tau_n}}, and that the sigma-algebras agree on this set,

\displaystyle  1_{\{\tau=\tau_n\}}{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal F_\tau]= 1_{\{\tau=\tau_n\}}{\mathbb E}[\lvert X_{\tau_n}\rvert\,\vert\mathcal F_{\tau_n}] < \infty

almost surely. Also, letting {S=\bigcup_n[\tau_n]}, the condition that {\{X\not=0\}\setminus S} is inaccessible by stopping times gives

\displaystyle  1_{\{\tau\not\in S\}}{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal F_\tau]= {\mathbb E}[1_{\{\tau\not\in S\}}\lvert X_\tau\rvert\,\vert\mathcal F_\tau]=0

almost surely. As {\tau\in S} exactly when {\tau=\tau_n} for some n, the two cases combine to show that {{\mathbb E}[\lvert X_\tau\rvert\,\vert\mathcal F_\tau]} is almost surely finite, so the optional projection of X exists.

Now, for any stopping time {\tau}, the same argument as above, applied to X in place of {\lvert X\rvert}, gives

\displaystyle  1_{\{\tau\not\in S\}}{}^{\rm o}\!X_\tau=1_{\{\tau\not\in S\}}{\mathbb E}[X_\tau\,\vert\mathcal F_\tau]=0

so, by optional section, {1_{S^c}{}^{\rm o}\!X=0} or, equivalently, {\{{}^{\rm o}\!X\not=0\}\subseteq S}. The second property stated for {{}^{\rm o}\!X} is immediate from the definition of the optional projection.

Finally, it is clear that the two stated properties uniquely determine {{}^{\rm o}\!X} (up to evanescence): the first defines {{}^{\rm o}\!X} outside of S and the second defines it on S. ⬜

Next, we can compute the predictable projection of a raw thin process in a similar fashion to the optional projection in lemma 11. Again, the existence of the sequence {\tau_n} of predictable stopping times is guaranteed by lemma 10.

Lemma 12 Let X be a raw thin process and {\tau_n} be a sequence of predictable stopping times such that {\{X\not=0\}\setminus\bigcup_n[\tau_n]} is inaccessible by predictable stopping times.

The predictable projection of X exists if and only if {{\mathbb E}[\lvert X_{\tau_n}\rvert\,\vert\mathcal F_{\tau_n-}]} is almost surely finite for each n. In that case, {{}^{\rm p}\!X} is the unique process with

  • {\left\{{}^{\rm p}\!X\not=0\right\}\subseteq\bigcup_{n=1}^\infty[\tau_n]}.
  • {{}^{\rm p}\!X_{\tau_n}={\mathbb E}[X_{\tau_n}\,\vert\mathcal F_{\tau_n-}]} almost surely, for each n.

Proof: This follows using the same argument as for lemma 11, replacing `optional’ by `predictable’, `stopping time’ by `predictable stopping time’, and sigma-algebras {\mathcal F_\tau} by {\mathcal F_{\tau-}}. ⬜

Finally, theorem 8 is an immediate consequence of lemmas 11 and 12.
