Pathwise Properties of Optional and Predictable Projections

Recall that the optional and predictable projections of a process are defined, firstly, by a measurability property and, secondly, by their values at stopping times. Namely, the optional projection is measurable with respect to the optional sigma-algebra, and its value at each stopping time is given by a conditional expectation of the original process. Similarly, the predictable projection is measurable with respect to the predictable sigma-algebra, and its value at each predictable stopping time is given by a conditional expectation. While these definitions can be powerful, and many properties of the projections follow immediately, they say very little about the sample paths. Given a stochastic process X defined on a filtered probability space {(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})} with optional projection {{}^{\rm o}\!X} then, for each {\omega\in\Omega}, we may be interested in the sample path {t\mapsto{}^{\rm o}\!X_t(\omega)}. For example, is it continuous, right-continuous, cadlag, etc.? Answering these questions requires looking at {{}^{\rm o}\!X_t(\omega)} simultaneously at the uncountable set of times {t\in{\mathbb R}^+}, so the definition of the projection, which specifies its values only at each individual stopping time up to almost-sure equivalence, is not easy to work with directly. I did establish some of the basic properties of the projections in the previous post, but these do not say much regarding sample paths.

I will now establish the basic properties of the sample paths of the projections. Although these results are quite advanced, most of the work has already been done in these notes when we established some pathwise properties of optional and predictable processes in terms of their behaviour along sequences of stopping times, and of predictable stopping times. So, the proofs in this post are relatively simple and will consist of applications of these earlier results.

Before proceeding, let us consider what kind of properties it is reasonable to expect of the projections. Firstly, it does not seem reasonable to expect the optional projection {{}^{\rm o}\!X} or the predictable projection {{}^{\rm p}\!X} to satisfy properties not held by the original process X. Therefore, in this post, we will be concerned with the sample path properties which are preserved by the projections. Consider a process with constant paths. That is, {X_t=U} at all times t, for some bounded random variable U. Its sample paths are about as simple as possible, so any property preserved by the projections should hold for the optional and predictable projections of X. However, we know what the projections of this process are. Letting M be the martingale defined by {M_t={\mathbb E}[U\,\vert\mathcal F_t]} then, assuming that the underlying filtration is right-continuous, M has a cadlag modification and, furthermore, this modification is the optional projection of X. So, assuming that the filtration is right-continuous, the optional projection of X is cadlag, meaning that it is right-continuous and has left limits everywhere. So, we can hope that the optional projection preserves these properties. If the filtration is not right-continuous, then M need not have a cadlag modification, so we cannot expect the optional projection to preserve right-continuity in this case. However, M does still have a version with left and right limits everywhere, which is the optional projection of X. So, without assuming right-continuity of the filtration, we may still hope that the optional projection preserves the existence of left and right limits of a process. Next, the predictable projection is equal to the left limits, {{}^{\rm p}\!X_t=M_{t-}}, which is left-continuous with left and right limits everywhere. Therefore, we can hope that predictable projections preserve left-continuity and the existence of left and right limits.
The existence of cadlag martingales which are not continuous, such as the compensated Poisson process, implies that the optional projection does not generally preserve left-continuity and the predictable projection does not generally preserve right-continuity.
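To see this failure concretely, here is a short Python sketch (the rate, time horizon and sample sizes are arbitrary choices made purely for illustration) of the compensated Poisson process {M_t=N_t-t}: its sample paths jump by 1 at each arrival time, so they are cadlag but not left-continuous, even though {{\mathbb E}[M_t]=0} at every time.

```python
import random

def arrival_times(rate, horizon, rng):
    """Arrival times of a Poisson process with the given rate on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

def M(times, rate, t):
    """Compensated Poisson process M_t = N_t - rate*t evaluated at time t."""
    return sum(1 for s in times if s <= t) - rate * t

# Monte Carlo check of the martingale mean: E[M_1] = 0
rng = random.Random(0)
n = 20000
est = sum(M(arrival_times(1.0, 1.0, rng), 1.0, 1.0) for _ in range(n)) / n

# A sample path jumps by 1 at its first arrival, so it is not left-continuous
path, seed = [], 0
while not path:
    path = arrival_times(1.0, 10.0, random.Random(seed))
    seed += 1
jump = M(path, 1.0, path[0]) - M(path, 1.0, path[0] - 1e-9)

print(est, jump)  # est near 0; jump near 1
```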

Recall that I previously constructed a version of the optional projection and of the predictable projection for processes which are, respectively, right-continuous and left-continuous. This was done by defining the projection at each deterministic time and, then, enforcing the respective properties of the sample paths. We could use the results of those posts to infer that the projections do indeed preserve these properties, although I will now give more direct proofs in greater generality, using the more general definition of the optional and predictable projections.

We work with respect to a complete filtered probability space {(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})}. As usual, we say that the sample paths of a process satisfy any stated property if they satisfy it up to evanescence. Since integrability conditions will be required, I mention those now. Recall that a process X is of class (D) if the set of random variables {X_\tau}, over stopping times {\tau}, is uniformly integrable. It will be said to be locally of class (D) if there is a sequence {\tau_n} of stopping times increasing to infinity and such that {1_{\{\tau_n > 0\}}1_{[0,\tau_n]}X} is of class (D) for each n. Similarly, it will be said to be prelocally of class (D) if there is a sequence {\tau_n} of stopping times increasing to infinity and such that {1_{[0,\tau_n)}X} is of class (D) for each n.

Theorem 1 Let X be prelocally of class (D), with optional projection {{}^{\rm o}\!X}. Then,

  • if X has left limits, so does {{}^{\rm o}\!X}.
  • if X has right limits, so does {{}^{\rm o}\!X}.

Furthermore, if the underlying filtration is right-continuous then,

  • if X is right-continuous, so is {{}^{\rm o}\!X}.
  • if X is cadlag, so is {{}^{\rm o}\!X}.

Proof: First, as X is prelocally of class (D), there exist stopping times {\tau_n} increasing to infinity such that {X^n\equiv1_{[0,\tau_n)}X} is of class (D) for each n. As {[0,\tau_n)} is optional, the optional projection satisfies,

\displaystyle  1_{[0,\tau_n)}{}^{\rm o}\!X={}^{\rm o}\!X^n.

As {X^n} satisfies any of the conditions considered in the theorem (left limits, right limits, right-continuity) whenever X does, it is enough to prove the result for {X^n}. That is, we may suppose that X is of class (D).

Using theorem 3 of the post on pathwise regularity of optional and predictable processes, we can now run through the statements of the theorem.

Let {\tau_n} be a uniformly bounded and decreasing sequence of stopping times. If X has right limits then {X_{\tau_n}} converges and, under the class (D) assumption, is uniformly integrable. So,

\displaystyle  {\mathbb E}[{}^{\rm o}\!X_{\tau_n}]={\mathbb E}[X_{\tau_n}]\rightarrow{\mathbb E}[\lim\nolimits_nX_{\tau_n}]. (1)

The first equality is straight from the definition of the optional projection. So, {{\mathbb E}[{}^{\rm o}\!X_{\tau_n}]} converges to a limit and, as {{}^{\rm o}\!X} is optional, this implies that it has right limits everywhere. Similarly, if X has left limits, then we can consider a uniformly bounded increasing sequence of stopping times, {\tau_n}. As above, the limit (1) holds, and we conclude that {{}^{\rm o}\!X} has left limits.

We now suppose that X is right-continuous, and consider a uniformly bounded sequence of stopping times {\tau_n} decreasing to a limit {\tau}. If the underlying filtration is right-continuous, {\tau} will also be a stopping time. So, using the fact that {X_{\tau_n}} is uniformly integrable and converges to {X_\tau},

\displaystyle  {\mathbb E}[{}^{\rm o}\!X_{\tau_n}]={\mathbb E}[X_{\tau_n}]\rightarrow{\mathbb E}[X_\tau]={\mathbb E}[{}^{\rm o}\!X_\tau].

As {{}^{\rm o}\!X} is optional, we conclude that it is right-continuous.

Finally, if the filtration is right-continuous and X is cadlag, then {{}^{\rm o}\!X} will be cadlag, by combining the results above for right-continuity and the existence of left limits. ⬜

The predictable projection behaves similarly, except that instead of right-continuity, it preserves left-continuity.

Theorem 2 Let X be locally of class (D), with predictable projection {{}^{\rm p}\!X}. Then,

  • if X has left limits, so does {{}^{\rm p}\!X}.
  • if X has right limits, so does {{}^{\rm p}\!X}.
  • if X is left-continuous, so is {{}^{\rm p}\!X}.
  • if X is caglad, so is {{}^{\rm p}\!X}.

Proof: The proof follows similar lines as that given above for the optional projection. As X is locally of class (D), there exists a sequence of stopping times {\tau_n} increasing to infinity, and such that {X^n\equiv1_{\{\tau_n > 0\}}1_{[0,\tau_n]}X} is of class (D). As {1_{\{\tau_n > 0\}}1_{[0,\tau_n]}} is predictable,

\displaystyle  1_{\{\tau_n > 0\}}1_{[0,\tau_n]}{}^{\rm p}\!X={}^{\rm p}\!X^n.

As {X^n} satisfies any of the properties of the theorem (left limits, right limits, left-continuity) whenever X does, we just need to prove the result for each {X^n}. That is, we may suppose that X is of class (D).

Using theorem 4 from the post on regularity of the paths of optional and predictable processes, we run through the statements of the theorem.

Let {\tau_n} be a uniformly bounded and decreasing sequence of predictable stopping times. If X has right limits then the sequence {X_{\tau_n}} converges and, using the class (D) property, is uniformly integrable. So, from the definition of the predictable projection,

\displaystyle  {\mathbb E}[{}^{\rm p}\!X_{\tau_n}]={\mathbb E}[X_{\tau_n}]\rightarrow{\mathbb E}[\lim\nolimits_nX_{\tau_n}]. (2)

So, {{\mathbb E}[{}^{\rm p}\!X_{\tau_n}]} converges and, as {{}^{\rm p}\!X} is predictable, this implies that {{}^{\rm p}\!X} has right limits. Similarly, if X has left limits, then we can let {\tau_n} be a uniformly bounded increasing sequence of predictable stopping times. As above, the limit (2) holds, and we conclude that {{}^{\rm p}\!X} has left limits.

Now suppose that X is left-continuous and let {\tau_n} be a uniformly bounded sequence of predictable stopping times increasing to a limit {\tau}. Then, {\tau} is also a predictable stopping time. Using the fact that {X_{\tau_n}} converges to {X_\tau} and, by the class (D) property, is uniformly integrable, the definition of the predictable projection gives,

\displaystyle  {\mathbb E}[{}^{\rm p}\!X_{\tau_n}]={\mathbb E}[X_{\tau_n}]\rightarrow{\mathbb E}[X_\tau]={\mathbb E}[{}^{\rm p}\!X_\tau].

As {{}^{\rm p}\!X} is predictable, we conclude that it is left-continuous.

Finally, recall that a process X is caglad if it is left-continuous with right limits. The above results, applied separately to left-continuity and to right limits, imply that {{}^{\rm p}\!X} is caglad. ⬜

Necessity of the Integrability Conditions

Above, we imposed integrability conditions before showing that the optional and predictable projections preserve various pathwise properties. Specifically, for the optional projection, the process was required to be prelocally of class (D) and, for the predictable projection, it was required to be locally of class (D). By means of a simple example, I will now demonstrate that the results are not true if these conditions are dropped. I construct a continuous process whose optional and predictable projections exist, but do not have left and right limits everywhere (so, in particular, are neither right-continuous nor left-continuous).

In the following example, we will let the underlying filtration be trivial, so that {\mathcal F_t} consists of the events of probability 0 or 1. In that case, every {\mathcal F_\cdot}-stopping time will be almost-surely constant. It then follows that the predictable and optional projection of a process X exist if and only if {X_t} is integrable at each time, and the projections are equal to the deterministic process

\displaystyle  {}^{\rm o}\!X_t={}^{\rm p}\!X_t = {\mathbb E}[X_t].
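As a quick sanity check of this identity, the following Python sketch uses a hypothetical integrable example, {X_t=tU} with U uniformly distributed on the unit interval, for which both projections should be the deterministic function {t\mapsto t/2}:

```python
import random

rng = random.Random(1)
n = 100000
t = 0.7

# Under the trivial filtration, ^oX_t = ^pX_t = E[X_t]. For X_t = t*U with
# U uniform on (0,1), this deterministic value is t/2 = 0.35 here.
estimate = sum(t * rng.random() for _ in range(n)) / n
print(estimate)  # Monte Carlo estimate of E[X_t], close to 0.35
```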

Now, suppose that there is a nonnegative random variable, U, defined on the probability space, with infinite expectation. I will consider the case where {{\mathbb P}(U\ge x)=1/x} for all {x\ge1}, which can be constructed by taking {U=1/V} for V uniformly distributed on the unit interval. Then, for all {K\ge1},

\displaystyle  {\mathbb E}[K\wedge U]=\int_0^K{\mathbb P}(U > x)\,dx=1+\int_1^Kx^{-1}\,dx=1+\log K.

Starting with a continuous function {\varphi\colon{\mathbb R}_{>0}\rightarrow{\mathbb R}^+} satisfying {\varphi(s)\ge s}, define a process

\displaystyle  X_t = \min\left(\varphi(\lvert 1-t\rvert),\lvert 1-t\rvert U\right).

This is continuous with {X_1=0} and, for all {s > 0},

\displaystyle  {}^{\rm o}\!X_{1\pm s}={}^{\rm p}\!X_{1\pm s}={\mathbb E}\left[\min(\varphi(s),sU)\right]=s\left(1+\log(\varphi(s)/s)\right).

For example, taking {\varphi(s)=s\exp(s^{-2})}, we see that {{}^{\rm o}\!X_{1\pm s}} and {{}^{\rm p}\!X_{1\pm s}} diverge to infinity as s goes to zero. This would contradict theorems 1 and 2 above, if the integrability conditions were dropped.
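The distributional facts used in this example can also be checked numerically. The following Python sketch (the sample size and the value K = 10 are arbitrary choices) estimates {{\mathbb E}[K\wedge U]} by Monte Carlo for {U=1/V}, and evaluates the projection at times {1\pm s} for {\varphi(s)=s\exp(s^{-2})}, which blows up as s decreases to zero:

```python
import math
import random

rng = random.Random(42)
n = 200000

# U = 1/V with V uniform on (0,1), so P(U >= x) = 1/x for x >= 1
U = [1.0 / rng.random() for _ in range(n)]

# Monte Carlo estimate of E[K ∧ U], to compare with the exact value 1 + log K
K = 10.0
mc = sum(min(K, u) for u in U) / n
exact = 1.0 + math.log(K)

def projection(s):
    # E[min(phi(s), s*U)] = s*(1 + log(phi(s)/s)); with phi(s) = s*exp(s^{-2})
    # this is s*(1 + s^{-2}) = s + 1/s, which diverges as s -> 0
    return s * (1.0 + s ** -2)

print(mc, exact)                                  # Monte Carlo estimate vs exact value
print([projection(s) for s in (0.5, 0.2, 0.1)])   # blow-up as s shrinks
```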
