In filtering theory, we have a filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},\mathbb{P})$ and a signal process $X$. The sigma-algebra $\mathcal{F}_t$ represents the collection of events which are observable up to and including time $t$. The process $X$ is not assumed to be adapted, so need not be directly observable. For example, we may only be able to measure an observation process $Z_t=X_t+\epsilon_t$, which incorporates some noise $\epsilon_t$, and generates the filtration $\mathcal{F}_t=\sigma(Z_s\colon s\le t)$, so $Z$ is adapted. The problem, then, is to compute an estimate for $X_t$ based on the observable data at time $t$. Looking at the expected value of $X_t$ conditional on the observable data, we obtain the following estimate for $X$ at each time $t\ge0$,

$\displaystyle Y_t=\mathbb{E}[X_t\mid\mathcal{F}_t].$ (1)
The process $Y$ is adapted. However, as (1) only defines $Y$ up to a zero probability set, it does not give us the paths of $Y$, which would require specifying its values simultaneously at the uncountable set of times in $\mathbb{R}^+$. Consequently, (1) does not tell us the distribution of $Y$ at random times, so it is necessary to specify a good version of $Y$.
Optional projection gives a uniquely defined process which satisfies (1), not just at every fixed time $t$ in $\mathbb{R}^+$, but also at all stopping times. The full theory of optional projection for jointly measurable processes requires the optional section theorem. As I will demonstrate, in the case where $X$ is right-continuous, optional projection can be done by more elementary methods.
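Before developing the theory, equation (1) can be sanity-checked numerically in a toy filtering model (my own illustration, not from the text): take the signal to be a Brownian motion $W$ and the observation process $Z=W+B$ for an independent Brownian motion $B$. As $W+B$ and $W-B$ are independent, the conditional expectation of $W_t$ given the observations is $Z_t/2$, so the residual $W_t-Z_t/2$ should be uncorrelated with every observable statistic. A minimal Python sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20_000, 50, 1.0
dt = T / n_steps

# Simulate independent Brownian motions: W is the signal, B is the noise.
dW = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
dB = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
W = dW.cumsum(axis=1)
Z = W + dB.cumsum(axis=1)  # observation process Z = W + B

# Candidate optional projection of W onto the filtration of Z: Z/2.
resid = W[:, -1] - Z[:, -1] / 2

# If Z/2 is the conditional expectation, the residual at t = 1 is
# orthogonal to observable quantities, e.g. Z at earlier times.
print(abs(np.mean(resid * Z[:, -1])))          # ≈ 0
print(abs(np.mean(resid * Z[:, n_steps // 2])))  # ≈ 0
```

Here the closed form $Z_t/2$ is special to this symmetric Gaussian model; in general the conditional expectation has no such simple expression, which is why a good version of the process $Y$ matters.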
Throughout this post, it will be assumed that the underlying filtered probability space satisfies the usual conditions, meaning that it is complete and right-continuous, $\mathcal{F}_t=\mathcal{F}_{t+}$. Stochastic processes are considered to be defined up to evanescence. That is, two processes are considered to be the same if they are equal up to evanescence. In order to apply (1), some integrability requirements need to be imposed on $X$. Often, to avoid such issues, optional projection is defined for uniformly bounded processes. For a bit more generality, I will relax this requirement and use prelocal integrability. Recall that, in these notes, a process $X$ is prelocally integrable if there exists a sequence of stopping times $\tau_n$ increasing to infinity and such that

$\displaystyle 1_{\{\tau_n>0\}}\sup_{t<\tau_n}\lvert X_t\rvert$ (2)

is integrable. This is a strong enough condition for the conditional expectation (1) to exist, not just at each fixed time, but also whenever $t$ is a stopping time. The main result of this post can now be stated.
Theorem 1 (Optional Projection) Let X be a right-continuous and prelocally integrable process. Then, there exists a unique right-continuous process Y satisfying (1).
Uniqueness is immediate, as (1) determines Y, almost-surely, at each fixed time, and this is enough to uniquely determine right-continuous processes up to evanescence. Existence of Y is the important part of the statement, and the proof will be left until further down in this post.
The process defined by Theorem 1 is called the optional projection of $X$, and is denoted by ${}^{\mathrm{o}}X$. That is, ${}^{\mathrm{o}}X$ is the unique right-continuous process satisfying

$\displaystyle {}^{\mathrm{o}}X_t=\mathbb{E}[X_t\mid\mathcal{F}_t]$ (3)

for all times $t$. In practice, the process $X$ will usually not just be right-continuous, but will also have left limits everywhere. That is, it is cadlag.
Theorem 2 Let X be a cadlag and prelocally integrable process. Then, its optional projection is cadlag.
A simple example of optional projection is where $X_t$ is constant in $t$ and equal to an integrable random variable $U$. Then, ${}^{\mathrm{o}}X$ is the cadlag version of the martingale $\mathbb{E}[U\mid\mathcal{F}_t]$.
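This example can be made concrete under assumed Gaussian distributions (an illustration of mine, not part of the text): with $U$ standard normal and the filtration generated by noisy observations $Z_k=U+\epsilon_k$, the conditional expectation $M_n=\mathbb{E}[U\mid Z_1,\ldots,Z_n]$ has the closed form $(Z_1+\cdots+Z_n)/(\sigma^2+n)$, and its martingale property can be checked numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_obs, sigma2 = 100_000, 10, 2.0

U = rng.normal(0.0, 1.0, n_paths)  # the integrable random variable U
Z = U[:, None] + rng.normal(0.0, np.sqrt(sigma2), (n_paths, n_obs))

# M_n = E[U | Z_1, ..., Z_n] for a N(0,1) prior and N(0, sigma2) noise:
# the posterior mean after n observations is (Z_1 + ... + Z_n)/(sigma2 + n).
n = np.arange(1, n_obs + 1)
M = Z.cumsum(axis=1) / (sigma2 + n)

# Martingale check: increments M_{n+1} - M_n have mean zero and are
# uncorrelated with the current value M_n.
inc = M[:, 1:] - M[:, :-1]
print(abs(inc.mean()))                    # ≈ 0
print(abs(np.mean(inc[:, 3] * M[:, 3])))  # ≈ 0
```

The optional projection of the constant process $X_t=U$ is the cadlag version of this conditional-expectation martingale, here observed along discrete times.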
The proof of Theorem 2 will also be left until later. Existence of the optional projection for cadlag processes is simpler than for right-continuous processes, and I will prove the existence of the optional projection separately for cadlag and right-continuous processes below. For now, we show the basic properties of the optional projection assuming that it exists. Linearity of the projection is straightforward.
Lemma 3 Let $X$ and $Y$ be right-continuous and prelocally integrable processes. Then, for $\mathcal{F}_0$-measurable random variables $\lambda$, $\mu$,

$\displaystyle {}^{\mathrm{o}}(\lambda X+\mu Y)=\lambda\,{}^{\mathrm{o}}X+\mu\,{}^{\mathrm{o}}Y.$
Proof: It is clear that $\lambda X+\mu Y$ is right-continuous and prelocally integrable. Setting $Z=\lambda\,{}^{\mathrm{o}}X+\mu\,{}^{\mathrm{o}}Y$ we have,

$\displaystyle Z_t=\lambda\,\mathbb{E}[X_t\mid\mathcal{F}_t]+\mu\,\mathbb{E}[Y_t\mid\mathcal{F}_t]=\mathbb{E}[\lambda X_t+\mu Y_t\mid\mathcal{F}_t]$

almost surely. So, by definition, $Z$ is the optional projection of $\lambda X+\mu Y$. ⬜
It is also easy to show that optional projection commutes with multiplication by an adapted process.
Lemma 4 Let $U$ and $X$ be right-continuous processes such that $U$ is adapted and $X$ and $UX$ are prelocally integrable. Then,

$\displaystyle {}^{\mathrm{o}}(UX)=U\,{}^{\mathrm{o}}X.$ (4)
Proof: The process $Z=U\,{}^{\mathrm{o}}X$ is right-continuous and, as $U$ is adapted,

$\displaystyle Z_t=U_t\,\mathbb{E}[X_t\mid\mathcal{F}_t]=\mathbb{E}[U_tX_t\mid\mathcal{F}_t]$

almost surely. So, by definition, $Z$ is the optional projection of $UX$. ⬜
One of the most important properties of the optional projection is the fact that equation (3) still holds when t is generalized to be any stopping time.
Theorem 5 If $X$ is a right-continuous and prelocally integrable process then, for all stopping times $\tau$,

$\displaystyle 1_{\{\tau<\infty\}}\,{}^{\mathrm{o}}X_\tau=\mathbb{E}\left[1_{\{\tau<\infty\}}X_\tau\,\middle\vert\,\mathcal{F}_\tau\right].$ (5)
Proof: In order to avoid having to keep multiplying by the term $1_{\{\tau<\infty\}}$ in expressions such as (5), I will take all processes to be $0$ at infinity. Suppose, first, that $\sup_t\lvert X_t\rvert$ is integrable. Let $U$ be a bounded $\mathcal{F}_\tau$-measurable random variable. If $\tau$ is a simple stopping time taking values in a finite set $S$ then, using the fact that $1_{\{\tau=t\}}U$ is $\mathcal{F}_t$-measurable for all times $t$,

$\displaystyle \mathbb{E}\left[U\,{}^{\mathrm{o}}X_\tau\right]=\sum_{t\in S}\mathbb{E}\left[1_{\{\tau=t\}}U\,{}^{\mathrm{o}}X_t\right]=\sum_{t\in S}\mathbb{E}\left[1_{\{\tau=t\}}UX_t\right]=\mathbb{E}[UX_\tau].$

Furthermore, the bound $\lvert{}^{\mathrm{o}}X_t\rvert\le\mathbb{E}[\sup_s\lvert X_s\rvert\mid\mathcal{F}_t]$ shows that the collection of random variables ${}^{\mathrm{o}}X_\sigma$ over simple stopping times $\sigma$ is uniformly integrable.
Next, let $\tau$ be an arbitrary stopping time and $\tau_n$ be a sequence of simple stopping times decreasing to $\tau$. Then, using right-continuity of $X$ and ${}^{\mathrm{o}}X$ and the fact that $U$ is $\mathcal{F}_{\tau_n}$-measurable,

$\displaystyle \mathbb{E}\left[U\,{}^{\mathrm{o}}X_\tau\right]=\lim_{n\to\infty}\mathbb{E}\left[U\,{}^{\mathrm{o}}X_{\tau_n}\right]=\lim_{n\to\infty}\mathbb{E}\left[UX_{\tau_n}\right]=\mathbb{E}[UX_\tau].$

This proves (5) in the case where $\sup_t\lvert X_t\rvert$ is integrable. Now suppose that $X$ is prelocally integrable, and let $\tau_n$ be a sequence of stopping times increasing to infinity such that (2) is integrable. The process $1_{[0,\tau_n)}X$ is right-continuous with integrable supremum and, by Lemma 4, has optional projection $1_{[0,\tau_n)}\,{}^{\mathrm{o}}X$, and the above shows that

$\displaystyle 1_{\{\tau<\tau_n\}}\,{}^{\mathrm{o}}X_\tau=\mathbb{E}\left[1_{\{\tau<\tau_n\}}X_\tau\,\middle\vert\,\mathcal{F}_\tau\right]$

(almost surely). Letting $n$ increase to infinity gives the result. ⬜
Finally, we show that optional projection is indeed a projection operator. That is, applying it twice gives the same result as applying it once.
Lemma 6 If $X$ is a right-continuous and prelocally integrable process, then so is its optional projection and,

$\displaystyle {}^{\mathrm{o}}({}^{\mathrm{o}}X)={}^{\mathrm{o}}X.$
Proof: The fact that ${}^{\mathrm{o}}X$ is adapted and right-continuous means that it is equal to its own optional projection, just so long as we can show that it satisfies the requirement of prelocal integrability. Supposing first that $\sup_t\lvert X_t\rvert$ is integrable, we have the bound,

$\displaystyle \lvert{}^{\mathrm{o}}X_t\rvert\le\mathbb{E}\left[\sup_s\lvert X_s\rvert\,\middle\vert\,\mathcal{F}_t\right].$

The right hand side is a martingale so, taking a cadlag version, is locally integrable, and hence ${}^{\mathrm{o}}X$ is prelocally integrable. Now, for arbitrary prelocally integrable $X$, let $\tau_n$ be a sequence of stopping times increasing to infinity such that (2) is integrable. Then, using Lemma 4,

$\displaystyle 1_{[0,\tau_n)}\,{}^{\mathrm{o}}X={}^{\mathrm{o}}\left(1_{[0,\tau_n)}X\right)$

is prelocally integrable for each $n$. So, ${}^{\mathrm{o}}X$ is prelocally integrable as required. ⬜
Optional Projection of Increasing Processes
Existence of the optional projection for increasing processes is relatively simple, and follows from the existence of cadlag modifications of submartingales.
Lemma 7 Let X be an integrable, right-continuous and increasing process. Then, its optional projection exists and is a cadlag submartingale.
Proof: As $X$ is integrable, the process $Y$ can be defined at each time $t$ by (1). For times $s\le t$, the fact that $X$ is increasing gives

$\displaystyle \mathbb{E}[Y_t\mid\mathcal{F}_s]=\mathbb{E}[X_t\mid\mathcal{F}_s]\ge\mathbb{E}[X_s\mid\mathcal{F}_s]=Y_s$

almost surely. So, $Y$ is a submartingale. If $t_n$ is a sequence of times decreasing to $t$ then, by dominated convergence,

$\displaystyle \mathbb{E}[Y_{t_n}]=\mathbb{E}[X_{t_n}]\rightarrow\mathbb{E}[X_t]=\mathbb{E}[Y_t]$

as $n$ goes to infinity. So $t\mapsto\mathbb{E}[Y_t]$ is right-continuous and, therefore, $Y$ has a cadlag modification which, by definition, is the optional projection of $X$. ⬜
Optional Projection of Integrable Variation Processes
Now, suppose that $X$ is right-continuous with finite variation over each bounded interval. By the Jordan decomposition, $X-X_0$ can be represented as the difference of increasing and right-continuous processes $X^+$, $X^-$, starting from 0,

$\displaystyle X=X_0+X^+-X^-,$

and the sum of $X^+$ and $X^-$ is the variation of $X$,

$\displaystyle X^+_t+X^-_t=\int_0^t\lvert dX_s\rvert.$ (6)

It can be seen that $X^+$ and $X^-$ are measurable, by writing

$\displaystyle X^\pm_t=\lim_{n\to\infty}\sum_{k=1}^{\infty}\left(X_{t^n_k\wedge t}-X_{t^n_{k-1}\wedge t}\right)^\pm,$

where the limit is taken over a sequence of partitions $0=t^n_0\le t^n_1\le\cdots$ increasing to infinity and with mesh going to zero. This decomposition gives a quick method of extending Lemma 7 to processes with integrable variation.
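For intuition, the Jordan decomposition is easy to carry out for a discretely sampled path: accumulate the positive and negative parts of the increments separately. The sketch below is a discrete illustration of mine, not the continuous-time construction itself.

```python
import numpy as np

# A sampled finite variation path.
X = np.array([0.0, 1.0, 0.5, 2.0, 1.5, 1.5, 3.0])

dX = np.diff(X)
X_plus = np.concatenate([[0.0], np.maximum(dX, 0.0).cumsum()])    # increasing part
X_minus = np.concatenate([[0.0], np.maximum(-dX, 0.0).cumsum()])  # decreasing part

# Jordan decomposition: X = X_0 + X^+ - X^-.
assert np.allclose(X, X[0] + X_plus - X_minus)

# The variation of X over [0, t] is X^+_t + X^-_t.
variation = X_plus + X_minus
print(variation[-1])  # total variation: 1 + 0.5 + 1.5 + 0.5 + 0 + 1.5 = 5.0
```

Both accumulated parts are increasing and start from zero, mirroring the processes $X^\pm$ above.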
Lemma 8 Let X be a right-continuous integrable process with integrable variation over all finite time periods. Then, its optional projection exists and is cadlag.
Proof: As $X$ has integrable variation over finite time periods, (6) shows that its increasing and decreasing components $X^+$ and $X^-$ are integrable. So, applying Lemma 7, the optional projections of $X^+$ and $X^-$ exist and are cadlag submartingales. The optional projection of $X$ can then be constructed as

$\displaystyle {}^{\mathrm{o}}X=M+{}^{\mathrm{o}}X^+-{}^{\mathrm{o}}X^-,$

where $M$ is a cadlag version of the martingale $M_t=\mathbb{E}[X_0\mid\mathcal{F}_t]$. ⬜
This proof tells us more than just that exists and is cadlag. It expresses as the difference of submartingales. Although I will not make use of it here, this additional information is incorporated by the following lemma.
Optional Projection of Cadlag Processes
I will construct the optional projection of cadlag processes by taking limits of integrable variation processes, applying the results above. The approximation by processes of integrable variation is given by the following lemma.
Lemma 10 Let $X$ be a cadlag process such that $\sup_t\lvert X_t\rvert$ is integrable. Then, there exists a sequence $X^n$ of cadlag processes of integrable variation such that, for each $t\ge0$,

$\displaystyle \mathbb{E}\left[\sup_{s\le t}\lvert X^n_s-X_s\rvert\right]\rightarrow0$ (7)

as $n$ goes to infinity.
Proof: For any fixed $t\ge0$ and $\epsilon>0$, it is enough to construct an integrable variation process $Y$ such that

$\displaystyle \mathbb{E}\left[\sup_{s\le t}\lvert Y_s-X_s\rvert\right]\le\epsilon.$ (8)

Applying this to a sequence of times increasing to infinity and values of $\epsilon$ decreasing to zero will then give the required processes $X^n$. Choosing $0<\delta<\epsilon$, define the sequence of random times, $T_0=0$ and

$\displaystyle T_{k+1}=\inf\left\{s>T_k\colon\lvert X_s-X_{T_k}\rvert\ge\delta\right\}.$

As $X$ is not adapted, these will not be stopping times. However, the debut theorem for right-continuous processes, applied with respect to the constant filtration $\mathcal{G}_s=\mathcal{F}$, shows that the $T_k$ are $\mathcal{F}$-measurable. Furthermore, the sequence $T_k$ increases to infinity. Otherwise, as $X$ has left limits everywhere, the sequence $X_{T_k}$ would converge to a finite limit, contradicting the inequality $\lvert X_{T_{k+1}}-X_{T_k}\rvert\ge\delta$.

Now, for each $n\ge1$, define the process

$\displaystyle Y^n_s=\sum_{k=0}^{n-1}1_{[T_k,T_{k+1})}(s)X_{T_k}+1_{[T_n,\infty)}(s)X_{T_n}.$

These are cadlag with variation

$\displaystyle \sum_{k=1}^{n}\lvert X_{T_k}-X_{T_{k-1}}\rvert\le2n\sup_s\lvert X_s\rvert,$

which is integrable. Also, by construction, we have $\lvert Y^n_s-X_s\rvert\le\delta$ for $s<T_n$ and $\lvert Y^n_s-X_s\rvert\le2\sup_u\lvert X_u\rvert$ otherwise. So, using dominated convergence together with the fact that $T_n$ increases to infinity,

$\displaystyle \limsup_{n\to\infty}\mathbb{E}\left[\sup_{s\le t}\lvert Y^n_s-X_s\rvert\right]\le\delta<\epsilon.$

Taking $Y=Y^n$ for large enough $n$ gives (8). ⬜
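The construction in this proof can be mimicked for a discretely sampled path: hold the value attained at the last "large move" time, updating only when the path has moved by at least $\delta$. The sketch below is my own illustration, with hypothetical names; it exhibits the uniform error bound $\delta$ and the reduced variation of the approximation.

```python
import numpy as np

def large_move_approximation(X, delta):
    """Piecewise constant approximation Y of the path X: Y holds the value
    X[T_k] attained at the last time the path moved by at least delta."""
    Y = np.empty_like(X)
    anchor = X[0]
    for i, x in enumerate(X):
        if abs(x - anchor) >= delta:
            anchor = x  # a time T_k at which |X - X[T_{k-1}]| >= delta
        Y[i] = anchor
    return Y

rng = np.random.default_rng(2)
X = rng.normal(0.0, 0.1, 500).cumsum()  # a sampled random path
delta = 0.25
Y = large_move_approximation(X, delta)

print(np.max(np.abs(Y - X)) < delta)                          # True: error below delta
print(np.abs(np.diff(Y)).sum() <= np.abs(np.diff(X)).sum())   # True: smaller variation
```

Each jump of `Y` is a single move of size at least `delta`, so `Y` has finitely many jumps on any compact interval, which is the discrete analogue of the integrable variation bound above.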
Now, the optional projection of a cadlag process can be constructed as the limit of optional projections of integrable variation processes. We start with the case where $\sup_t\lvert X_t\rvert$ is integrable.

Lemma 11 Let $X$ be a cadlag process such that $\sup_t\lvert X_t\rvert$ is integrable. Then the optional projection of $X$ exists and is cadlag.
Proof: Let $X^n$ be a sequence of cadlag integrable variation processes satisfying (7). Lemma 8 guarantees that the optional projections ${}^{\mathrm{o}}X^n$ exist and are cadlag. Looking at the difference of the optional projections at a time $s\le t$,

$\displaystyle \lvert{}^{\mathrm{o}}X^n_s-{}^{\mathrm{o}}X^m_s\rvert=\left\lvert\mathbb{E}[X^n_s-X^m_s\mid\mathcal{F}_s]\right\rvert\le\mathbb{E}\left[\sup_{u\le t}\lvert X^n_u-X^m_u\rvert\,\middle\vert\,\mathcal{F}_s\right].$

The right hand side is a martingale so Doob's maximal inequality can be applied for any $K>0$,

$\displaystyle \mathbb{P}\left(\sup_{s\le t}\lvert{}^{\mathrm{o}}X^n_s-{}^{\mathrm{o}}X^m_s\rvert\ge K\right)\le K^{-1}\mathbb{E}\left[\sup_{u\le t}\lvert X^n_u-X^m_u\rvert\right].$

By (7), the right hand side tends to zero as $m,n$ go to infinity. So, the sequence ${}^{\mathrm{o}}X^n$ is Cauchy under uniform convergence in probability on compacts, and converges to a cadlag adapted limit $Y$. For each fixed $s$, ${}^{\mathrm{o}}X^n_s=\mathbb{E}[X^n_s\mid\mathcal{F}_s]$ tends to $\mathbb{E}[X_s\mid\mathcal{F}_s]$ in $L^1$, so $Y_s=\mathbb{E}[X_s\mid\mathcal{F}_s]$ almost surely, and $Y$ is the optional projection of $X$. ⬜
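The maximal inequality used here is Doob's, in the form $\mathbb{P}(\sup_{s\le t}\lvert M_s\rvert\ge K)\le K^{-1}\mathbb{E}\lvert M_t\rvert$ for a martingale $M$. A quick numerical check of mine, for a simple random walk martingale:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, K = 100_000, 100, 15.0

# Simple symmetric random walk: a martingale started from zero.
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
M = steps.cumsum(axis=1)

running_max = np.max(np.abs(M), axis=1)
lhs = np.mean(running_max >= K)       # P(sup_n |M_n| >= K)
rhs = np.mean(np.abs(M[:, -1])) / K   # E|M_N| / K

print(lhs <= rhs)  # True: Doob's maximal inequality, empirically
```

In the proof above the inequality is applied to the martingale dominating the difference of projections, turning the expected supremum bound (7) into uniform convergence in probability.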
To complete the construction of the optional projection for cadlag processes, Lemma 11 needs to be extended to the case where X is only prelocally integrable. This generalization is done by the following.
Lemma 12 Let $X$ be a prelocally integrable right-continuous process, and suppose that $\tau_n$ are stopping times increasing to infinity such that the optional projections of $1_{[0,\tau_n)}X$ exist. Then, the optional projection of $X$ exists and

$\displaystyle 1_{[0,\tau_n)}\,{}^{\mathrm{o}}X={}^{\mathrm{o}}\left(1_{[0,\tau_n)}X\right).$ (9)
Proof: For $m\le n$, as $1_{[0,\tau_m)}1_{[0,\tau_n)}=1_{[0,\tau_m)}$, equation (4) gives

$\displaystyle 1_{[0,\tau_m)}\,{}^{\mathrm{o}}\left(1_{[0,\tau_n)}X\right)={}^{\mathrm{o}}\left(1_{[0,\tau_m)}X\right).$

So, the optional projections agree at times $t<\tau_m$, and we can define a process $Y$ by $Y_t={}^{\mathrm{o}}\left(1_{[0,\tau_n)}X\right)_t$ for $t<\tau_n$. So, $Y$ is right-continuous on each interval $[0,\tau_n)$ and, taking the limit as $n$ goes to infinity, $Y$ is right-continuous. Also,

$\displaystyle 1_{\{t<\tau_n\}}Y_t=1_{\{t<\tau_n\}}\mathbb{E}\left[1_{\{t<\tau_n\}}X_t\,\middle\vert\,\mathcal{F}_t\right]=1_{\{t<\tau_n\}}\mathbb{E}[X_t\mid\mathcal{F}_t]$

almost surely. Letting $n$ increase to infinity shows that $Y_t=\mathbb{E}[X_t\mid\mathcal{F}_t]$ almost surely, so $Y$ is the optional projection of $X$, and (9) holds by construction. ⬜
Finally, for prelocally integrable cadlag processes, we complete the construction of the optional projection, proving Theorem 2.
Lemma 13 Let X be a cadlag and prelocally integrable process. Then, its optional projection exists and is cadlag.
Proof: As $X$ is prelocally integrable, there exists a sequence of stopping times $\tau_n$ increasing to infinity such that (2) is integrable. Lemma 11 says that $1_{[0,\tau_n)}X$ has a cadlag optional projection. So, by Lemma 12, the optional projection of $X$ exists and satisfies (9). This shows that ${}^{\mathrm{o}}X$ is cadlag over $[0,\tau_n)$ and, taking the limit as $n$ goes to infinity, ${}^{\mathrm{o}}X$ is cadlag. ⬜
Optional Projection of Right-Continuous Processes
I now look at the optional projection of right-continuous processes. This is more difficult than the cadlag case, because it is not possible to find integrable variation processes approximating $X$ in the sense of Lemma 10. Indeed, the convergence in (7) implies uniform convergence in probability, and ucp limits of cadlag processes are necessarily cadlag. Instead, I will use sequences approximating $X$ pointwise from above and below. In the following, for any processes $X$ and $Y$, the inequality $X\ge Y$ means that $X$ is greater than $Y$ up to evanescence. That is, outside of a zero probability set, $X_t\ge Y_t$ for all $t$. Similarly, convergence of a sequence of processes $X^n$ to a limit $X$ is to be taken as pointwise convergence up to evanescence. So, outside of a zero probability set, $X^n_t\rightarrow X_t$ for all times $t$.
Lemma 14 Let $X$ be a right-continuous process such that $\sup_t\lvert X_t\rvert$ is integrable. Then, there exists a sequence, $X^n$, of cadlag integrable processes of integrable total variation such that $X^n$ is decreasing in $n$ and tends to $X$ as $n$ goes to infinity.
Proof: Choose a sequence of finite sets $S_n\subseteq\mathbb{R}^+$ such that $S_n\subseteq S_{n+1}$ for each $n$, and $\bigcup_nS_n$ is dense in $\mathbb{R}^+$. For any given $n$ we write $S_n$ as $t_1<t_2<\cdots<t_m$. Setting $t_0=0$ and $t_{m+1}=\infty$, define the process $X^n$ by

$\displaystyle X^n_t=\sup\left\{X_s\colon s\in[t,t_{k+1}]\right\}$

for all $t$ in $[t_k,t_{k+1})$. As $S_{n+1}$ is a refinement of $S_n$, for each $t$ the supremum in the definition of $X^{n+1}_t$ is taken over a subinterval of that used for $X^n_t$ and, hence, $X^{n+1}\le X^n$. Also, as $\bigcup_nS_n$ is dense in $\mathbb{R}^+$, right-continuity ensures that $X^n_t$ tends to $X_t$ as $n$ goes to infinity.

It is straightforward to see that $X^n$ is right-continuous and decreasing across each interval $[t_k,t_{k+1})$, so it is cadlag. Furthermore, $\lvert X^n\rvert$ is bounded by $\sup_s\lvert X_s\rvert$, so the variation of $X^n$ across each of the intervals $[t_k,t_{k+1})$ and at each of the times $t_k$ is bounded by $2\sup_s\lvert X_s\rvert$. Hence, the total variation of $X^n$ is at most

$\displaystyle 2(2m+1)\sup_s\lvert X_s\rvert,$

which is integrable. ⬜
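The discretized analogue of this construction is easy to visualize: over each interval of a grid, replace the path by its supremum up to the next grid point. Refining the grid gives approximations that dominate the path and decrease pointwise. A sketch of mine, for a sampled path with nested dyadic-style grids:

```python
import numpy as np

def sup_approximation(X, spacing):
    """X^n_i = sup of X over [i, next grid point], where the grid points
    are the indices divisible by `spacing`."""
    n = len(X)
    Y = np.empty_like(X)
    for i in range(n):
        j = min(((i // spacing) + 1) * spacing, n - 1)  # next grid point
        Y[i] = X[i:j + 1].max()
    return Y

rng = np.random.default_rng(4)
X = rng.normal(0.0, 1.0, 256).cumsum()  # a sampled path

A8 = sup_approximation(X, 8)  # coarse grid
A4 = sup_approximation(X, 4)  # refinement of the coarse grid
A2 = sup_approximation(X, 2)  # finer still

# Approximations dominate X and decrease as the grid is refined.
print(np.all(X <= A2) and np.all(A2 <= A4) and np.all(A4 <= A8))  # True
```

Since the grids are nested, each supremum for the finer grid is taken over a subinterval of the one for the coarser grid, exactly as in the monotonicity argument of the proof.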
I will make use of the approximation of $X$ by integrable variation processes given by Lemma 14. This is much weaker than the convergence stated in Lemma 10 for the cadlag case, so it makes the proof that their optional projections converge more difficult than the argument used in the proof of Lemma 11 above. I start by showing that it is enough to prove convergence, almost surely, at each stopping time.
Lemma 15 Let $X^n$ be a sequence of non-negative right-continuous adapted processes, and suppose that it is decreasing in $n$. If, for all stopping times $\tau$,

$\displaystyle 1_{\{\tau<\infty\}}X^n_\tau\rightarrow0$

almost surely as $n\rightarrow\infty$, then $X^n\rightarrow0$.
Proof: Choosing any $\epsilon>0$, it is enough to show that, outside of a zero probability set, $\inf_nX^n_t<\epsilon$ for all $t$. Let $\mathcal{T}$ denote the set of stopping times $\tau$ such that $\inf_nX^n_t<\epsilon$ for all $t<\tau$. Then $\mathcal{T}$ is closed under taking the maximum of pairs of stopping times, and under taking the limit of increasing sequences. So, the essential supremum $\bar\tau$ of $\mathcal{T}$ is itself in $\mathcal{T}$. It just remains to show that $\bar\tau=\infty$ almost surely.

Choosing any $m$, define the stopping time

$\displaystyle \sigma=\inf\left\{t\ge\bar\tau\colon X^m_t\ge\epsilon\right\}.$

So $X^m<\epsilon$, and hence $\inf_nX^n<\epsilon$, on the interval $[\bar\tau,\sigma)$, implying that $\sigma\in\mathcal{T}$. By maximality of $\bar\tau$, this gives $\sigma=\bar\tau$ and, by right-continuity, $X^m_{\bar\tau}\ge\epsilon$ almost surely whenever $\bar\tau<\infty$. If $\bar\tau$ is finite with nonzero probability then, as $m$ was arbitrary, this contradicts the condition that $X^n_{\bar\tau}\rightarrow0$ (almost surely). Therefore, $\bar\tau=\infty$ almost surely. ⬜
The optional projection of a process at a stopping time is given simply by the conditional expectation as in (5). By applying this together with Lemma 15, it can be shown that optional projection satisfies a monotone convergence property.
Lemma 16 (Monotone Convergence) Let $X^n$ be a sequence of prelocally integrable right-continuous processes decreasing to 0 as $n$ goes to infinity.
Then, supposing that their optional projections exist, they also decrease to 0.
Proof: For $m\le n$ we have

$\displaystyle {}^{\mathrm{o}}X^n_t=\mathbb{E}[X^n_t\mid\mathcal{F}_t]\le\mathbb{E}[X^m_t\mid\mathcal{F}_t]={}^{\mathrm{o}}X^m_t$

almost surely. By right-continuity this implies that, outside of a zero probability set, ${}^{\mathrm{o}}X^n_t$ is decreasing in $n$ for all $t$. For any stopping time $\tau$, Theorem 5 gives

$\displaystyle 1_{\{\tau<\infty\}}\,{}^{\mathrm{o}}X^n_\tau=\mathbb{E}\left[1_{\{\tau<\infty\}}X^n_\tau\,\middle\vert\,\mathcal{F}_\tau\right]\rightarrow0$

almost surely. The limit here is using monotone convergence for the conditional expectation. Finally, Lemma 15 gives ${}^{\mathrm{o}}X^n\rightarrow0$ as required. ⬜
We can now construct the optional projection of a right-continuous process by approximating with processes of integrable variation, as in Lemma 14, and applying monotone convergence.
Lemma 17 Let $X$ be a right-continuous process such that $\sup_t\lvert X_t\rvert$ is integrable. Then, its right-continuous optional projection exists.
Proof: By Lemma 14, there exists a sequence $X^n$ of integrable cadlag processes of integrable variation decreasing to $X$ as $n$ goes to infinity. Applying the same result to $-X$, there also exists a sequence $Y^n$ of integrable cadlag processes of integrable variation increasing to $X$. So, $X^n-Y^n$ is decreasing to 0. Applying Lemma 8, the optional projections

$\displaystyle U^n={}^{\mathrm{o}}X^n,\qquad V^n={}^{\mathrm{o}}Y^n$

exist. As $X^n$ and $Y^n$ are respectively decreasing and increasing in $n$, the same holds for their optional projections. Lemma 16 says that $U^n-V^n={}^{\mathrm{o}}(X^n-Y^n)$ is decreasing to zero. So, we can define a process $W$ up to evanescence by

$\displaystyle W=\lim_{n\to\infty}U^n=\lim_{n\to\infty}V^n.$

By dominated convergence,

$\displaystyle W_t=\lim_{n\to\infty}\mathbb{E}[X^n_t\mid\mathcal{F}_t]=\mathbb{E}[X_t\mid\mathcal{F}_t]$

almost surely. To show that $W$ is the optional projection of $X$, only right-continuity remains. If $t_m$ is a sequence decreasing to $t$ then, as $V^n\le W\le U^n$ and $U^n$, $V^n$ are right-continuous,

$\displaystyle V^n_t\le\liminf_{m\to\infty}W_{t_m}\le\limsup_{m\to\infty}W_{t_m}\le U^n_t$

for each fixed $n$. Then letting $n$ go to infinity gives

$\displaystyle W_t\le\liminf_{m\to\infty}W_{t_m}\le\limsup_{m\to\infty}W_{t_m}\le W_t.$

Therefore, $W_{t_m}\rightarrow W_t$ as required. ⬜
All that remains is to extend Lemma 17 to prelocally integrable processes.
Proof of Theorem 1: As $X$ is prelocally integrable, there exists a sequence of stopping times $\tau_n$ increasing to infinity such that (2) is integrable. Then, Lemma 17 says that the optional projection of $1_{[0,\tau_n)}X$ exists for each $n$. Finally, Lemma 12 says that the optional projection of $X$ exists. ⬜