Doob’s optional sampling theorem states that the defining properties of martingales, submartingales and supermartingales generalize from deterministic times to stopping times. For simple stopping times, which take only finitely many values in $[0,\infty)$, the argument is a relatively basic application of elementary integrals. For simple stopping times $\sigma\le\tau$, the stochastic interval $(\sigma,\tau]$ and its indicator function $1_{(\sigma,\tau]}$ are elementary predictable. For any submartingale $X$, the properties of elementary integrals give the inequality

$\displaystyle \mathbb{E}[X_\tau]-\mathbb{E}[X_\sigma]=\mathbb{E}\left[\int_0^\infty 1_{(\sigma,\tau]}\,dX\right]\ge 0. \qquad(1)$

For a set $A\in\mathcal{F}_\sigma$, the following

$\displaystyle \tau'=\sigma 1_A+\tau 1_{A^c}$

is easily seen to be a stopping time. Replacing $\sigma$ by $\tau'$ extends inequality (1) to the following,

$\displaystyle \mathbb{E}\left[1_AX_\tau\right]\ge\mathbb{E}\left[1_AX_\sigma\right]. \qquad(2)$

As this inequality holds for all sets $A\in\mathcal{F}_\sigma$, it implies the extension of the submartingale property, $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\ge X_\sigma$, to the random times. This argument applies to all simple stopping times, and is sufficient to prove the optional sampling result for discrete-time submartingales. In continuous time, the additional hypothesis that the process is right-continuous is required. Then, the result follows by taking limits of simple stopping times.

Theorem 1 Let $\sigma\le\tau$ be bounded stopping times. For any cadlag martingale, submartingale or supermartingale $X$, the random variables $X_\sigma,X_\tau$ are integrable and the following are satisfied.

- If $X$ is a martingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]=X_\sigma$.
- If $X$ is a submartingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\ge X_\sigma$.
- If $X$ is a supermartingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\le X_\sigma$.

*Proof:* It is enough to prove the result for submartingales: the supermartingale case follows from applying it to $-X$, and the martingale case from applying it to both $X$ and $-X$. So, suppose that $X$ is a submartingale.

The idea is to approximate $\sigma$ and $\tau$ from the right by decreasing sequences of simple stopping times $\sigma_n\ge\sigma$ and $\tau_n\ge\tau$. It is easily seen that this is achieved by the following,

$\displaystyle \sigma_n=2^{-n}\lceil 2^n\sigma\rceil,\qquad\tau_n=2^{-n}\lceil 2^n\tau\rceil.$

Then, $\sigma_n\le\tau_n$ and, by right-continuity, $X_{\sigma_n}\rightarrow X_\sigma$ and $X_{\tau_n}\rightarrow X_\tau$. If it can be shown that $X_{\sigma_n}$ and $X_{\tau_n}$ are uniformly integrable sequences then their limits $X_\sigma,X_\tau$ must be integrable, and commuting the limit with the expectation would give $\mathbb{E}[1_AX_{\sigma_n}]\rightarrow\mathbb{E}[1_AX_\sigma]$ and $\mathbb{E}[1_AX_{\tau_n}]\rightarrow\mathbb{E}[1_AX_\tau]$.
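As a quick numerical aside (not part of the proof), the standard dyadic choice of approximating times, $\tau_n=2^{-n}\lceil 2^n\tau\rceil$, can be checked directly. The sketch below verifies, for one sample value of $\tau$, that the approximations dominate $\tau$ and decrease to it; the specific value `0.7341` is arbitrary.

```python
import math

def dyadic_approx(tau, n):
    # n-th dyadic approximation from the right: 2^-n * ceil(2^n * tau).
    # It takes values in the grid 2^-n * Z, satisfies dyadic_approx(tau, n) >= tau,
    # and decreases to tau as n increases, since the dyadic grids are nested.
    # Note {dyadic_approx(tau, n) <= k * 2^-n} = {tau <= k * 2^-n}, so the
    # approximation of a stopping time is again a stopping time.
    return math.ceil((2 ** n) * tau) / (2 ** n)

tau = 0.7341  # stands in for one sample value of a bounded stopping time
approxs = [dyadic_approx(tau, n) for n in range(1, 21)]

# The approximations dominate tau and are non-increasing in n.
assert all(a >= tau for a in approxs)
assert all(x >= y for x, y in zip(approxs, approxs[1:]))
print(approxs[0], approxs[-1])
```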

In the martingale case, the property for simple stopping times expresses $X_{\tau_n}$ as conditional expectations, $X_{\tau_n}=\mathbb{E}[X_T\mid\mathcal{F}_{\tau_n}]$, of the single integrable random variable $X_T$, where $T$ is any constant bound for $\tau$. This implies uniform integrability. To prove uniform integrability for submartingales, the following result can be applied: a submartingale sampled at a decreasing sequence of times which are bounded below is a uniformly integrable sequence. This was proven in the previous post using a simple `Doob-style’ decomposition. More generally, the submartingale property at simple stopping times implies that the process $M_{-n}\equiv X_{\tau_n}$ is a submartingale with time running over the negative integers, and with respect to the filtration $\mathcal{G}_{-n}\equiv\mathcal{F}_{\tau_n}$. Taking $\tau_{-\infty}\equiv 0$, this extends to $n=-\infty$, bounding the negative integers from below. So, $X_{\tau_n}$ is a uniformly integrable sequence. Similarly for $X_{\sigma_n}$. As shown above, this gives the inequality $\mathbb{E}[1_AX_{\tau_n}]\ge\mathbb{E}[1_AX_{\sigma_n}]$ for all $A\in\mathcal{F}_\sigma$.

As with the argument above for simple stopping times, inequality (2) follows by taking limits, giving the result $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\ge X_\sigma$. ⬜

Similarly, the following optional stopping result shows that the class of cadlag martingales is closed under stopping at arbitrary stopping times.

Theorem 2 Let $X$ be a cadlag martingale (resp. submartingale, supermartingale) and $\tau$ be a stopping time. Then, the stopped process $X^\tau_t\equiv X_{\tau\wedge t}$ is also a martingale (resp. submartingale, supermartingale).

*Proof:* As above, it is sufficient to prove the result for submartingales, with the supermartingale and martingale cases following by applying it to $-X$. As $X$ is right-continuous and adapted (and hence, progressive), the stopped process $X^\tau$ will also be adapted. Choosing times $s<t$ and a set $A\in\mathcal{F}_s$, the random time $\rho\equiv s1_A+t1_{A^c}$ is a bounded stopping time and,

$\displaystyle \mathbb{E}\left[1_A\left(X^\tau_t-X^\tau_s\right)\right]=\mathbb{E}\left[X_{\tau\wedge t}-X_{\tau\wedge\rho}\right].$

Theorem 1 applied to the bounded stopping times $\tau\wedge\rho\le\tau\wedge t$ says that this is nonnegative, so $X^\tau$ is indeed a submartingale. ⬜
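The discrete-time analogue of this optional stopping result is easy to see numerically. The following Monte Carlo sketch (my own illustration, with arbitrarily chosen barrier, horizon and sample size) stops a simple symmetric random walk, which is a martingale, at the first hit of a level; the stopped process should still have mean zero at every fixed time.

```python
import random

def stopped_walk_means(n_paths, horizon, barrier, rng):
    # Simulate a simple symmetric random walk stopped at the first time it
    # hits `barrier`. Optional stopping says the stopped walk is still a
    # martingale, so its mean at each fixed time stays at the start value 0.
    totals = [0.0] * (horizon + 1)
    for _ in range(n_paths):
        x, stopped = 0, False
        for t in range(1, horizon + 1):
            if not stopped:
                x += rng.choice((-1, 1))
                if x == barrier:
                    stopped = True
            totals[t] += x
    return [s / n_paths for s in totals]

rng = random.Random(42)
means = stopped_walk_means(20000, 30, 3, rng)
print(max(abs(m) for m in means))  # should be small: the means stay near 0
```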

#### First Exit Time of Standard Brownian Motion

As an example of optional stopping, consider the first time $\tau=\inf\{t\ge 0\colon B_t\notin(-a,b)\}$ at which standard Brownian motion $B$ exits an interval $(-a,b)$. Here, $a,b$ are arbitrary positive real numbers. It must almost surely exit this interval at some time, so $\tau<\infty$ almost surely. Indeed, the probability that it is still in the interval at any time $t$ is at most the probability that $B_t\in(-a,b)$, which is the probability that a standard normal random variable lies in the interval $(-a/\sqrt{t},b/\sqrt{t})$, and this goes to zero as $t\rightarrow\infty$.

One question we can ask is, what is the probability that it exits the interval at $b$ rather than at $-a$? That is, what is the probability that standard Brownian motion hits $b$ before $-a$? Using the fact that the stopped process $B^\tau$ is a uniformly bounded martingale, optional sampling gives

$\displaystyle 0=\mathbb{E}[B_0]=\mathbb{E}[B_\tau]=b\,\mathbb{P}(B_\tau=b)-a\,\mathbb{P}(B_\tau=-a).$

Rearranging this, together with $\mathbb{P}(B_\tau=b)+\mathbb{P}(B_\tau=-a)=1$, gives the following probabilities,

$\displaystyle \mathbb{P}(B_\tau=b)=\frac{a}{a+b},\qquad\mathbb{P}(B_\tau=-a)=\frac{b}{a+b}. \qquad(3)$

Similarly, the fact that the increment $B_t-B_s$ has mean zero and variance $t-s$ independently of $\mathcal{F}_s$ shows that $B_t^2-t$ is a martingale. So, $B^2_{\tau\wedge t}-\tau\wedge t$ is a martingale giving,

$\displaystyle \mathbb{E}[\tau]=\lim_{t\rightarrow\infty}\mathbb{E}[\tau\wedge t]=\lim_{t\rightarrow\infty}\mathbb{E}\left[B^2_{\tau\wedge t}\right]=\mathbb{E}\left[B_\tau^2\right],$

using monotone and bounded convergence. Substituting in (3) gives the expected time at which the Brownian motion exits the range,

$\displaystyle \mathbb{E}[\tau]=b^2\frac{a}{a+b}+a^2\frac{b}{a+b}=ab. \qquad(4)$

Finally, note that letting $b$ go to infinity in equations (3,4) shows that Brownian motion will eventually hit any value $-a$ with probability one, but the expected time until this first happens is infinite.
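The exit probabilities and expected exit time derived above ($a/(a+b)$ for hitting $b$ first, and $ab$ for the mean exit time) are easy to check by simulation. The following sketch (purely illustrative, with arbitrarily chosen $a$, $b$, step size and sample count) approximates Brownian motion by a Gaussian random walk; the small time-discretization introduces a slight bias, so only rough agreement is expected.

```python
import random

def simulate_exit(a, b, dt, rng):
    # Approximate standard Brownian motion by a Gaussian random walk with
    # step variance dt, run until it first leaves the interval (-a, b).
    # Returns the (approximate) exit level and exit time.
    x, t, sd = 0.0, 0.0, dt ** 0.5
    while -a < x < b:
        x += rng.gauss(0.0, sd)
        t += dt
    return (b if x >= b else -a), t

rng = random.Random(1)
a, b, n = 1.0, 2.0, 1500
samples = [simulate_exit(a, b, 2e-3, rng) for _ in range(n)]
p_hit_b = sum(1 for level, _ in samples if level == b) / n
mean_tau = sum(t for _, t in samples) / n
print(p_hit_b, mean_tau)  # should be close to a/(a+b) = 1/3 and a*b = 2
```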

#### Non-Right-Continuous Martingales

The optional sampling theorems stated above have the important precondition that the martingale must have sample paths which are cadlag. It is clearly necessary to choose a good modification of a process in order for these results to hold, since the distribution of an arbitrary process at a continuously distributed time need not be related in any way to its distribution at fixed times. The restriction to cadlag processes is not usually a problem, as the relatively weak condition of right-continuity in probability is sufficient to guarantee the existence of a cadlag modification and, furthermore, if the underlying filtration is right-continuous then *every* martingale has a cadlag version.

There are, however, some situations where we might want to relax the right-continuity constraint. For instance, if we are looking at martingales with respect to a non-right-continuous filtration or have submartingales or supermartingales which are not guaranteed to be right-continuous in probability, then cadlag versions might not exist. Even so, it is still possible to modify any martingale, submartingale or supermartingale to have left and right limits everywhere, and to be right-continuous everywhere outside of a fixed countable set of times. Fortunately, the results above still hold in this generality.

Theorem 3 Let $X$ be a martingale, submartingale or supermartingale whose sample paths are right-continuous everywhere outside of a fixed countable set of times $S\subseteq[0,\infty)$. Then, for any bounded stopping times $\sigma\le\tau$, the following are satisfied.

- If $X$ is a martingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]=X_\sigma$.
- If $X$ is a submartingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\ge X_\sigma$.
- If $X$ is a supermartingale then $\mathbb{E}[X_\tau\mid\mathcal{F}_\sigma]\le X_\sigma$.

*Proof:* We proceed in a similar way to the proof of Theorem 1 above. Choose sequences of simple stopping times $\sigma_n\ge\sigma$ and $\tau_n\ge\tau$ decreasing respectively to $\sigma$ and $\tau$. Also, as *S* is countable, we can choose a sequence of finite subsets $S_n\subseteq S$ increasing to *S*. Then define the times

$\displaystyle \tilde\sigma_n=\sigma 1_{\{\sigma\in S_n\}}+\sigma_n1_{\{\sigma\notin S_n\}},\qquad\tilde\tau_n=\tau 1_{\{\tau\in S_n\}}+\tau_n1_{\{\tau\notin S_n\}}.$

These are again simple stopping times decreasing to $\sigma$ and $\tau$ respectively. Writing

$\displaystyle \{\tilde\sigma_n\le t\}=\{\sigma_n\le t\}\cup\bigcup_{s\in S_n,\,s\le t}\{\sigma=s\},$

which is in $\mathcal{F}_t$, we see that $\tilde\sigma_n$ are indeed stopping times, and similarly for $\tilde\tau_n$. Furthermore, whenever $\sigma\in S$ we have that $\tilde\sigma_n=\sigma$ eventually. Together with right-continuity of the sample paths outside of *S*, this gives $X_{\tilde\sigma_n}\rightarrow X_\sigma$ and, similarly, $X_{\tilde\tau_n}\rightarrow X_\tau$. The result now follows as in Theorem 1. Just considering the case where *X* is a submartingale,

$\displaystyle \mathbb{E}\left[1_AX_\tau\right]=\lim_{n\rightarrow\infty}\mathbb{E}\left[1_AX_{\tilde\tau_n}\right]\ge\lim_{n\rightarrow\infty}\mathbb{E}\left[1_AX_{\tilde\sigma_n}\right]=\mathbb{E}\left[1_AX_\sigma\right]$

for all $A\in\mathcal{F}_\sigma$ (replacing $\tilde\sigma_n$ by the simple stopping time $\tilde\sigma_n\wedge\tilde\tau_n$ if necessary, so that the result for simple stopping times applies), from which the result follows. ⬜

Similarly, the optional stopping result stated in Theorem 2 above carries over to the more general situation.

Theorem 4 Let $X$ be a martingale (resp. submartingale or supermartingale) whose sample paths are right-continuous everywhere outside of a fixed countable set of times $S$. Then, for any stopping time $\tau$, the stopped process $X^\tau$ is also a martingale (resp. submartingale, supermartingale).

*Proof:* We first need to show that $X^\tau$ is adapted. The proof of Theorem 3 constructed simple stopping times $\tilde\tau_n$ decreasing to $\tau$ such that $X_{\tilde\tau_n}\rightarrow X_\tau$. Then,

$\displaystyle X^\tau_t=X_{\tau\wedge t}=\lim_{n\rightarrow\infty}X_{\tilde\tau_n\wedge t}.$

As $\tilde\tau_n\wedge t$ is a simple stopping time, $X_{\tilde\tau_n\wedge t}$ is $\mathcal{F}_t$-measurable and, therefore, $X_{\tau\wedge t}$ is $\mathcal{F}_t$-measurable. So, $X^\tau$ is adapted.

The remainder of the proof is identical to that given above for Theorem 2, except that we apply Theorem 3 instead of 1. ⬜

Hello Mr. Lowther

I’ve been working on a Brownian motion problem that requires me to use the joint distribution of two distinct hitting times. The distribution of a single hitting time can easily be computed via an appropriate superposition of the Brownian motion with its reflection at the barrier. By contrast, to achieve the same for the “double” case I reckon (from applications of the method of images to analogous cases I researched) that an infinite series of reflections is needed (just like two mirrors facing each other generate an infinite series of reflected images, whereas one mirror only generates one image). Do you have any better ideas on how to approach the problem?

Best regards

Tom,

Yes, you need to do something like that. I’m assuming that you are wanting the hitting times

$T_a$, $T_b$ of some levels $a>0>b$ (if $a$ and $b$ had the same sign then it is easy). To get the joint distribution, you want to calculate $\mathbb{P}(T_a\le s,T_b\le t)$. In the case $s\le t$ you can split this into two terms. $\mathbb{P}(\max(T_a,T_b)\le s)$ is the probability of hitting either barrier by time $s$ and can be calculated by double reflection. You can see my answer to this math.SE question or this other question might help. This is a well-known problem in finance, and searching for double barrier option pricing formula might also help. The other term, $\mathbb{P}(T_a\le s<T_b\le t)$, can be calculated by first calculating the joint distribution of $\{T_a\le s<T_b\}$ and $B_s$. Then, calculate the probability of $T_b\le t$ conditional on $B_s$ and $s<T_b$. This only requires using the single-barrier solution. Then integrating over the possible values of $B_s$ should give what you want. Hope that helps! George
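For the single-barrier building block mentioned here, the reflection principle gives the closed form $\mathbb{P}(T_a\le t)=2\,\mathbb{P}(B_t\ge a)$ for a level $a>0$. A quick Monte Carlo sanity check of that formula (my own sketch, with arbitrarily chosen parameters; the discrete walk misses crossings between grid points, so only rough agreement is expected):

```python
import math
import random

def hit_prob_reflection(a, t):
    # Reflection principle for standard BM started at 0 and a > 0:
    # P(T_a <= t) = 2 * P(B_t >= a) = erfc(a / sqrt(2 * t)).
    return math.erfc(a / math.sqrt(2.0 * t))

def hit_prob_mc(a, t, dt, n_paths, rng):
    # Estimate P(T_a <= t) with a Gaussian random walk of step variance dt.
    sd, steps, hits = math.sqrt(dt), int(round(t / dt)), 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, sd)
            if x >= a:
                hits += 1
                break
    return hits / n_paths

rng = random.Random(7)
exact = hit_prob_reflection(1.0, 1.0)
estimate = hit_prob_mc(1.0, 1.0, 1e-3, 2000, rng)
print(exact, estimate)  # the two should roughly agree
```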

Update: I added an extra section to this post containing Theorems 3 and 4, which extend optional sampling to processes which are not right-continuous everywhere.

Why is tau’ easily seen to be a stopping time?

With probability 1 it’s equal to the hitting time of the set $\{-a,b\}$ because, after hitting it, the Brownian motion crosses the horizontal line infinitely often in any interval to the right, and hence escapes the interval $[-a,b]$.

A typo is spotted in the third line of the last section “Non-Right-Continuous Martingales”, where “in probably” shall be “in probability”?

Fixed, thanks.

I wonder how safe it is if one directly works on the cadlag version in general. A specific question is posted on MathOverflow (http://mathoverflow.net/questions/223465/is-it-safe-to-work-on-a-cadlag-modification-of-a-feller-process)

You need to assume that you have a good version of the process. For example, let $\tau$ be any random time without atoms (i.e., $\mathbb{P}(\tau=t)=0$ for all $t$). Define the process $X_t=1_{\{t=\tau\}}$, so $X_t=1$ if $t=\tau$ and $X_t=0$ otherwise. Then, $X_t$ is almost surely 0 at each deterministic time $t$, yet $X_\tau$ is almost surely equal to 1. More generally, $X_\tau$ can be anything, and need not even be measurable, even when $X_t$ is almost surely 0 at each $t$.
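This counterexample is simple enough to demonstrate directly. A small Python sketch (purely illustrative, taking $\tau$ uniform on $(0,1)$ as the atomless random time):

```python
import random

rng = random.Random(0)
# tau uniform on (0, 1): a random time with no atoms.
taus = [rng.random() for _ in range(10000)]

def X(t, tau):
    # The 'bad' modification: 1 at the single random time tau, 0 elsewhere.
    return 1.0 if t == tau else 0.0

t_fixed = 0.5
# At any deterministic time the process is (almost surely) zero ...
assert all(X(t_fixed, tau) == 0.0 for tau in taus)
# ... yet sampled at the random time tau it is identically one.
assert all(X(tau, tau) == 1.0 for tau in taus)
print(sum(X(t_fixed, tau) for tau in taus), sum(X(tau, tau) for tau in taus))
```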

Dear George, Thanks for your reply. I just caught your post.

Before Eq. (3), what led to the conclusion $\mathbb{E}[B_\tau]=0$? Also, if I just define $\tau$ to be the first hitting time of $a$, certainly $\mathbb{E}[B_\tau]=a\neq 0$, so what makes the difference here?

I used the fact that the stopped process $B^\tau$ is a uniformly bounded martingale, which is not true in your example.