In the previous two posts of these stochastic calculus notes, I introduced the basic concepts of a stochastic process and of filtrations. As we often need to observe a stochastic process at a random time, a further definition is required. A stopping time is a random time which is adapted to the underlying filtration. As discussed in the previous post, we work with respect to a filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},\mathbb{P})$.
Definition 1 A stopping time is a map $\tau\colon\Omega\to[0,\infty]$ such that $\{\tau\le t\}\in\mathcal{F}_t$ for each $t\ge0$.
This definition is equivalent to stating that the process $1_{[\tau,\infty)}$, whose value at time $t$ is $1_{\{\tau\le t\}}$, is adapted. Equivalently, at any time $t$, the event $\{\tau\le t\}$ that the stopping time has already occurred is observable.
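As a simple illustration (the example raised in the comments below), any deterministic time is a stopping time: if $\tau(\omega)=c$ for all $\omega$, where $c\in[0,\infty]$ is a constant, then
$$\{\tau\le t\}=\begin{cases}\Omega,&\text{if }c\le t,\\ \emptyset,&\text{if }c>t,\end{cases}$$
and both $\Omega$ and $\emptyset$ lie in $\mathcal{F}_t$ for every $t$.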
One common way in which stopping times appear is as the first time at which an adapted stochastic process hits some value. The debut theorem states that this does indeed give a stopping time.
Theorem 2 (Debut theorem) Let $X$ be an adapted right-continuous stochastic process defined on a complete filtered probability space. If $K$ is any real number then $\tau$, defined by
$$\tau=\inf\left\{t\in\mathbb{R}_+\colon X_t\ge K\right\},\qquad(1)$$
is a stopping time.
If $\tau$ is defined by equation (1) then it does seem intuitively obvious that it will be a stopping time. Clearly, $\tau$ will be less than or equal to $t$ precisely when $X_s\ge K$ for some $s\le t$ (for the 'only if' direction, right-continuity ensures that if times $s>t$ with $X_s\ge K$ accumulate at $t$, then also $X_t\ge K$),
$$\{\tau\le t\}=\bigcup_{s\le t}\{X_s\ge K\}.\qquad(2)$$
As $X$ is an adapted process, each of the sets inside the union on the right hand side is $\mathcal{F}_t$-measurable, and it seems reasonable to conclude that $\{\tau\le t\}$ should also be $\mathcal{F}_t$-measurable, so that $\tau$ is a stopping time. However, the right hand side of (2) is an uncountable union, and sigma-algebras are only closed under countable unions and intersections in general. This demonstrates the added difficulties of continuous-time processes compared with the discrete-time case. In discrete time, the union in (2) is only over finitely many times and, in that case, the debut theorem follows easily.
For continuous adapted processes (e.g. Brownian motion), the debut theorem is relatively easy to prove. Continuous processes always achieve their supremum on any compact interval, so it is enough to look at the maximum process $X^*_t\equiv\max_{s\le t}X_s$. By continuity, this supremum can be restricted to the countable set of rational times. Equation (2) reduces to the following,
$$\{\tau\le t\}=\{X^*_t\ge K\}=\bigcap_{n=1}^{\infty}\bigcup_{s\in\mathbb{Q}_+,\,s\le t}\{X_s>K-1/n\},$$
which expresses $\{\tau\le t\}$ in terms of countable intersections of countable unions of the $\mathcal{F}_t$-measurable sets $\{X_s>K-1/n\}$, and hence it is in $\mathcal{F}_t$.
For right-continuous processes it is still true that $X$ is fully determined by its values at rational times, so it might seem that the debut theorem can be proved in a similar way as for continuous processes. However, this is not the case, and it is not possible to express $\{\tau\le t\}$ using only countable unions and intersections of sets in $\mathcal{F}_t$. In fact, $\{\tau\le t\}$ need not be measurable in general, and the completeness of the filtered probability space is required. Still, it is not difficult to prove using only elementary techniques, and I give a proof of this below.
The debut theorem for right-continuous processes is only a special case of a more general result for arbitrary progressively measurable processes. However, the general case relies on properties of analytic sets, a subject going well beyond the scope of these notes (I added a proof of the general case to PlanetMath), and right-continuous processes are more than general enough for our purposes.
The value $X_\tau$ of a jointly measurable stochastic process at a random time $\tau$ is a measurable random variable, as mentioned in the previous post. As well as simply observing the value of the process at this time, as the name suggests, stopping times are often used to stop the process. The process stopped at the random time $\tau$ is denoted by $X^\tau$,
$$X^\tau_t\equiv X_{\tau\wedge t}.$$
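As a purely illustrative aside (not part of the original post), here is a small numerical sketch of these two constructions on a discretized path: it simulates a random-walk approximation to Brownian motion, computes the first grid time at which a level $K$ is reached, and forms the stopped path. The grid size, the level and the Gaussian increments are arbitrary choices, and the discretization only approximates the continuous-time debut.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# An arbitrary example process: a scaled random walk approximating Brownian motion.
n_steps, dt, K = 1000, 0.01, 1.0
t = np.arange(n_steps + 1) * dt
X = np.concatenate(([0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n_steps))))

# Debut of {X >= K} on the grid: first grid time with X_t >= K (infinity if never reached).
hit_indices = np.flatnonzero(X >= K)
if hit_indices.size > 0:
    i_tau = hit_indices[0]
    tau = t[i_tau]
    # Stopped path X^tau_t = X_{min(tau, t)}: follow X up to tau, then stay frozen at X_tau.
    X_stopped = np.where(t <= tau, X, X[i_tau])
else:
    tau = np.inf
    X_stopped = X.copy()

print("first grid time at which X reaches K:", tau)
```

On the grid, observing the path up to time $t$ is enough to decide whether the hitting time has occurred by $t$, which is the discrete analogue of the stopping time property.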
It is important that stopping an adapted process at a stopping time preserves the basic measurability properties.
Lemma 3 Let $\tau$ be a stopping time. If the stochastic process $X$ satisfies any of the following properties then so does the stopped process $X^\tau$.
- left-continuous and adapted.
- right-continuous and adapted.
- predictable.
- optional.
- progressively measurable.
Proof: First, recall that if $X$ is jointly measurable and $\sigma$ is any random time then $X_\sigma$ is measurable (see here). It follows from the decomposition
$$X^\tau_t=1_{\{t<\tau\}}X_t+1_{\{t\ge\tau\}}X_\tau$$
that $X^\tau$ is also jointly measurable. Now suppose that $X$ is progressive and $t\ge0$ is any fixed time. By definition, the restriction of $X$ to $[0,t]\times\Omega$ is $\mathcal{B}([0,t])\otimes\mathcal{F}_t$-measurable and, if $\tau$ is a stopping time, then $\tau\wedge t$ is $\mathcal{F}_t$-measurable. Then, by what we have just shown above, the stopped process $X^\tau$ restricted to $[0,t]\times\Omega$ is $\mathcal{B}([0,t])\otimes\mathcal{F}_t$-measurable. This shows that $X^\tau$ is progressive.

Now let $X$ be left (resp. right) continuous and adapted. Then it is progressively measurable and, as has just been shown, $X^\tau$ is progressive. So, $X^\tau$ is adapted, and it is clearly also left (resp. right) continuous.

Finally, we note that the collection of all processes $X$ such that $X^\tau$ is predictable (resp. optional) includes the left (resp. right) continuous adapted processes and is closed under taking the limit of a sequence of processes. So, by the functional monotone class theorem, it follows that $X^\tau$ is predictable (resp. optional) whenever $X$ is predictable (resp. optional). ⬜
Other than the proof of the debut theorem given below, this covers the main results on stopping times for this post. All that remains are some very useful lemmas which are almost trivial to prove. First, it is often useful to replace the inequality in the definition of a stopping time by a strict inequality. This can be done as long as the filtration is right-continuous.
Lemma 4 A map $\tau\colon\Omega\to[0,\infty]$ is a stopping time with respect to the right-continuous filtration $\{\mathcal{F}_t\}_{t\ge0}$ if and only if $\{\tau<t\}\in\mathcal{F}_t$ for each $t\ge0$.

Proof: For a stopping time $\tau$, using the fact that $\{\tau\le s\}\in\mathcal{F}_s\subseteq\mathcal{F}_t$ for each $s<t$ gives
$$\{\tau<t\}=\bigcup_{n=1}^{\infty}\{\tau\le t-1/n\}\in\mathcal{F}_t.$$
Conversely, if $\{\tau<t\}\in\mathcal{F}_t$ for each time $t$, then for any $t\ge0$,
$$\{\tau\le t\}=\bigcap_{n=1}^{\infty}\{\tau<t+1/n\}\in\bigcap_{n=1}^{\infty}\mathcal{F}_{t+1/n}=\mathcal{F}_{t+}=\mathcal{F}_t.$$
As this is true for all $t$, it shows that $\tau$ is a stopping time. ⬜
Finally, the class of stopping times is closed under basic operations such as taking the maximum or minimum of two times or, for right-continuous filtrations, taking the limit of a sequence of times.
Lemma 5
- If $\sigma,\tau$ are stopping times then so are $\sigma\vee\tau$ and $\sigma\wedge\tau$.
- Let $\tau_1,\tau_2,\ldots$ be a sequence of stopping times converging to a limit $\tau$ and suppose that, for each $\omega\in\Omega$, $\tau_n(\omega)\le\tau(\omega)$ for large enough $n$. Then $\tau$ is a stopping time. Note, in particular, that this includes the case where $\tau_n$ is increasing to the limit $\tau$.
- If $\tau_1,\tau_2,\ldots$ is a sequence of stopping times then $\sup_n\tau_n$ is a stopping time.
- If $\tau_1,\tau_2,\ldots$ is a sequence of stopping times and the filtration is right-continuous, then $\inf_n\tau_n$, $\liminf_n\tau_n$ and $\limsup_n\tau_n$ are stopping times.
Proof: If $\sigma,\tau$ are stopping times then
$$\{\sigma\vee\tau\le t\}=\{\sigma\le t\}\cap\{\tau\le t\}\in\mathcal{F}_t,\qquad\{\sigma\wedge\tau\le t\}=\{\sigma\le t\}\cup\{\tau\le t\}\in\mathcal{F}_t.$$
So, $\sigma\vee\tau$ and $\sigma\wedge\tau$ are stopping times.

Now let $\tau_n$ be a sequence of stopping times converging to $\tau$ and such that, for each $\omega$, $\tau_n(\omega)\le\tau(\omega)$ for all large enough $n$. Then,
$$\{\tau\le t\}=\bigcup_{m=1}^{\infty}\bigcap_{n\ge m}\{\tau_n\le t\}\in\mathcal{F}_t,$$
so that $\tau$ is a stopping time.

If $\tau_n$ is any sequence of stopping times then
$$\Bigl\{\sup_n\tau_n\le t\Bigr\}=\bigcap_{n}\{\tau_n\le t\}\in\mathcal{F}_t,$$
so $\sup_n\tau_n$ is a stopping time.
Finally, suppose that the filtration is right-continuous and that $\tau_n$ is a sequence of stopping times. Note that $\inf_n\tau_n<t$ precisely when $\tau_n<t$ for some $n$, and that $\liminf_n\tau_n<t$ precisely when $\tau_n<q$ infinitely often for some rational $q<t$, enabling us to write
$$\Bigl\{\inf_n\tau_n<t\Bigr\}=\bigcup_n\{\tau_n<t\},\qquad\Bigl\{\liminf_n\tau_n<t\Bigr\}=\bigcup_{q\in\mathbb{Q},\,q<t}\bigcap_{m=1}^{\infty}\bigcup_{n\ge m}\{\tau_n<q\},$$
both of which are in $\mathcal{F}_t$. As the filtration is right-continuous, Lemma 4 shows that $\inf_n\tau_n$ and $\liminf_n\tau_n$ are stopping times. Similarly, $\limsup_n\tau_n<t$ precisely when $\tau_n<q$ for all large $n$ and some rational $q<t$, giving
$$\Bigl\{\limsup_n\tau_n<t\Bigr\}=\bigcup_{q\in\mathbb{Q},\,q<t}\bigcup_{m=1}^{\infty}\bigcap_{n\ge m}\{\tau_n<q\}\in\mathcal{F}_t,$$
so $\limsup_n\tau_n$ is also a stopping time. ⬜
Lemma 6 Let $X$ be a cadlag adapted process. Then, there exists a sequence $\tau_1,\tau_2,\ldots$ of stopping times such that $\tau_m\ne\tau_n$ whenever $m\ne n$ and $\tau_n<\infty$, and
$$\{(t,\omega)\in\mathbb{R}_+\times\Omega\colon\Delta X_t(\omega)\ne0\}\subseteq\bigcup_n[\tau_n],$$
where $\Delta X_t\equiv X_t-X_{t-}$ is the jump of $X$ at time $t$ and $[\tau_n]\equiv\{(t,\omega)\colon t=\tau_n(\omega)\}$ is the graph of $\tau_n$.
Proof: For any positive real numbers $K,\epsilon$ define the random time
$$\tau_{K,\epsilon}=\inf\left\{t>K\colon\lvert\Delta X_t\rvert\ge\epsilon\right\}.$$
It can be seen that the union of the graphs $[\tau_{K,\epsilon}]$ over positive rationals $K,\epsilon$ is equal to the set of times at which $\Delta X\ne0$ so, to complete the proof of the lemma, it is enough to show that $\tau_{K,\epsilon}$ is a stopping time (the countable family can then be enumerated as a sequence and, setting $\tau_n=\infty$ on the event that it coincides with one of $\tau_1,\ldots,\tau_{n-1}$, taken to be distinct whenever finite, without affecting the union of the graphs). That is, the set $\{\tau_{K,\epsilon}\le t\}$ is $\mathcal{F}_t$-measurable. For $t\le K$ we have $\{\tau_{K,\epsilon}\le t\}=\emptyset$, so it is trivially measurable. So, we can suppose that $t>K$. In that case, letting $S$ be any countable dense subset of $[K,t]$ with $t\in S$, set, for each positive integer $n$,
$$U_n=\sup\left\{\lvert X_{s'}-X_s\rvert\colon s,s'\in S,\ s<s'<s+1/n\right\}.$$
As this is the supremum of a countable set of $\mathcal{F}_t$-measurable random variables, it is $\mathcal{F}_t$-measurable. Also, using the cadlag property of $X$,
$$\lim_{n\to\infty}U_n=\sup_{K<s\le t}\lvert\Delta X_s\rvert,$$
so this is $\mathcal{F}_t$-measurable. Then, $\tau_{K,\epsilon}\le t$ iff $\lim_nU_n\ge\epsilon$, which is $\mathcal{F}_t$-measurable. ⬜
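For instance (a direct consequence of the above, not stated explicitly in the post), for any fixed $\epsilon>0$ the first time at which $X$ has a jump of size at least $\epsilon$ is a stopping time: writing $\tau_\epsilon=\inf\{t>0\colon|\Delta X_t|\ge\epsilon\}$,
$$\{\tau_\epsilon\le t\}=\bigcup_{K\in\mathbb{Q},\,0<K<t}\{\tau_{K,\epsilon}\le t\}\in\mathcal{F}_t,$$
since a jump of size at least $\epsilon$ at a time $s\in(0,t]$ is detected by $\tau_{K,\epsilon}$ for any rational $K\in(0,s)$.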
Proof of the debut theorem
I now give a proof of the debut theorem for a right-continuous adapted process $X$. For a fixed real number $K$, let $\tau=\inf\{t\in\mathbb{R}_+\colon X_t\ge K\}$ be the first time at which $X$ is at least $K$. We need to show that this is a stopping time.

Given any stopping time $\sigma\le\tau$, it is possible to define the larger time
$$\sigma'=\inf\Bigl\{t\ge\sigma\colon\sup_{\sigma\le s\le t}X_s\ge K\Bigr\}.$$
Clearly, $\sigma\le\sigma'\le\tau$ (on $\{\tau<\infty\}$, right-continuity gives $X_\tau\ge K$) and it is easily seen that this is a stopping time. Indeed, $\sigma'$ is less than or equal to a positive time $t$ precisely when, for each positive integer $n$, there is a time $s$ in the range $\sigma\le s\le t$ satisfying $X_s>K-1/n$. By right-continuity, it is enough to restrict to rational multiples of $t$, giving
$$\{\sigma'\le t\}=\bigcap_{n=1}^{\infty}\bigcup_{q\in\mathbb{Q},\,0<q\le1}\bigl(\{\sigma\le qt\}\cap\{X_{qt}>K-1/n\}\bigr)\in\mathcal{F}_t.$$
Also, using right-continuity, $\sigma'$ will be strictly greater than $\sigma$ whenever $\sigma<\tau$, since then $X_\sigma<K$ and $X$ remains bounded below $K$ on some nonempty interval $[\sigma,\sigma+\delta)$.
The idea is to start with any stopping time bounded above by $\tau$; for example, $\sigma=0$ will do. Then, by iteratively replacing $\sigma$ by $\sigma'$, approach $\tau$ from below by successively closer approximations. Unfortunately, right-continuous processes can be badly behaved enough that this can fail to reach $\tau$, even after infinitely many steps, and transfinite induction would be required. A quicker approach, which I use here, is to make use of the idea of the essential supremum of a set of random variables.
Let $\mathcal{T}$ consist of the set of all stopping times $\sigma$ satisfying $\sigma\le\tau$. By properties of the essential supremum, there exists a sequence $\sigma_1,\sigma_2,\ldots\in\mathcal{T}$ such that $\sigma\equiv\sup_n\sigma_n$ is an essential supremum of $\mathcal{T}$. As shown above, this is a stopping time and, therefore, $\sigma\in\mathcal{T}$. The stopping time $\sigma'$ defined above satisfies $\sigma\le\sigma'\le\tau$ and is therefore in $\mathcal{T}$. From the definition of the essential supremum, this implies that $\sigma'\le\sigma$ and, therefore, $\sigma'=\sigma$ with probability one. However, as mentioned above, $\sigma'>\sigma$ whenever $\sigma<\tau$, which therefore has zero probability.
We have shown that the stopping time $\sigma$ satisfies $\sigma=\tau$ almost surely. Finally, completeness of the filtered probability space implies that $\tau$ is itself a stopping time: for each $t$, the set $\{\tau\le t\}$ differs from $\{\sigma\le t\}\in\mathcal{F}_t$ only by a subset of the null set $\{\sigma\ne\tau\}$, so it is in $\mathcal{F}_t$. ⬜
With reference to your note on stopping times, I understand that if we consider a countable sequence of stopping times, then both the supremum and the infimum of such a sequence are stopping times. Now suppose we consider an uncountable family of stopping times. Are the supremum and infimum of this family still stopping times? If not, what's a counterexample? (I am aware that the sigma-algebra properties only guarantee closure under countable unions and intersections.)
Hi. Counterexamples are,
– Take your probability space to be the unit interval with the standard Lebesgue measure, and let $A\subset[0,1]$ be a non-measurable set. Let $\tau_x(\omega)$ be $1$ if $\omega=x$ and $0$ otherwise. Each $\tau_x$ is a stopping time, but $\sup_{x\in A}\tau_x$ is $1$ precisely on the non-measurable set $A$, so it is not a stopping time.
– A more subtle example is that, on the space of cadlag processes (with the natural filtration), the first time $\tau$ at which the coordinate process hits a level $K$ is not a stopping time unless you complete the filtration (but it is the supremum of all stopping times $T\le\tau$). This is harder to see, but it is true because the set $\{\tau\le t\}$ can be any analytic set, and need not be measurable (although it is universally measurable).
[Also, moved your comment to the relevant post]
According to the definition of a stopping time, the simplest example would be $T=c$, for $c$ a nonnegative constant, right?
Right.
Dear George, could you tell me what the notation $[\tau_n]$ means in the statement of Lemma 5? Also, is there a typo in the last formula? I also have the following question: it seems that, without assuming completeness of the filtration, we can at least state that there is a stopping time $\sigma\le\tau$ such that the set $\{\sigma\ne\tau\}$ is a null set (not necessarily measurable). Is that right?
– Yes, I seem to have used the notation $[\tau_n]$ without explaining what it means, which is a bit sloppy. It is standard notation though — it is a stochastic interval,
$$[\tau_n]=\left\{(t,\omega)\in\mathbb{R}_+\times\Omega\colon t=\tau_n(\omega)\right\}.$$
You can think of $[\tau_n]$ as being a `random set' $\{\tau_n(\omega)\}\subseteq\mathbb{R}_+$, which depends on $\omega$.
– I fixed the last formula. Thanks.
– For your final question, yes. There will always be a stopping time $\sigma\le\tau$ with $\{\sigma\ne\tau\}$ being $\mathbb{P}$-null.
btw if you have two stopping times, is their sum necessarily a stopping time?
Yes. From an intuitive point of view, this seems obvious. Given that times σ, τ are observable when they occur, can you tell when σ + τ occurs? Yes, you clearly can, because σ and τ will both have already been observed by then.
More precisely, given any measurable function $f\colon\mathbb{R}_+\times\mathbb{R}_+\to\mathbb{R}_+$ with $f(s,t)\ge\max(s,t)$, then $f(\sigma,\tau)$ will be a stopping time whenever $\sigma$ and $\tau$ are. In particular this holds for $f(s,t)=s+t$. More generally, you just need to require that $f(s,t)\wedge u=f(s\wedge u,t\wedge u)$ holds.
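To sketch the measurability behind this for $f(s,t)=s+t$ (a standard argument, filling in details not given in the comment above): for stopping times $\sigma,\tau$ and any $t\ge0$,
$$\{\sigma+\tau>t\}=\{\tau>t\}\cup\bigl(\{\tau=t\}\cap\{\sigma>0\}\bigr)\cup\bigl(\{\tau=0\}\cap\{\sigma>t\}\bigr)\cup\bigcup_{q\in\mathbb{Q},\,0<q<t}\{\tau>q\}\cap\{\sigma>t-q\},$$
where the rational $q$ picks out a value strictly between $t-\sigma$ and $\tau$ when $0<\tau<t$ and $\sigma+\tau>t$. Every set on the right hand side is in $\mathcal{F}_t$ (for example, $\{\tau>q\}\in\mathcal{F}_q$ and $\{\sigma>t-q\}\in\mathcal{F}_{t-q}$), so $\{\sigma+\tau\le t\}\in\mathcal{F}_t$ and $\sigma+\tau$ is a stopping time.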
Dear Almost Sure,
I thank you for your blog; it is very interesting and I’ve found a lot of explanations here. I’m reading the book “Stochastic Calculus and Financial Applications” by Michael Steele and I have a doubt about the theorem “Doob’s Continuous-Time Stopping Theorem” on page 51. This is the statement: “Suppose $\{M_t\}$ is a continuous martingale w.r.t. a filtration $\{\mathcal{F}_t\}$ that satisfies the usual conditions. If $\tau$ is a stopping time for $\{\mathcal{F}_t\}$, the process $X_t=M_{\tau\wedge t}$ is also a continuous martingale w.r.t. $\{\mathcal{F}_t\}$.”
My doubt is about the condition that the filtration needs to satisfy the usual conditions. Your Lemma 3 is a more general version of the theorem and you haven’t assumed that the usual conditions hold. I don’t have very strong theoretical skills, so I’m worried that there is something I don’t understand. In the proof of the theorem I am not able to find the place where the assumption is necessary. If you can give your opinion, I’ll appreciate it a lot. Thanks a lot.
Sorry, I read badly and your Lemma 3 is not a generalization of the “Continuous-Time Stopping Theorem”. The lemma does not assert the martingale property of the stopped process.
No, my Lemma 3 here does not involve martingales. However, I do prove the stopping theorem in a later post, and usual conditions are not required at all.
Dear George, I am from an engineering background. I see that when considering stopping times for continuous stochastic processes, you take suprema only over rational numbers. I also see this in many texts on stopping time proofs. I am quite confused about considering only rational numbers. Can you kindly clarify, with some references if possible?
It is because in probability and measure theory, you can only work easily with operations on countably many sets (or events) at once. This is because of countable additivity of measure. Unions of uncountably many sets can give non-measurable sets. As the set of reals is uncountable, you need to restrict to a countable subset in many probabilistic arguments.
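As a typical instance of this trick (an illustration, not a quote from the post): if $X$ is right-continuous and $t$, $K$ are fixed, then for any countable dense subset $D$ of $[0,t]$ with $t\in D$,
$$\Bigl\{\sup_{s\le t}X_s>K\Bigr\}=\bigcup_{s\in D}\{X_s>K\},$$
a countable union of $\mathcal{F}_t$-measurable sets; any value of $X$ exceeding $K$ at some time $s\le t$ is seen, by right-continuity, at points of $D$ just to the right of $s$.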
In the proof of lemma 5, you define a stopping time. But how do you know that what you define actually is a stopping time?
You are right to ask – I did skip over the proof of that rather quickly. We need to show that $\{\tau_{K,\epsilon}\le t\}$ is $\mathcal{F}_t$-measurable. One way is to set $U_n$ as in the proof above. Then, $\tau_{K,\epsilon}\le t$ iff $U_n\ge\epsilon$ for every $n$, which is an $\mathcal{F}_t$-measurable condition.
Just one question regarding this, how are you allowed to use max, instead of sup? How do we know that the max exists?
Actually, I should have written sup rather than max. I'll update the post. The argument should be unchanged though. In fact, the maximum of the jump sizes $|\Delta X|$ does exist over any bounded interval, but that is not important to the result.
I updated the proof to include this argument.
Thank you very much. I have been looking for arguments that jump times really are stopping times, and that functions of the jumps really are measurable, etc. Do you know if any book has these arguments? Or did you come up with them yourself? Anyway, great blog, really good work!
It is quite standard, although the precise arguments in these notes are my own. I can check my references for published statements of such facts.
I’m struggling to understand the proof of Lemma 5.1 regarding $\{\sigma\vee\tau\le t\}$ and $\{\sigma\wedge\tau\le t\}$. Shouldn't the set operations after the respective equal signs be reversed?
I think they are correct. As $\sigma\vee\tau$ is the maximum of $\sigma$ and $\tau$, it is bounded above by $t$ if and only if both $\sigma$ and $\tau$ are.
You are right. This notation (for minimum and maximum) was new to me, but getting to know it makes it clear that your statements are true. Thank you for your answer (and your incredible blog!).
Hi! I think the last part of the first line after Definition 1 should read as: “$1_{\{\tau\le t\}}$ is adapted” instead of “$1_{[0,\tau]}$ is adapted”, right? That is, a process indexed by $t$ should be adapted, rather than an expression which is time independent.
Actually, there was a typo which I have fixed now. Maybe you misunderstood the notation: $1_{[\tau,\infty)}$ is equivalent to the process whose value at time $t$ is $1_{\{\tau\le t\}}$.
Thanks for answering. The typo, $[0,\tau]$ instead of $[\tau,\infty)$, made me think that that particular notation was not being used! All clear then.