As was mentioned in the initial post of these stochastic calculus notes, it is important to choose good versions of stochastic processes. In some cases, such as with Brownian motion, it is possible to explicitly construct the process to be continuous. However, in many more cases, it is necessary to appeal to more general results to assure the existence of such modifications.
The theorem below guarantees that many of the processes studied in stochastic calculus have a right-continuous version and, furthermore, these versions necessarily have left limits everywhere. Such processes are known as càdlàg from the French for “continu à droite, limites à gauche” (I often drop the accents, as seems common). Alternative terms used to refer to a cadlag process are rcll (right-continuous with left limits), R-process and right process. For a cadlag process $X$, the left limit at any time $t>0$ is denoted by $X_{t-}=\lim_{s\uparrow t}X_s$ (and $X_{0-}=X_0$). The jump at time $t$ is denoted by $\Delta X_t=X_t-X_{t-}$.
We work with respect to a complete filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in\mathbb{R}_+},\mathbb{P})$.
Theorem 1 below provides us with cadlag versions under the condition that elementary integrals of the processes cannot, in a sense, get too large. Recall that elementary predictable processes are of the form

$\displaystyle \xi_t=Z_0 1_{\{t=0\}}+\sum_{k=1}^n Z_k 1_{\{s_k<t\le t_k\}}$

for times $s_k\le t_k$, an $\mathcal{F}_0$-measurable random variable $Z_0$ and $\mathcal{F}_{s_k}$-measurable random variables $Z_k$. Its integral with respect to a stochastic process $X$ is

$\displaystyle \int\xi\,dX=\sum_{k=1}^n Z_k\left(X_{t_k}-X_{s_k}\right).$
An elementary predictable set is a subset of $\mathbb{R}_+\times\Omega$ which is a finite union of sets of the form $\{0\}\times F$ for $F\in\mathcal{F}_0$ and $(s,t]\times F$ for $F\in\mathcal{F}_s$ and nonnegative reals $s<t$. Then, a process is an indicator function $1_D$ of some elementary predictable set $D$ if and only if it is elementary predictable and takes values in $\{0,1\}$.
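As a concrete illustration (the function and variable names here are my own, not from the post), the elementary integral can be computed path by path: it is just a finite sum of weighted increments of the integrator.

```python
# A minimal numerical sketch of the elementary integral: for an elementary
# predictable process given by intervals (s_k, t_k] and weights Z_k, the
# integral against one sample path X is sum_k Z_k * (X(t_k) - X(s_k)).

def elementary_integral(path, terms):
    """path: function t -> X_t for a single sample path.
    terms: list of (s_k, t_k, z_k) with s_k <= t_k; z_k plays the role of
    the F_{s_k}-measurable random variable evaluated on this sample path."""
    return sum(z * (path(t) - path(s)) for (s, t, z) in terms)

# Example along the deterministic path X_t = t^2.
path = lambda t: t * t
terms = [(0.0, 1.0, 1.0), (1.0, 2.0, -2.0)]  # xi = 1 on (0,1], -2 on (1,2]
print(elementary_integral(path, terms))  # 1*(1-0) + (-2)*(4-1) = -5.0
```

Note that the $\{t=0\}$ term of an elementary process contributes nothing to the integral, which is why only the interval terms appear here.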
The following theorem guarantees the existence of cadlag versions for many types of processes. The first statement applies in particular to martingales, submartingales and supermartingales, whereas the second statement is important for the study of general semimartingales.
Theorem 1 Let X be an adapted stochastic process which is right-continuous in probability and such that either of the following conditions holds. Then, it has a cadlag version.
- X is integrable and, for every $t\in\mathbb{R}_+$, the set $\left\{\mathbb{E}\left[\int_0^t 1_D\,dX\right]\colon D\textrm{ is an elementary predictable set}\right\}$ is bounded.
- For every $t\in\mathbb{R}_+$, the set $\left\{\int_0^t 1_D\,dX\colon D\textrm{ is an elementary predictable set}\right\}$ is bounded in probability.
The existence of cadlag versions stated by this theorem is a special case of the following slightly more general result, which drops the requirement for the process to be right-continuous in probability. In the following, the modification Y has a right limit at every point. If the process is right-continuous in probability, then $Y_t=X_t$ at each time, with probability one. By countable additivity, this remains true simultaneously at all times in any given countable set and therefore Y is a cadlag version.
Theorem 2 Let X be an adapted stochastic process, and suppose that either of the two conditions of Theorem 1 holds. Then, it has a version Y which has left and right limits everywhere and such that there is a countable subset $S\subseteq\mathbb{R}_+$ for which Y is right-continuous at every $t\not\in S$.
A proof of this theorem is given below, using the ideas on upcrossings of a process as discussed in the previous post.
Cadlag Martingales
The main result for existence of cadlag martingales is as follows.
Theorem 3 Let X be a martingale, submartingale or supermartingale which is right-continuous in probability. Then, it has a cadlag version.
Proof: By applying the statement to $-X$, it suffices to prove the result for submartingales. However, in this case, for any elementary predictable set $D$ and $t\in\mathbb{R}_+$,

$\displaystyle 0\le\mathbb{E}\left[\int_0^t 1_D\,dX\right]\le\mathbb{E}[X_t]-\mathbb{E}[X_0]. \qquad(1)$

The first condition of Theorem 1 is satisfied, showing that cadlag versions exist. ⬜
Even if the condition of right-continuity in probability is dropped, it is still possible to pass to well-behaved modifications. Although a martingale can then fail to have a cadlag version, it is always the case that there exists a version which is cadlag everywhere outside of a fixed countable set of times.
Theorem 4 Let X be a martingale, submartingale or supermartingale. Then, it has a version Y which has left and right limits everywhere and such that there is a countable subset $S\subseteq\mathbb{R}_+$ for which Y is right-continuous at every $t\not\in S$.
Proof: By applying the statement to $-X$, we may suppose that the process is a submartingale. Then, inequality (1) applies, so the first condition of Theorem 1 holds. Theorem 2 gives the modification Y. ⬜
Often, the underlying filtrations used are assumed to satisfy the usual conditions. That is, they are required to be right-continuous as well as being complete. In this case the existence of cadlag versions is particularly general. Every martingale has a cadlag version.
Theorem 5 Suppose that the filtration is right-continuous. Then, every martingale has a cadlag version.
More generally, a submartingale or supermartingale X has a cadlag version if and only if $t\mapsto\mathbb{E}[X_t]$ is right-continuous.
Proof: By applying the result to $-X$, we may suppose that the process is a submartingale. Furthermore, for martingales the function $t\mapsto\mathbb{E}[X_t]$, being constant, is trivially right-continuous. So, it suffices to prove the second more general statement. Theorem 4 gives a version Y which has left and right limits everywhere and is cadlag outside of some countable set $S$. It only remains to be shown that $Y_{t+}=X_t$, almost surely, for each $t\in S$.
Choose a sequence $t_n\not\in S$ strictly decreasing to $t$. Then $X_t\le\mathbb{E}[X_{t_n}\,\vert\,\mathcal{F}_t]$, by the submartingale property. The idea is to commute the limit as $n\rightarrow\infty$ with the conditional expectation to obtain

$\displaystyle X_t\le\lim_{n\rightarrow\infty}\mathbb{E}[X_{t_n}\,\vert\,\mathcal{F}_t]=\mathbb{E}[Y_{t+}\,\vert\,\mathcal{F}_t]. \qquad(2)$

This can be done under the condition that the sequence $(X_{t_n})_{n\in\mathbb{N}}$ is uniformly integrable. For a martingale, these are all conditional expectations $X_{t_n}=\mathbb{E}[X_{t_1}\,\vert\,\mathcal{F}_{t_n}]$, so uniform integrability is guaranteed. In fact, Lemma 6 states that this sequence is uniformly integrable whenever $X$ is a submartingale, so inequality (2) holds.
Similarly, using uniform integrability together with the right-continuity of $t\mapsto\mathbb{E}[X_t]$ gives

$\displaystyle \mathbb{E}[Y_{t+}]=\lim_{n\rightarrow\infty}\mathbb{E}[X_{t_n}]=\mathbb{E}[X_t],$

so that $\mathbb{E}[Y_{t+}\,\vert\,\mathcal{F}_t]-X_t$ is a nonnegative random variable with zero expectation. This shows that $\mathbb{E}[Y_{t+}\,\vert\,\mathcal{F}_t]=X_t$ almost surely. Finally, the right-continuity of the filtration, $\mathcal{F}_{t+}=\mathcal{F}_t$, is applied. As $Y_{t+}$ is necessarily $\mathcal{F}_{t+}$-measurable, $Y_{t+}=\mathbb{E}[Y_{t+}\,\vert\,\mathcal{F}_t]=X_t$ almost surely. ⬜
The proof above made use of the following simple, but useful, statement regarding uniform integrability of submartingales.
Lemma 6 Let X be a submartingale with respect to a filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in\mathbb{T}},\mathbb{P})$, with time index set $\mathbb{T}\subseteq\mathbb{R}$.
Then, $(X_{t_n})_{n\in\mathbb{N}}$ is uniformly integrable for any decreasing sequence of times $t_n$ bounded below in $\mathbb{T}$.
Proof: The idea is to apply a `Doob-style decomposition' of the submartingale into a martingale and an increasing process at the sequence of times. Let $t_\infty\in\mathbb{T}$ be a lower bound for the $t_n$ and set

$\displaystyle A_n=\sum_{k=n}^\infty\mathbb{E}\left[X_{t_k}-X_{t_{k+1}}\,\middle\vert\,\mathcal{F}_{t_{k+1}}\right]$

which, by the submartingale property, is a sum of nonnegative terms. Furthermore, by monotone convergence

$\displaystyle \mathbb{E}[A_n]=\sum_{k=n}^\infty\left(\mathbb{E}[X_{t_k}]-\mathbb{E}[X_{t_{k+1}}]\right)\le\mathbb{E}[X_{t_n}]-\mathbb{E}[X_{t_\infty}]<\infty,$

so $A_n$ are integrable random variables, and $A_n$ is decreasing in $n$. Furthermore, from the definition, the martingale property

$\displaystyle \mathbb{E}\left[X_{t_n}-A_n\,\middle\vert\,\mathcal{F}_{t_{n+1}}\right]=X_{t_{n+1}}-A_{n+1}$

is satisfied, so $M_n\equiv X_{t_n}-A_n=\mathbb{E}[X_{t_1}-A_1\,\vert\,\mathcal{F}_{t_n}]$ is a uniformly integrable sequence. Finally, $0\le A_n\le A_1$ are uniformly integrable, and the result follows for $X_{t_n}=M_n+A_n$. ⬜
Proof of Cadlag Versions
I now give a proof of Theorem 2. This will make use of the ideas from the previous post on upcrossings and downcrossings to show that, when restricted to a countable set of times, $X$ almost surely has left and right limits everywhere. The result will follow from this.
For any $t>0$, let $A$ be a finite subset of $[0,t]$. The number of upcrossings $U_A[a,b]$ of an interval $[a,b]$ for times in $A$ satisfies the bound

$\displaystyle (b-a)U_A[a,b]\le\int_0^t 1_D\,dX+(X_t-a)^- \qquad(3)$

for some elementary set $D$. Furthermore, letting $\tau$ be the first time in $A$ at which $X_\tau\ge K$, the stochastic interval $(0,\tau]$ is elementary and,

$\displaystyle K1_{\{\max_{s\in A}X_s\ge K\}}\le X_0+\int_0^t 1_{(0,\tau]}\,dX+\vert X_t\vert.$

Applying the same idea to $-X$ shows that there are elementary predictable sets $D_1,D_2$ such that

$\displaystyle K1_{\{\max_{s\in A}\vert X_s\vert\ge K\}}\le 2\vert X_0\vert+\int_0^t 1_{D_1}\,dX-\int_0^t 1_{D_2}\,dX+2\vert X_t\vert. \qquad(4)$
Now suppose that the first condition of the theorem holds, so that $\mathbb{E}[\int_0^t 1_D\,dX]$ is bounded by some positive constant $c$ for all elementary sets $D$. Taking expectations of inequalities (3) and (4) gives

$\displaystyle (b-a)\mathbb{E}\left[U_A[a,b]\right]\le c+\mathbb{E}\left[(X_t-a)^-\right],\qquad K\mathbb{P}\left(\max_{s\in A}\vert X_s\vert\ge K\right)\le 2\mathbb{E}\vert X_0\vert+2c+2\mathbb{E}\vert X_t\vert.$

Letting $A$ increase to a countably infinite subset of $[0,t]$ and applying monotone convergence, these inequalities generalize to all countable subsets $A\subseteq[0,t]$. In particular, the number of upcrossings of $[a,b]$ and the supremum of $\vert X\vert$ are almost surely finite on any countable subset of $[0,t]$.
Alternatively, suppose that the second condition of the theorem holds. Then, there exists a function $f\colon\mathbb{R}_+\rightarrow\mathbb{R}_+$ with $f(K)\rightarrow 0$ as $K\rightarrow\infty$ and $\mathbb{P}(\vert\int_0^t 1_D\,dX\vert\ge K)\le f(K)$ for all $K>0$ and elementary predictable sets $D$. Inequalities (3) and (4) give

$\displaystyle \mathbb{P}\left((b-a)U_A[a,b]\ge 2K\right)\le f(K)+\mathbb{P}\left((X_t-a)^-\ge K\right),$
$\displaystyle \mathbb{P}\left(\max_{s\in A}\vert X_s\vert\ge K\right)\le\mathbb{P}\left(\vert X_0\vert\ge K/8\right)+2f(K/4)+\mathbb{P}\left(\vert X_t\vert\ge K/8\right).$

Again, letting $A$ increase to a countably infinite subset of $[0,t]$ and applying monotone convergence extends these inequalities to all countable subsets $A\subseteq[0,t]$. Letting $K$ go to infinity then shows that the number of upcrossings of $[a,b]$ and the supremum of $\vert X\vert$ are almost surely finite on any countable subset of $[0,t]$.
In either case, applying countable additivity, the above shows that, for time restricted to any given countable subset $S\subseteq\mathbb{R}_+$, the process $X$ is, with probability one, bounded on bounded time intervals and has finitely many upcrossings of $[a,b]$ for all rational $a<b$ on bounded time intervals. Replacing $X$ by the identically zero process outside this set of probability one, it can be assumed that this holds everywhere.
By the results of the previous post, this implies that, for time restricted to such a countable set $S$, $X$ almost surely has left and right limits everywhere. Assuming that $S$ is also dense in $\mathbb{R}_+$ (e.g., $S\supseteq\mathbb{Q}_+$), this defines a cadlag process

$\displaystyle \tilde X_t=\lim_{\substack{s\downarrow t\\ s\in S}}X_s \qquad(5)$

for all $t\in\mathbb{R}_+$, where $s$ is restricted to times in $S$ strictly greater than $t$ in this limit.
Note that enlarging $S$ by any countable set of times will not change the process $\tilde X$ (up to a set of probability zero), by the arbitrariness of the approximating sequences in (5).
It remains to be shown that $\tilde X_t=X_t$ outside of some countable set of times. In fact, for any positive integer $n$, the set of times $t$ in any bounded interval at which $\mathbb{P}(\vert\tilde X_t-X_t\vert>1/n)>1/n$ is finite. If not, there would be an increasing or decreasing sequence $t_k$ of such times and, enlarging $S$ to include this sequence, $\tilde X_{t_k}-X_{t_k}$ would converge to zero, giving the contradiction $\mathbb{P}(\vert\tilde X_{t_k}-X_{t_k}\vert>1/n)\rightarrow 0$ as $k\rightarrow\infty$. Consequently, letting $n$ increase to infinity, there are only countably many times at which $\mathbb{P}(\tilde X_t\not=X_t)>0$. Without loss of generality, we may suppose that $S$ includes all such times.
Finally, the process $Y$ is defined by

$\displaystyle Y_t=\begin{cases} X_t,&\textrm{if }t\in S,\\ \tilde X_t,&\textrm{if }t\not\in S. \end{cases}$

As $X$ with time restricted to the set $S$ has left and right limits everywhere, it follows that $Y$ also has left and right limits everywhere. Furthermore, $Y$ is right-continuous at every time outside of the countable set $S$. ⬜
Dear George,
I have a question concerning Theorem 4 [1]: Given a right-continuous filtration, is any left-continuous martingale in fact continuous, since it has a cadlag version? Is this the reason why one does not usually consider left-continuous martingales?
Thanks for your Stochastic Calculus Notes, I enjoy reading them!
Pyramus
[1] G.L.: this is now Theorem 5, since the previous edit.
Hi.
No, it is not true that a left-continuous martingale has to be continuous. Consider, for example, a compensated Poisson process X. Then, X is just a Poisson process with a constant drift subtracted, so its jumps are the same as for a homogeneous Poisson process (which have zero probability of occurring at any given fixed time). So, X is a cadlag martingale and its left-limit process Y_t = X_{t−} = lim_{s↑↑t} X_s is left-continuous. As P(Y_t = X_t) = 1 for each t, it follows that Y is a left-continuous martingale, but it is not continuous.
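To make the compensated Poisson process concrete, here is a small simulation sketch (the function names, rate and horizon are my own choices, not from the comment): X_t = N_t − λt has jumps, yet mean zero at every fixed time, consistent with the martingale property.

```python
# Simulation sketch: the compensated Poisson process X_t = N_t - rate*t.
# Its sample paths jump by +1 at the Poisson jump times, but E[X_t] = 0.

import random

def poisson_jump_times(rate, horizon, rng):
    """Jump times of a Poisson process on [0, horizon] via exponential gaps."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

def compensated(jumps, rate, t):
    """X_t = N_t - rate*t, where N_t counts the jumps at or before t."""
    return sum(1 for s in jumps if s <= t) - rate * t

rng = random.Random(1)
rate, horizon = 2.0, 5.0
# E[X_5] = 0; the sample average over many independent paths should be small.
avg = sum(compensated(poisson_jump_times(rate, horizon, rng), rate, horizon)
          for _ in range(20000)) / 20000
print(abs(avg) < 0.1)  # prints True
```

The variance of X_5 here is λt = 10, so the standard error of the average over 20000 paths is about 0.022, comfortably below the 0.1 tolerance.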
However, there is some truth in what you suggest. A left-continuous martingale has to be continuous, with probability one, at any given fixed time. It can only jump at times which are continuously distributed. There are also other issues with using left-continuous martingales. Optional sampling would not hold. That is, the martingale property would not extend to bounded stopping times T at which the process can jump, so E[X_T] = E[X_0] can fail. In fact, the only left-continuous martingales for which optional sampling does hold are the continuous ones. Also, the definition of local martingales would not make much sense, since it relies on the fact that the space of cadlag martingales is stable under optional stopping which, as with the optional sampling theorem, doesn’t hold for general left-continuous martingales.
Hope that helps!
George.
Update: I added an extra statement to this post, Theorem 4, which provides sufficiently nice versions of martingales in the case where they are not right-continuous in probability.
Dear George,
I am curious whether completeness of the filtration is required for the existence of measurable cadlag modifications. The proof you present does not seem to require such an assumption, but books such as Karatzas and Shreve include that assumption. There is a paper by Hans Follmer on exit times of supermartingales which doesn’t require completeness, but which has additional requirements on the filtration. Hence I am curious about your view on the matter (whether completeness is required).
Thank you again for these excellent posts.
Note 1: I think you need the filtration to be right-continuous in your proof in order to get an adapted modification.
Note 2: It wasn’t fully clear why the sequence of times you construct in the second to last paragraph would have a subsequence converging from the right. Convergence from the right of t_k to some limit point t seems a key requirement in order to get
(i) \tilde{X}_{t_k} to converge to \tilde{X}_t
and
(ii) X_{t_k} to converge to \tilde{X}_t,
the combination of which leads to the contradiction you mention.
No, you don’t need completeness of the filtration for the existence of a cadlag modification, but it is necessary to assume this (or a similar condition) if you want the modification to be adapted. I did assume completeness throughout this post. Rather than completeness, you only need the weaker condition that F_t contains all sets in F_∞ with zero probability. Right-continuity of the filtration is not required except in special cases, such as Theorem 5 above where it is needed to guarantee that all martingales are right-continuous in probability.
In fact, suppose that X is any adapted process which has a cadlag modification with respect to the completion of the filtration (or, with respect to any enlargement of the filtration). Then we can define

$\tilde X_t = \lim_{s\downarrow\downarrow t,\,s\in\mathbb{Q}} X_s$ whenever this limit exists, and $\tilde X_t = 0$ otherwise.

As X does have a cadlag modification, with respect to some enlargement of the filtration, it must be equal to $\tilde X$ almost surely. So, $\tilde X$ is almost-surely cadlag. Also, as $\tilde X_t = X_t$ almost surely (for each fixed t) and X_t is F_t-measurable, it follows that $\tilde X$ will be an adapted and almost-surely cadlag modification of X so long as F_t contains all sets in F_∞ with zero probability. If we want a modification which is actually cadlag rather than just almost-surely cadlag, you can set $\tilde X$ to be identically 0 on the event where it is not cadlag (which can be seen to be F_∞-measurable by expressing it in terms of upcrossings of X at rational times).
So, this shows that: for a cadlag modification, no assumptions on the filtration are required. For a cadlag adapted modification, requiring F_t to contain all sets in F_∞ of zero probability is enough. For an adapted and almost-surely cadlag modification, requiring F_t to contain all zero probability sets in F_{t+} is sufficient.
For example, let ε1, ε2, … be a sequence of Bernoulli IID random variables, each equal to 1 and -1 with probability 1/2. Let t1, t2, … be a sequence of times strictly increasing to 1. Define,

$X_t = \sum_{n\colon t_n \le t} \epsilon_n/n$

(and set X_t to zero if this sum does not converge). This is right-continuous and has left limits everywhere except, possibly, at t = 1 in the case where $\sum_n \epsilon_n/n$ fails to converge. By martingale convergence of uniformly square integrable martingales, this converges almost surely, so X is almost surely cadlag – but not necessarily cadlag. If $\tilde X$ is a modification of X adapted to the natural filtration of X, then $\tilde X_t = X_t$ for each t < 1 (and not just almost surely, because F_t is finite with no nonempty zero probability sets). Then, $\tilde X$ fails to have a left limit at t = 1 whenever X does. In fact, the natural filtration here is already right-continuous, showing that even in the right-continuous case we need to enlarge the filtration by adding zero probability sets to F_t to get a cadlag modification.
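Here is a quick simulation sketch of this example (the weights 1/n on the increments are one square-summable choice, and the names are my own): the partial sums settle down, illustrating the almost sure convergence of the series as t approaches 1.

```python
# Simulation sketch: partial sums S_k = sum_{n<=k} eps_n / n for IID signs
# eps_n = +-1. Since sum 1/n^2 < infinity, the series converges almost
# surely, so a typical sample path has a limit (left limit at t = 1), even
# though some exceptional sign sequences fail to converge.

import random

def sample_path(num_terms, rng):
    """Partial sums of eps_n / n for one sample of the signs eps_n."""
    sums, s = [], 0.0
    for n in range(1, num_terms + 1):
        s += rng.choice([-1.0, 1.0]) / n
        sums.append(s)
    return sums

rng = random.Random(0)
sums = sample_path(100000, rng)
# The tail oscillation is tiny: the limit exists for this sample path.
tail = sums[-1000:]
print(max(tail) - min(tail) < 0.1)  # prints True
```

The tail increments are of size at most 1/99000, so the oscillation over the last 1000 terms is orders of magnitude below the 0.1 tolerance.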
For another example, let Y be a Poisson process and X_t = Y_{t−} be its left-limit process. It can be seen that Y is a cadlag modification of X but is not adapted with respect to the natural filtration of X. To get an adapted modification which is almost surely cadlag, you need to enlarge the filtration by adding the zero probability sets of F_{t+} to F_t (for almost all times t).
In answer to,
Note 1: No, right-continuity is not required except where explicitly stated (e.g., Theorem 5). Hopefully my comment above helps explain this – otherwise could you state where you think right continuity is required?
Note 2: Any uncountable set of times will contain a strictly decreasing subsequence. This is not required though, as the argument will also work for increasing sequences. Letting t be the limit of the sequence t_k, we are using the fact that X has left and right limits at t along times in S, so both $X_{t_k}$ and $\tilde X_{t_k}$ converge to the left (resp. right) limit if the sequence is increasing (resp. decreasing).
Thanks for this detailed answer!
In note 1, I meant you need right continuity for $\tilde X$ to be adapted.
In note 2, I still think you need a decreasing sequence (even if left limits exist), since you define $\tilde X_t$ as a limit from the right and since you are comparing the value of X_t with $\tilde X_t$ to get the contradiction. But maybe I am missing a detail and haven’t thought it through carefully enough yet 🙂
In note 1: we have $\tilde X_t = X_t$ almost surely. Combining this with completeness of the filtration, the fact that X is adapted implies that $\tilde X$ is adapted. So, right-continuity is not required.
In note 2: I think the bit that is concerning you is where I state that “$\tilde X_{t_k} - X_{t_k}$ would converge to zero”. This is true, even for increasing sequences. Fix a sample path of X which has left limits along S (which is true for almost all sample paths). Enlarging S to include the times t_k, consider choosing times $s_k \in S$ with $t_k < s_k < t_{k+1}$ for each k. The fact that the interleaved sequence $t_1 < s_1 < t_2 < s_2 < \cdots$ is increasing means that X converges along it. So, $X_{s_k} - X_{t_k}$ converges to 0. However, by choosing $s_k$ very close to $t_k$, we can ensure that $|X_{s_k} - \tilde X_{t_k}|$ is bounded by 1/k and, hence, tends to zero. So, $\tilde X_{t_k} - X_{t_k}$ tends to zero. Maybe that was a bit of a big step to make in the proof…
Dear George,
Thanks a lot for the interesting blog. Concerning Theorems 1 and 2, I was looking for references in the literature, but so far I haven’t found any. Can you help here?
Marcus
Dear George: Motivated by a question on MO (https://mathoverflow.net/questions/298195/cadlag-modifications?noredirect=1#comment742088_298195), I was mulling over the fact that Theorems 3 and 4 do state the existence of a càdlàg version but don’t say anything about the (sub-, super-)martingale property of those modifications. Does it hold true with respect to the original filtration or do we have to “right”-augment these to get it right?
Hi
I have a question, please. If I have a cadlag process which crosses an increasing sequence of intervals, say [0, Sn]: in the paper that I am reading, it says that this cadlag process cannot cross such an interval infinitely many times in finite time; it can only do so over infinite time. I did not understand the statement. Can you explain it please?
Thanks
Hi. I am not exactly sure what the statement is that you are referring to, but it is true that a cadlag process cannot cross an interval [a,b] for a < b infinitely often in finite time. Otherwise, you would be able to find a bounded increasing sequence of times t_n with $X_{t_n} \le a$ for even n and $X_{t_n} \ge b$ for odd n, which would contradict the existence of left limits, as $X_{t_n}$ would not converge.
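The crossing count in this argument is easy to compute for a sampled path. The following sketch (my own notation, not from the comment) counts passages from level a up to level b and illustrates that a convergent sequence, which in particular has a left limit, crosses only finitely often.

```python
# Count upcrossings of [a, b]: the number of times a sequence of sampled
# values moves from <= a to >= b. A path with left and right limits can
# only do this finitely often on a bounded time interval.

def count_upcrossings(values, a, b):
    """Number of passages of the sequence from below a to above b."""
    count, below = 0, False
    for x in values:
        if x <= a:
            below = True
        elif x >= b and below:
            count += 1
            below = False
    return count

# The convergent sequence 1 + (-1)^n / n eventually stays inside (0.9, 1.1),
# so it crosses [0.9, 1.1] only finitely often.
seq = [1 + (-1) ** n / n for n in range(1, 10000)]
print(count_upcrossings(seq, 0.9, 1.1))  # prints 5
```

A sequence alternating between a and b forever would instead give a count growing without bound, which is exactly what the existence of left limits rules out.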
Hi George,
I am not sure if you received a message from me (see below). Based on a review of your responses to other readers, I think my problem can be resolved by simply taking lim sup in place of the usual limit. With the right continuity of the filtration, X_{t+} will be adapted to F_{t+}=F_t because of the lim sup used in the definition. Let me know what you think. The only problem now with this approach is that X_{t+} defined using lim sup can take extended real values infinity or minus infinity (of course, with probability zero).
My older post (not sure if you got it):
Hi George,
I also have two questions on the completeness of F_t (or inclusion of P-null sets of F_\infty) for cadlag modifications.
1. Books like Dellacherie-Meyer (part B, page 67), Karatzas-Shreve (page 16), and Revuz-Yor (page 64) start with a supermartingale (X_t, F_t) and produce another process X_{t+} by regularizing X_t outside a set of measure zero. Without assuming the right continuity of F_t or assuming completeness in any way, they argue that (X_{t+}, F_{t+}) is a supermartingale. In my view, X_{t+} is not adapted to F_{t+} because we need the P-null sets. So, I never understood why such a result is true. In books like Rogers-Williams (page 171) and also your blog, you guys are more careful.
2. In books like Liptser and Shiryaev (page 61), there is a cadlag modification result for martingales of the form X_t = E[X|F_t]. Here F_t is right continuous but not necessarily complete (no P-null sets). Again, it is claimed that this process has a cadlag modification.
Any thoughts on these? Did I miss something here?
Hi Taposh,
This subject has various subtleties that I cannot do justice to in a quick comment. However, in order to construct continuous or cadlag modifications, I think that you do need to include the zero probability sets in each F_t, in order to modify the process on zero probability sets at which it fails to have a left limit. When you take right limits to obtain X_{t+} as an F_{t+}-measurable random variable, then that should work. However, I would expect that the resulting process t -> X_{t+} would only be cadlag/continuous outside of a zero probability set.
Hi!
I have a question. For example, Compensated Poisson process is already a Cadlag process, then why do we care about the existence of its Cadlag modifications? Or is it that a Cadlag modification of the original process on a different topological space may have benefits? Can you please explain?
I think there are two things to say here. (1) In many situations we should choose the cadlag version of our process (e.g., for stochastic integrals and for optional stopping and sampling). (2) For many processes, including (sub/super)martingales, cadlag versions do indeed exist.
The second of these is a rather strong and very useful mathematical result. In some cases, such as compensated Poisson processes, you already know that the process is cadlag, so the result is not so helpful. However, the fact still remains that this cadlag version should be used in many applications.
Hi,
I have a question related to the regularity of processes.
Considering a process X such that, for all t, the limits $X_{t+} = \lim_{s\downarrow\downarrow t,\,s\in\mathbb{Q}} X_s$ and $X_{t-} = \lim_{s\uparrow\uparrow t,\,s\in\mathbb{Q}} X_s$ exist (everywhere, but not necessarily equal): for every t, setting $Y_t = X_{t+}$, can we claim that Y is right-continuous? Why?
Yes! Assuming you mean that these properties hold everywhere (rather than just almost surely at each point). For any sequence u_n decreasing to t, you can choose decreasing r_n in Q with $|X_{r_n} - Y_{u_n}|$ as small as you like, and the convergence of X along the rationals then implies the corresponding convergence for Y.
Hi,
The properties hold everywhere.
In this case, is Y right-continuous on $\mathbb{R}$ or only on $\mathbb{Q}$?
Thank you.
Also, if on $\mathbb{R}$, why is that true?
Yes, it holds on R. If t_n is a sequence of times decreasing to t then, by definition, Y(t_n) is equal to the limit of X(s) over s in Q, as s decreases to t_n. Hence, we can choose rational s_n > t_n as close as we like such that |X(s_n) − Y(t_n)| < 1/n. By definition, X(s_n) tends to Y(t), and this implies that Y(t_n) tends to Y(t).
In Theorem 1, what does it mean for an adapted process $X$ to be right continuous in probability? Is it that for every $t$, $P(X_t = X_{t+}) = 1$?
It means that, for any sequence t_n decreasing to t, then X_{t_n} -> X_t in probability. This is equivalent to the map t -> X_t being continuous using convergence in probability for the topology on random variables.
Or, if X_t already has right limits X_{t+} (under convergence in probability, or a stronger topology), it is equivalent to P(X_t = X_{t+}) = 1.
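As a numeric illustration of this notion (my own example, not from the comment above): a Poisson process has jumping sample paths yet is continuous in probability, since P(N_{t+h} ≠ N_t) = 1 − e^{−λh}, which tends to zero as h decreases to 0.

```python
# P(N_{t+h} != N_t) for a Poisson process of rate 2: the probability of
# at least one jump in an interval of length h is 1 - exp(-rate*h),
# which vanishes as h -> 0 even though every sample path has jumps.

import math

rate = 2.0
for h in [1.0, 0.1, 0.01, 0.001]:
    print(round(1 - math.exp(-rate * h), 6))
# prints 0.864665, 0.181269, 0.019801, 0.001998
```

So each fixed time is almost surely not a jump time, which is exactly why the map t -> N_t is continuous under the topology of convergence in probability.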