Cadlag Modifications

As was mentioned in the initial post of these stochastic calculus notes, it is important to choose good versions of stochastic processes. In some cases, such as with Brownian motion, it is possible to explicitly construct the process to be continuous. However, in many more cases, it is necessary to appeal to more general results to ensure the existence of such modifications.

The theorem below guarantees that many of the processes studied in stochastic calculus have a right-continuous version and, furthermore, these versions necessarily have left limits everywhere. Such processes are known as càdlàg from the French for “continu à droite, limites à gauche” (I often drop the accents, as seems common). Alternative terms used to refer to a cadlag process are rcll (right-continuous with left limits), R-process and right process. For a cadlag process {X}, the left limit at any time {t>0} is denoted by {X_{t-}} (and {X_{0-}\equiv X_0}). The jump at time {t} is denoted by {\Delta X_t=X_t-X_{t-}}.
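Although these notes are purely mathematical, a quick computational illustration of the definitions may help. A pure-jump cadlag path is determined by its jump times and sizes, and {X_t}, {X_{t-}} and {\Delta X_t} can be read off directly. The following Python sketch is a toy construction of my own (the function names are hypothetical, not from any library), for a path starting at zero with no jump at time 0:

```python
import bisect

def make_cadlag_path(jump_times, jump_sizes):
    """Pure-jump cadlag path: X_t is the sum of jump_sizes[k] over jump_times[k] <= t."""
    values = []          # value of X at (and just after) each jump time
    total = 0.0
    for size in jump_sizes:
        total += size
        values.append(total)

    def X(t):
        # right-continuity: jumps at times <= t are included
        k = bisect.bisect_right(jump_times, t)
        return values[k - 1] if k > 0 else 0.0

    def X_minus(t):
        # left limit X_{t-}: only jumps at times strictly before t count
        k = bisect.bisect_left(jump_times, t)
        return values[k - 1] if k > 0 else 0.0

    def jump(t):
        # Delta X_t = X_t - X_{t-}
        return X(t) - X_minus(t)

    return X, X_minus, jump

X, X_minus, jump = make_cadlag_path([1.0, 2.5], [1.0, -2.0])
print(X(1.0), X_minus(1.0), jump(1.0))   # 1.0 0.0 1.0
print(X(2.5), X_minus(2.5), jump(2.5))   # -1.0 1.0 -2.0
```

Note that `bisect_right` versus `bisect_left` is exactly the distinction between evaluating the path at {t} (right-continuously) and taking its left limit.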

We work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}.

Theorem 1 below provides us with cadlag versions under the condition that elementary integrals of the processes cannot, in a sense, get too large. Recall that elementary predictable processes are of the form

\displaystyle  \xi=Z_01_{\{t=0\}}+\sum_{k=1}^nZ_k1_{\{s_k<t\le t_k\}}

for times {s_k<t_k}, {\mathcal{F}_0}-measurable random variable {Z_0} and {\mathcal{F}_{s_k}}-measurable random variables {Z_k}. Its integral with respect to a stochastic process {X} is

\displaystyle  \int_0^t \xi\,dX=\sum_{k=1}^nZ_k(X_{t_k\wedge t}-X_{s_{k}\wedge t}).

An elementary predictable set is a subset of {{\mathbb R}_+\times\Omega} which is a finite union of sets of the form {\{0\}\times F} for {F\in\mathcal{F}_0} and {(s,t]\times F} for nonnegative reals {s<t} and {F\in\mathcal{F}_s}. Then, a process is an indicator function {1_A} of some elementary predictable set {A} if and only if it is elementary predictable and takes values in {\{0,1\}}.
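For a single sample path, the elementary integral is just the finite sum displayed above. As a sanity check, here is a small Python sketch of that sum (the function name and the deterministic path {X_t=t^2} are my own choices for illustration):

```python
def elementary_integral(X, terms, t):
    """Compute int_0^t xi dX = sum_k Z_k (X_{t_k ^ t} - X_{s_k ^ t})
    for an elementary integrand given by triples (Z_k, s_k, t_k),
    evaluated along a single sample path X (a function of time).

    The Z_0 1_{t=0} term of the integrand contributes nothing to the integral."""
    return sum(Z * (X(min(tk, t)) - X(min(sk, t))) for Z, sk, tk in terms)

X = lambda t: t ** 2
# integrand xi = 2 on (0, 1] + 3 on (1, 4]
terms = [(2.0, 0.0, 1.0), (3.0, 1.0, 4.0)]
print(elementary_integral(X, terms, 2.0))  # 2*(1-0) + 3*(4-1) = 11.0
```

The `min(tk, t)` truncations implement the {t_k\wedge t} and {s_k\wedge t} in the defining formula, so intervals lying beyond {t} contribute nothing.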

The following theorem guarantees the existence of cadlag versions for many types of processes. The first statement applies in particular to martingales, submartingales and supermartingales, whereas the second statement is important for the study of general semimartingales.

Theorem 1 Let X be an adapted stochastic process which is right-continuous in probability and such that either of the following conditions holds. Then, it has a cadlag version.

  • X is integrable and, for every {t\in{\mathbb R}_+},

    \displaystyle  \left\{{\mathbb E}\left[\int_0^t1_A\,dX\right]\colon A\textrm{ is elementary}\right\}

    is bounded.

  • For every {t\in{\mathbb R}_+} the set

    \displaystyle  \left\{\int_0^t1_A\,dX\colon A\textrm{ is elementary}\right\}

    is bounded in probability.

The existence of cadlag versions stated by this theorem is a special case of the following slightly more general result, which drops the requirement that the process be right-continuous in probability. In the following, the modification {Y} has a right limit {Y_{t+}} at every point. If the process is right-continuous in probability then, at each time, {Y_t=Y_{t+}} with probability one. By countable additivity, this remains true simultaneously at all times in any given countable set {S} and therefore {Y} is a cadlag version.

Theorem 2 Let X be an adapted stochastic process, and suppose that either of the two conditions of Theorem 1 holds. Then, it has a version Y which has left and right limits everywhere and such that there is a countable subset {S\subset{\mathbb R}_+} for which {Y_t} is right-continuous at every {t\not\in S}.

A proof of this theorem is given below, using the ideas on upcrossings of a process as discussed in the previous post.


Cadlag Martingales

The main result for existence of cadlag martingales is as follows.

Theorem 3 Let X be a martingale, submartingale or supermartingale which is right-continuous in probability. Then, it has a cadlag version.

Proof: By applying the statement to {-X}, it suffices to prove the result for submartingales. However, in this case, for any elementary predictable set {A},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle0\le{\mathbb E}\left[\int_0^t1_A\,dX\right]&\displaystyle={\mathbb E}\left[X_t-X_0\right]-{\mathbb E}\left[\int_0^t1_{A^c}\,dX\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[X_t-X_0\right]. \end{array} (1)

The first condition of Theorem 1 is satisfied, showing that cadlag versions exist. ⬜

Even if the condition of right-continuity in probability is dropped, it is still possible to pass to well-behaved modifications. Although a martingale can fail to have a cadlag version, there always exists a version which is cadlag everywhere outside of a fixed countable set of times.

Theorem 4 Let X be a martingale, submartingale or supermartingale. Then, it has a version Y which has left and right limits everywhere and such that there is a countable subset {S\subset{\mathbb R}_+} for which {Y_t} is right-continuous at every {t\not\in S}.

Proof: By applying the statement to {-X}, we may suppose that the process is a submartingale. Then, inequality (1) applies, so the first statement of Theorem 1 holds. Theorem 2 gives the modification Y. ⬜

Often, the underlying filtrations used are assumed to satisfy the usual conditions. That is, they are required to be right-continuous as well as being complete. In this case the existence of cadlag versions is particularly general. Every martingale has a cadlag version.

Theorem 5 Suppose that the filtration is right-continuous. Then, every martingale has a cadlag version.

More generally, a submartingale or supermartingale X has a cadlag version if and only if {t\mapsto{\mathbb E}[X_t]} is right-continuous.

Proof: By applying the result to {-X}, we may suppose that the process is a submartingale. Furthermore, for martingales the function {t\mapsto{\mathbb E}[X_t]}, being constant, is trivially right-continuous. So, it suffices to prove the second more general statement. Theorem 4 gives a version Y which has left and right limits everywhere and is cadlag outside of some countable set {S\subset{\mathbb R}_+}. It only remains to be shown that {Y_{t+}=Y_{t}}, almost surely, for each {t\in S}.

Choose a sequence {t_n} strictly decreasing to {t}. Then {Y_t\le{\mathbb E}[Y_{t_n}\vert\mathcal{F}_t]}, by the submartingale property. The idea is to commute the limit as {n\rightarrow\infty} with the conditional expectation to obtain

\displaystyle  Y_t\le{\mathbb E}[Y_{t+}\vert\mathcal{F}_t]. (2)

This can be done under the condition that the sequence {X_{t_n}} is uniformly integrable. For a martingale, these are all conditional expectations {X_{t_n}={\mathbb E}[X_{t_1}\vert\mathcal{F}_{t_n}]}, so uniform integrability is guaranteed. In fact, Lemma 6 states that this sequence is uniformly integrable whenever {X} is a submartingale, so inequality (2) holds.

Similarly, using uniform integrability together with the right-continuity of {t\mapsto{\mathbb E}[X_t]} gives

\displaystyle  {\mathbb E}[Y_t]=\lim_n{\mathbb E}[Y_{t_n}]={\mathbb E}\left[\lim_n Y_{t_n}\right]={\mathbb E}[Y_{t+}]

so that {{\mathbb E}[Y_{t+}\vert\mathcal{F}_t]-Y_t} is a nonnegative random variable with zero expectation. This shows that {Y_t={\mathbb E}[Y_{t+}\vert\mathcal{F}_t]} almost surely. Finally, the right-continuity of the filtration, {\mathcal{F}_{t+}=\mathcal{F}_t}, is applied. As {Y_{t+}} is necessarily {\mathcal{F}_t}-measurable, {Y_{t+}={\mathbb E}[Y_{t+}\vert\mathcal{F}_t]=Y_t}. ⬜

The proof above made use of the following simple, but useful, statement regarding uniform integrability of submartingales.

Lemma 6 Let {\{X_t\}_{t\in\mathbb{T}}} be a submartingale with respect to a filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in\mathbb{T}},{\mathbb P})}. Then, {X_{t_n}} is uniformly integrable for any decreasing sequence {t_n} bounded below in {\mathbb{T}}.

Proof: The idea is to apply a 'Doob-style decomposition' of the submartingale into a martingale and an increasing process at the sequence of times. Let {t_\infty\in\mathbb{T}} be a lower bound for {t_n} and set

\displaystyle  A_n=\sum_{k=n}^\infty {\mathbb E}[X_{t_k}-X_{t_{k+1}}\vert\mathcal{F}_{t_{k+1}}]

which, by the submartingale property, is a sum of nonnegative terms. Furthermore, by monotone convergence

\displaystyle  {\mathbb E}[A_n]=\lim_{m\rightarrow\infty}{\mathbb E}[X_{t_n}-X_{t_m}]\le{\mathbb E}[X_{t_1}-X_{t_\infty}]<\infty,

so each {A_n} is an integrable random variable, and the sequence is decreasing in {n}. Furthermore, from the definition, the martingale property

\displaystyle  X_{t_n}-A_n = {\mathbb E}\left[X_{t_1}-A_1\vert\mathcal{F}_{t_n}\right].

is satisfied, so {X_{t_n}-A_n} is a uniformly integrable sequence. Finally, {0\le A_n\le A_1} are dominated by the integrable random variable {A_1}, so are also uniformly integrable, and the result follows from {X_{t_n}=(X_{t_n}-A_n)+A_n}. ⬜
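On a finite probability space this decomposition can be checked by hand. The following Python sketch is a toy setup of my own (not from the post): it takes the submartingale {X=W^2} for a {\pm 1} random walk {W} over two steps, with decreasing times {t_1=2}, {t_2=1}, and verifies the submartingale property together with {X_{t_2}-A_2={\mathbb E}[X_{t_1}-A_1\vert\mathcal{F}_{t_2}]} (here {A_2=0} as the sum defining it is empty):

```python
from itertools import product

# Sample space: two independent fair coin flips, each +1 or -1.
omega = list(product([1, -1], repeat=2))   # 4 equally likely outcomes

def W(path, n):
    """Random walk at time n."""
    return sum(path[:n])

def cond_exp(f, atoms):
    """Conditional expectation of f given the partition 'atoms':
    returns a dict mapping each path to the average of f over its atom."""
    out = {}
    for atom in atoms:
        avg = sum(f(p) for p in atom) / len(atom)
        for p in atom:
            out[p] = avg
    return out

# F_1 is generated by the first flip: two atoms.
atoms_F1 = [[p for p in omega if p[0] == v] for v in (1, -1)]

# Submartingale X_n = W_n^2, sampled at the decreasing times t_1 = 2, t_2 = 1.
X1 = {p: W(p, 1) ** 2 for p in omega}
X2 = {p: W(p, 2) ** 2 for p in omega}

# Submartingale property: E[X_2 | F_1] >= X_1 on every path.
EX2_F1 = cond_exp(lambda p: X2[p], atoms_F1)
assert all(EX2_F1[p] >= X1[p] for p in omega)

# A_1 = E[X_{t_1} - X_{t_2} | F_{t_2}], and A_2 = 0 (empty sum).
A1 = cond_exp(lambda p: X2[p] - X1[p], atoms_F1)

# Martingale property: X_{t_2} - A_2 = E[X_{t_1} - A_1 | F_{t_2}].
rhs = cond_exp(lambda p: X2[p] - A1[p], atoms_F1)
assert all(abs(X1[p] - rhs[p]) < 1e-12 for p in omega)
print("decomposition verified")
```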


Proof of Cadlag Versions

I now give a proof of Theorem 2. This will make use of the ideas from the previous post on upcrossings and downcrossings to show that, when restricted to a countable set of times, {X} has left and right limits everywhere. The result will follow from this.

For any {t>0}, let {S} be a finite subset of {[0,t]}. The number of upcrossings of an interval {[a,b]} for times in {S} satisfies the bound

\displaystyle  (b-a)U[a,b]\le\int_0^t1_A\,dX + \max(a-X_t,0) (3)

for some elementary set {A}. Furthermore, letting {\sigma} be the first time in {S} at which {X_\sigma> K} (taking {\sigma=t} if there is no such time), the stochastic interval {(\sigma,t]} is elementary and,

\displaystyle  \int_0^t1_{(\sigma,t]}\,dX=1_{\{\sup_{s\in S}X_s>K\}}(X_t-X_\sigma)\le |X_t|-K1_{\{\sup_{s\in S}X_s>K\}}.

Applying the same idea to {-X} shows that there are elementary predictable sets {A,B} such that

\displaystyle  K1_{\{\sup_{s\in S}X_s>K\}}\le\vert X_t\vert - \int_0^t1_A\,dX,\ K1_{\{\inf_{s\in S}X_s<-K\}}\le \vert X_t\vert + \int_0^t1_B\,dX. (4)

Now suppose that the first condition of the theorem holds, so that {{\mathbb E}[\int_0^t1_A\,dX]} is bounded by some positive constant {L} for all elementary sets {A}. Taking expectations of inequalities (3) and (4) gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle (b-a){\mathbb E}\left[U[a,b]\right]\le L+ {\mathbb E}[\max(a-X_t,0)],\smallskip\\ &\displaystyle K{\mathbb P}\left(\sup_{s\in S}|X_s|>K\right)\le 2L + 2 {\mathbb E}[\vert X_t\vert]. \end{array}

Letting {S} increase to a countably infinite subset of {[0,t]} and applying monotone convergence, these inequalities generalize to all countable subsets {S\subseteq[0,t]}. In particular, the number of upcrossings of {[a,b]} and the supremum of {X} are almost surely bounded on {S}.

Alternatively, suppose that the second statement of the theorem holds. Then, there exists a function {f\colon{\mathbb R}_+\rightarrow{\mathbb R}_+} with {f(K)\rightarrow 0} as {K\rightarrow\infty} and {{\mathbb P}(\vert\int_0^t 1_A\,dX\vert>K)<f(K)} for all {K>0} and elementary predictable sets {A}. Inequalities (3) and (4) give

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle {\mathbb P}\left((b-a)U[a,b]>K\right)\le f(K/2) + {\mathbb P}\left(\max(a-X_t,0)>K/2\right),\smallskip\\ &\displaystyle {\mathbb P}\left(\sup_{s\in S}|X_s|>K\right)\le 2f(K/2)+2{\mathbb P}\left(|X_t|>K/2\right). \end{array}

Again, letting {S} increase to a countably infinite subset of {[0,t]} and applying monotone convergence extends these inequalities to all countable subsets {S\subseteq[0,t]}. Letting {K} go to infinity then shows that, almost surely, the number of upcrossings of {[a,b]} is finite and the supremum of {X} is bounded on {S}.

In either case, applying countable additivity, the above shows that for time restricted to a countable subset {S\subseteq{\mathbb R}_+} the process {X}, with probability one, is bounded and has finitely many upcrossings of {[a,b]} for all rational {a<b} on bounded time intervals. Replacing {X} by the identically zero process outside this set of probability one, it can be assumed that this holds everywhere.
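The upcrossing counts appearing in these bounds can be computed directly for a path sampled at a finite set of times. A minimal Python sketch (the function is my own; it counts passages from level {a} or below to level {b} or above):

```python
def upcrossings(values, a, b):
    """Number of upcrossings of [a, b] by a finite sequence of values:
    the number of times the sequence passes from <= a to >= b."""
    count = 0
    below = False   # have we hit <= a since the last completed upcrossing?
    for x in values:
        if x <= a:
            below = True
        elif x >= b and below:
            count += 1
            below = False
    return count

path = [0.0, 1.2, -0.1, 1.5, 0.3, -0.2, 2.0]
print(upcrossings(path, 0.0, 1.0))  # 3
```

A cadlag path on a bounded interval gives a finite count for every {a<b}, which is precisely the property established above for {X} restricted to {S}.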

By the results of the previous post, this implies that for time restricted to such a countable set {S}, {X} almost surely has left and right limits everywhere. Assuming that {S} is also dense in {{\mathbb R}_+} (e.g., {S={\mathbb Q}_+}), this defines a cadlag process

\displaystyle  \tilde X_t\equiv \lim_{s\downarrow\downarrow t}X_s (5)

for all {t\in{\mathbb R}_+}, and where {s} is restricted to {S} in this limit.

Note that enlarging {S} by any countable set will not change the process {\tilde X} (up to a set of probability zero), by the arbitrariness of sequences {s\downarrow\downarrow t} in (5).

It remains to be shown that {{\mathbb P}(\tilde X_t=X_t)=1} outside of some countable set of times. In fact, for any positive integer {n}, the set of times {t\le n} at which {{\mathbb P}(\vert\tilde X_t-X_t\vert >1/n)>1/n} is finite. If not, there would be an increasing or decreasing sequence of such times {t_k} and, enlarging {S} to include this sequence, {X_{t_k}-\tilde X_{t_k}} would converge to zero, giving the contradiction {{\mathbb P}(\vert\tilde X_{t_k}-X_{t_k}\vert>1/n)\rightarrow 0} as {k\rightarrow\infty}. Consequently, letting {n\rightarrow\infty}, there are only countably many times at which {{\mathbb P}(\tilde X_t\not= X_t)>0}. Without loss of generality, we may suppose that {S} includes all such times.

Finally, the process {Y} is defined by

\displaystyle  Y_t = \begin{cases} X_t,&\textrm{if }t\in S,\\ \tilde X_t,&\textrm{otherwise}. \end{cases}

As {X} with time restricted to the set {S} has left and right limits everywhere, it follows that {Y} also has left and right limits everywhere. Furthermore, {Y=\tilde X} is cadlag outside of the countable set {S}. ⬜
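The construction (5) can also be mimicked numerically: approximate the right limit of a path along a countable dense set of times, such as the dyadic rationals, strictly decreasing to {t}. A rough Python sketch for a deterministic step path (the function name and the fixed depth are my own choices, and this is only a crude approximation of the limit):

```python
def right_limit(X, t, depth=40):
    """Approximate lim_{s downarrow downarrow t} X_s by evaluating X at the
    dyadic points s = t + 2^{-k}, which decrease strictly to t, and taking
    the value at the finest level. Assumes the right limit of X at t exists."""
    val = None
    for k in range(1, depth + 1):
        val = X(t + 2.0 ** (-k))
    return val

X = lambda s: 1.0 if s >= 1.0 else 0.0   # cadlag step path with a jump at s = 1
print(right_limit(X, 1.0))    # 1.0  (equals X_1, by right-continuity)
print(right_limit(X, 0.999))  # 0.0
```

Since the points {t+2^{-k}} all lie strictly above {t}, this reproduces the limit in (5) restricted to a countable set of evaluation times.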

23 thoughts on “Cadlag Modifications”

  1. Dear George,

I have a question concerning Theorem 4[1]: Given a right-continuous filtration, is any left-continuous martingale in fact continuous, since it has a cadlag version? Is this the reason why one usually does not consider left-continuous martingales?

    Thanks for your Stochastic Calculus Notes, I enjoy reading them!

    Pyramus

    [1] G.L.: this is now Theorem 5, since the previous edit.

    1. Hi.

      No, it is not true that a left-continuous martingale has to be continuous. Consider, for example, a compensated Poisson process X. Then, X is just a Poisson process with a constant drift subtracted, so its jumps are the same as for a homogeneous Poisson process (which have zero probability of occurring at any given fixed time). So, X is a cadlag martingale and its left limit Y_t = X_{t-} = lim_{s↑↑t} X_s is a left-continuous process. As P(Y_t = X_t) = 1, it follows that Y is a left-continuous martingale, but is not continuous.

      However, there is some truth in what you suggest. A left-continuous martingale has to be continuous, with probability one, at any given fixed time. It can only jump at times which are continuously distributed. There are also other issues with using left-continuous martingales. Optional sampling would not hold. That is, the martingale property would not extend to bounded stopping times at which the process can jump, so E[X_T] = E[X_0] would not hold. In fact, the only left-continuous martingales for which this holds are the continuous ones. Also, the definition of local martingales would not make a lot of sense, since it relies on the fact that the space of cadlag martingales is stable under optional stopping which, as with the optional sampling theorem, doesn’t hold for general left-continuous martingales.

      Hope that helps!
      George.

  2. Dear George,

    I am curious whether completeness of the filtration is required for the existence of measurable cadlag modifications. The proof you present does not seem to require such an assumption, but books such as Karatzas and Shreve include that assumption. There is a paper by Hans Follmer on exit times of supermartingales which doesn’t require completeness, but which has additional requirements on the filtration. Hence I am curious about your view on the matter (whether completeness is required).
    Thank you again for these excellent posts.

    Note 1: I think you need the filtration to be right-continuous in your proof in order to get an adapted modification.
    Note 2: It wasn’t fully clear why the sequence of times you construct in the second-to-last paragraph would have a subsequence converging from the right. Convergence from the right of t_k to some limit point t seems a key requirement in order to get
    (i) \tilde{X}_{t_k} to converge to \tilde{X}_t
    and
    (ii) X_{t_k} to converge to \tilde{X}_t
    the combination of which leads to the contradiction you mention.

    1. No, you don’t need completeness of the filtration for the existence of a cadlag modification, but it is necessary to assume this (or a similar condition) if you want the modification to be adapted. I did assume completeness throughout this post. Rather than completeness, you only need the weaker condition that \mathcal{F}_0 contains all sets in \mathcal{F}_\infty with zero probability. Right-continuity of the filtration is not required except in special cases, such as Theorem 5 above where it is needed to guarantee that all martingales are right-continuous in probability.

      In fact, suppose that X is any adapted process which has a cadlag modification with respect to the completion of the filtration (or, with respect to any enlargement of the filtration). Then we can define

      \displaystyle \tilde X_t=\limsup_{s\downarrow t,\;s\in\mathbb{Q}}X_s

      As X does have a cadlag modification, with respect to some enlargement of the filtration, it must be equal to \tilde X almost surely. So, \tilde X is almost-surely cadlag. Also, as \tilde X_t=X_t almost surely (for each fixed t) and \tilde X_t is \mathcal{F}_{t+}-measurable, it follows that \tilde X will be an adapted and almost-surely cadlag modification of X so long as \mathcal{F}_t contains all sets in \mathcal{F}_{t+} with zero probability. If we want a modification which is actually cadlag rather than just almost surely cadlag, you can set \tilde X to be identically 0 in the event where it is not cadlag (which can be seen to be \mathcal{F}_\infty-measurable by expressing it in terms of upcrossings of X on \mathbb{Q}).

      So, this shows that: for a cadlag modification, no assumptions on the filtration are required. For a cadlag adapted modification, requiring \mathcal{F}_0 to contain all sets in \mathcal{F}_\infty of zero probability is enough. For an adapted and almost-surely cadlag modification, requiring \mathcal{F}_t to contain all zero probability sets in \mathcal{F}_{t+} is sufficient.

      For example, let ε_1, ε_2, … be a sequence of IID Bernoulli random variables, each equal to 1 and -1 with probability 1/2. Let t_1, t_2, … be a sequence of times strictly increasing to 1. Define,

      \displaystyle X_t=\sum_{k=1}^\infty1_{\{t_k\le t\}}\frac{\epsilon_k}{k}

      (and set X_t to zero if this sum does not converge). This is right-continuous and has left limits everywhere except, possibly, at t = 1 in the case where \sum_k\epsilon_k/k fails to converge. By martingale convergence of uniformly square integrable martingales, this converges almost surely, so X is almost surely cadlag – but not necessarily cadlag. If \tilde X is a modification of X and adapted to the natural filtration of X, then \tilde X_t=X_t for each t < 1 (and not just almost surely, because \mathcal{F}_t is finite with no nonempty zero probability sets). Then, \tilde X_t fails to have a limit at t = 1 whenever X does. In fact, the natural filtration here is already right-continuous, showing that even in the right-continuous case we need to enlarge the filtration by adding zero probability sets to \mathcal{F}_t to get a modification which is right-continuous.

      For another example, let Y be a Poisson process and X_t=Y_{t-}. It can be seen that Y is a cadlag modification of X but is not adapted with respect to the natural filtration of X. To get an adapted modification which is almost surely cadlag, you need to enlarge the filtration by adding zero probability sets of \mathcal{F}_{t+} to \mathcal{F}_t (for almost all times t).

      In answer to,
      Note 1: No, right-continuity is not required except where explicitly stated (e.g., Theorem 5). Hopefully my comment above helps explain this – otherwise could you state where you think right continuity is required?
      Note 2: Any uncountable set of times will contain a strictly decreasing subsequence. This is not required though, as the argument will also work for increasing sequences. Letting t be the limit of the sequence t_k, we are using the fact that X has left and right limits at t, so both X_{t_k} and \tilde X_{t_k} converge to the left (resp. right) limit if the sequence is increasing (resp. decreasing).

      1. Thanks for this detailed answer!

        In note 1, I meant you need right continuity for \widetilde{X}_t to be adapted.
        In note 2, I still think you need a decreasing sequence (even if left limits exist) since you define \widetilde{X}_t as the limit from the right and since you are comparing the value of X_t with \widetilde{X}_t to get the contradiction. But maybe I am missing a detail and haven’t thought it through carefully enough yet 🙂

        1. In note 1: we have \tilde X_t=X_t almost surely. Combining this with completeness of the filtration, the fact that X is adapted implies that \tilde X is adapted. So, right-continuity is not required.

          In note 2: I think the bit that is concerning you is where I state that “X_{t_k}-\tilde X_{t_k} would converge to zero”. This is true, even for increasing sequences. Fix a sample path which has left limits on S (which is true for almost all sample paths). Consider choosing a sequence of times s_k\in S with s_k=t_{k/2} for k even and t_{(k-1)/2} < s_k < t_{(k+1)/2} for k odd. The fact that this is an increasing sequence means that X_{s_k} converges to a limit. So, X_{t_k}-X_{s_{2k+1}}=X_{s_{2k}}-X_{s_{2k+1}} converges to 0. However, by choosing s_{2k+1} very close to t_k, we can ensure that \tilde X_{t_k}-X_{s_{2k+1}} is bounded by 1/k and, hence, tends to zero. So, X_{t_k}-\tilde X_{t_k} tends to zero. Maybe that was a bit of a big step to make in the proof…

  3. Dear George,

    Thanks a lot for the interesting blog. Concerning Theorems 1 and 2, I was looking for references in the literature, but so far I haven’t found any. Can you help here?

    Marcus

  4. Hi
    I have a question, please. If I have a cadlag process which crosses an increasing sequence of intervals, say [0, S_n]: the paper that I am reading says that this cadlag process cannot cross these intervals infinitely many times in finite time, it can only do so in infinite time. I did not understand the statement, can you explain it please?
    Thanks

    1. Hi. I am not sure exactly what the statement is that you are referring to, but it is true that a cadlag process cannot cross an interval [a,b] for a < b infinitely often in finite time. Otherwise, you would be able to find a bounded increasing sequence of times t_n with X_{t_n} \le a for even n and X_{t_n}\ge b for odd n, which would contradict the existence of left limits, as X_{t_n} would not converge.

  5. Hi George,

    I am not sure if you received a message from me (see below). Based on a review of your responses to other readers, I think my problem can be resolved by simply taking lim sup in place of the usual limit. With the right continuity of the filtration, X_{t+} will be adapted to F_{t+}=F_t because of the lim sup used in the definition. Let me know what you think. The only problem now with this approach is that X_{t+} defined using lim sup can take extended real values infinity or minus infinity (of course, with probability zero).

    My older post (not sure if you got it):

    Hi George,

    I also have two questions on the completeness of F_t (or inclusion of P-null sets of F_\infty) for cadlag modifications.

    1. Books like Dellacherie-Meyer (part B, page 67), Karatzas-Shreve (page 16), and Revuz-Yor (page 64) start with a supermartingale (X_t, F_t) and produce another process X_{t+} by using regularization of X_t outside a set of measure zero. Without assuming the right continuity of F_t or assuming completeness in any way, they argue that (X_{t+}, F_{t+}) is a supermartingale. In my view, X_{t+} is not adapted to F_{t+} because we need the P-null sets. So, I never understand why such a result is true. In books like Rogers-Williams (page 171) and also your blog, you guys are more careful.

    2. In books like Lipster and Shiryaev (page 61), there is a result of cadlag modification of a martingale that can be written as X_t = E[X|F_t]. Here F_t is right continuous but not necessarily complete (no P-null sets). Again, it is claimed that this process has a cadlag modification.

    Any thoughts on these? Did I miss something here?

    1. Hi Taposh,
      This subject has various subtleties to which I cannot do justice in a quick comment. However, in order to construct continuous or cadlag modifications, I think that you do need to include the zero probability sets in each F_t, in order to modify the process on zero probability sets at which it fails to have a left-limit. When you take right-limits to obtain X_{t+} as an F_{t+}-measurable random variable, then that should work. However, I would expect that the resulting process t -> X_{t+} would only be cadlag/continuous outside of a zero probability set.

  6. Hi!
    I have a question. For example, a compensated Poisson process is already a cadlag process, so why do we care about the existence of its cadlag modifications? Or is it that a cadlag modification of the original process on a different topological space may have benefits? Can you please explain?

    1. I think there are two things to say here. (1) In many situations we should choose the cadlag version of our process (e.g., for stochastic integrals, optional stopping and sampling). (2) For many processes, including (sub/super)martingales, cadlag versions do indeed exist.
      The second of these is a rather strong and very useful mathematical result. In some cases, such as compensated Poisson processes, you already know that the process is cadlag, so the result is not so helpful. However, the fact still remains that this cadlag version should be used in many applications.

  7. Hi,
    I have a question related to the regularity of processes.
    Considering a processes X such that for all u\in \mathbb{Q}, the limits \lim_{r \downarrow u, r\in \mathbb{Q}}X_r and \lim_{r\uparrow u, r\in \mathbb{Q}}X_r exist (everywhere, but not necessarily equal) and for every u<v, \sup_{r \in [u,v] \cap \mathbb{Q}}|X_r|<\infty, can we claim that Y_u=\liminf_{r\downarrow u, r\in \mathbb{Q}}X_r is right-continuous? Why?

    1. Yes! Assuming you mean that these properties hold everywhere (rather than just almost surely at each point). For any sequence u_n decreasing to u, you can choose decreasing rationals r_n > u_n with \lvert Y_{u_n}-X_{r_n}\rvert\to 0; since X_{r_n} converges to Y_u, so does Y_{u_n}.

      1. Hi,
        The properties hold everywhere.
        In this case, is $Y$ right-continuous on $\mathbb{R}$ or $\mathbb{Q}$?
        Thank you.

        1. Yes, it holds on R. If t_n is a sequence of times decreasing to t then, by definition, Y(t_n) is equal to the limit of X(s) over s in Q, as s decreases to t_n. Hence, we can choose rational s_n > t_n as close as we like such that |X(s_n)-Y(t_n)| < 1/n. By definition, X(s_n) tends to Y(t), and this implies that Y(t_n) tends to Y(t).

  8. In Theorem 1, what does it mean for an adapted process $X$ to be right continuous in probability? Is it that for every $t$, $P(X_t = X_{t+}) = 1$?

    1. It means that, for any sequence t_n decreasing to t, X_{t_n} -> X_t in probability. This is equivalent to the map t -> X_t being right-continuous, using convergence in probability for the topology on random variables.
      Or, if X already has right limits X_{t+} (under convergence in probability, or a stronger topology), it is equivalent to P(X_t=X_{t+})=1 for each t.
