Time-Changed Brownian Motion

From the definition of standard Brownian motion B, given any positive constant c, {B_{ct}-B_{cs}} is normal with mean zero and variance c(t-s) for times {t>s\ge 0}. So, scaling the time axis of Brownian motion B to get the new process {B_{ct}} just results in another Brownian motion scaled by the factor {\sqrt{c}}.
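As a quick numerical sanity check (not part of the original argument), the following minimal Python/NumPy sketch simulates Brownian paths on a fine grid and verifies that the increment {B_{ct}-B_{cs}} has sample variance close to c(t-s). The constants c, s, t and the grid and sample sizes are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo sketch of the scaling property: for a Brownian motion B, the
# process t -> B_{ct} has increments of variance c*(t - s), i.e. it is
# sqrt(c) times another Brownian motion.  All constants are illustrative.
rng = np.random.default_rng(0)
c, s, t = 3.0, 0.5, 2.0
n_paths, n_steps = 10_000, 1_200
dt = c * t / n_steps                       # simulate B on [0, c*t]

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)                  # B at grid times dt, 2*dt, ..., c*t

i_cs = int(round(c * s / dt)) - 1          # grid index of time c*s
increments = B[:, -1] - B[:, i_cs]         # samples of B_{ct} - B_{cs}

print(np.var(increments), c * (t - s))     # both close to 4.5
print(np.var(increments / np.sqrt(c)))     # close to t - s = 1.5
```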

This idea is easily generalized. Consider a measurable function {\xi\colon{\mathbb R}_+\rightarrow{\mathbb R}_+} and Brownian motion B on the filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}. So, {\xi} is a deterministic process, not depending on the underlying probability space {\Omega}. If {\theta(t)\equiv\int_0^t\xi^2_s\,ds} is finite for each {t>0} then the stochastic integral {X=\int\xi\,dB} exists. Furthermore, X will be a Gaussian process with independent increments. For piecewise constant integrands, this results from the fact that linear combinations of joint normal variables are themselves normal. The case for arbitrary deterministic integrands follows by taking limits. Also, the Ito isometry says that {X_t-X_s} has variance

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\left(\int_s^t\xi\,dB\right)^2\right]&\displaystyle={\mathbb E}\left[\int_s^t\xi^2_u\,du\right]\smallskip\\ &\displaystyle=\theta(t)-\theta(s)\smallskip\\ &\displaystyle={\mathbb E}\left[(B_{\theta(t)}-B_{\theta(s)})^2\right]. \end{array}

So, {\int\xi\,dB=\int\sqrt{\theta^\prime(t)}\,dB_t} has the same distribution as the time-changed Brownian motion {B_{\theta(t)}}.
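To illustrate this distributional identity (again, only a hedged numerical sketch, not part of the argument), the following Python/NumPy snippet takes an arbitrary deterministic integrand, here {\xi(s)=1+\sin s}, forms the Ito sums approximating {\int_0^t\xi\,dB}, and compares their sample mean and variance with {\theta(t)} and with direct samples of {B_{\theta(t)}}.

```python
import numpy as np

# Sketch: for a deterministic integrand xi, the Ito integral int_0^t xi dB is
# normal with variance theta(t) = int_0^t xi(s)^2 ds, so it has the same
# distribution as B_{theta(t)}.  The integrand xi(s) = 1 + sin(s) and all grid
# sizes are illustrative choices only.
rng = np.random.default_rng(1)
t, n_paths, n_steps = 2.0, 20_000, 500
dt = t / n_steps
left = np.arange(n_steps) * dt             # left endpoints of the grid intervals

xi = 1.0 + np.sin(left)                    # deterministic integrand
theta_t = np.sum(xi**2) * dt               # Riemann sum for theta(t)

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X_t = dB @ xi                              # Ito sums: sum_i xi(s_i)(B_{s_{i+1}} - B_{s_i})

B_theta_t = rng.normal(0.0, np.sqrt(theta_t), size=n_paths)  # samples of B_{theta(t)}
print(np.mean(X_t), np.var(X_t))           # approximately 0 and theta(t)
print(np.var(B_theta_t), theta_t)          # approximately theta(t)
```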

With the help of Lévy’s characterization, these ideas can be extended to more general, non-deterministic, integrands and to stochastic time-changes. In fact, doing this leads to the startling result that all continuous local martingales are just time-changed Brownian motion.

Defining a stochastic time-change involves choosing a set of stopping times {\{\tau_t\colon t\ge 0\}} such that {\tau_s\le\tau_t} whenever {s\le t}. Then, {\mathcal{\tilde F}_t=\mathcal{F}_{\tau_t}} defines a new, time-changed, filtration. Applying the same change of time to a process X results in the new time-changed process {\tilde X_t=X_{\tau_t}}. If X is progressively measurable with respect to the original filtration then {\tilde X} will be {\mathcal{\tilde F}_{\cdot}}-adapted. The time-change is continuous if {t\mapsto\tau_t} is almost surely continuous. As the quadratic variation of a continuous local martingale is itself continuous, the following result does indeed describe a continuous time-change.

Theorem 1 Any continuous local martingale X with {X_0=0} is a continuous time-change of standard Brownian motion (possibly under an enlargement of the probability space).

More precisely, there is a Brownian motion B with respect to a filtration {\{\mathcal{G}_t\}_{t\ge 0}} such that, for each {t\ge 0}, {\omega\mapsto[X]_t(\omega)} is a {\mathcal{G}_\cdot}-stopping time and {X_t=B_{[X]_t}}.

In particular, if W is a Brownian motion and {\xi} is W-integrable then the result can be applied to {X=\int\xi\,dW}. As {[X]_t=\int_0^t\xi^2_s\,ds} this gives

\displaystyle  \int_0^t\xi\,dW = B_{\int_0^t\xi^2_s\,ds}

for a Brownian motion B. This allows us to interpret the integral {\int\xi\,dW} as a Brownian motion with time run at rate {\xi^2}.
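The following Python/NumPy sketch illustrates this interpretation on discretised paths; it is purely illustrative, and the integrand {\xi_t=1+\vert W_t\vert} together with all grid sizes are arbitrary choices. It builds {X=\int\xi\,dW}, computes {A_t=[X]_t\approx\int_0^t\xi^2_s\,ds}, inverts the time change via {\tau_s=\inf\{t\colon A_t\ge s\}}, and checks that {B_s=X_{\tau_s}} has the N(0,s) distribution of a Brownian motion sampled at time s.

```python
import numpy as np

# Discretised illustration of Theorem 1 for X = int xi dW.  The integrand
# xi_t = 1 + |W_t| (adapted, bounded below by 1 so that A_t >= t) and all
# parameters are illustrative.
rng = np.random.default_rng(2)
t_max, n_paths, n_steps = 4.0, 10_000, 1_000
dt = t_max / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # path at left endpoints
xi = 1.0 + np.abs(W_left)                  # adapted integrand, xi >= 1

X = np.cumsum(xi * dW, axis=1)             # Ito sums for X_t = int_0^t xi dW
A = np.cumsum(xi**2 * dt, axis=1)          # A_t = [X]_t ~= int_0^t xi^2 ds

s = 1.0                                    # evaluate the time-changed process at s
idx = np.argmax(A >= s, axis=1)            # tau_s on the grid (exists since A_{t_max} >= t_max)
B_s = X[np.arange(n_paths), idx]           # B_s = X_{tau_s}

print(np.mean(B_s), np.var(B_s))           # approximately 0 and s = 1
```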

The conclusion of Theorem 1 only holds in a possible enlargement of the probability space. To see why this condition is necessary, consider the simple example of a local martingale X which is identically equal to zero. In this case, it is possible that the underlying probability space is trivial, so does not contain any non-deterministic random variables at all. It is therefore necessary to enlarge the probability space to be able to assert the existence of at least one Brownian motion. In fact, this will be necessary whenever {[X]_\infty} has nonzero probability of being finite. As stated in Theorem 1, only enlargements of the probability space are required, not of the filtration {\mathcal{F}_t}. That is, we consider a probability space {(\Omega^\prime,\mathcal{F}^\prime,{\mathbb P}^\prime)} and a measurable onto map {f\colon\Omega^\prime\rightarrow\Omega} preserving probabilities, so {{\mathbb P}(A)={\mathbb P}^\prime(f^{-1}(A))} for all {A\in\mathcal{F}}. Any processes on the original probability space can then be lifted to {(\Omega^\prime,\mathcal{F}^\prime,{\mathbb P}^\prime)}. The filtration is also lifted to {\mathcal{F}^\prime_t=\{f^{-1}(A)\colon A\in\mathcal{F}_t\}}. In this way, it is always possible to enlarge the probability space so that Brownian motions exist. For example, if {(\Omega^{\prime\prime},\mathcal{F}^{\prime\prime},{\mathbb P}^{\prime\prime})} is a probability space on which there is a Brownian motion defined, we can take {\Omega^\prime=\Omega\times\Omega^{\prime\prime}}, {\mathcal{F}^\prime=\mathcal{F}\otimes\mathcal{F}^{\prime\prime}} and {{\mathbb P}^\prime={\mathbb P}\otimes{\mathbb P}^{\prime\prime}} for the enlargement, and {f\colon\Omega^\prime\rightarrow\Omega} is just the projection onto {\Omega}.

Theorem 1 is a special case of the following time-change result for multidimensional local martingales. A d-dimensional continuous local martingale is a time-change of Brownian motion whenever its quadratic covariation matrix {[X]^{ij}\equiv[X^i,X^j]} is proportional to the identity matrix. Below, {\delta_{ij}} denotes the Kronecker delta.

Theorem 2 Let {X=(X^1,X^2,\ldots,X^d)} be a continuous local martingale with {X_0=0}. Suppose, furthermore, that {[X^i,X^j]_t=\delta_{ij}A_t} for some process A and all {1\le i,j\le d} and {t\ge 0}. Then, under an enlargement of the probability space, X is a continuous time-change of standard d-dimensional Brownian motion.

More precisely, there is a d-dimensional Brownian motion B with respect to a filtration {\{\mathcal{G}_t\}_{t\ge 0}} such that, for each {t\ge 0}, {\omega\mapsto A_t(\omega)} is a {\mathcal{G}_\cdot}-stopping time and {X_t=B_{A_t}}.

Proof: Define stopping times

\displaystyle  \tau_t=\inf\left\{s\ge 0\colon A_s\ge t\right\},

and the filtration {\mathcal{G}_t=\mathcal{F}_{\tau_t}}, for all {t\ge 0}. The stopped processes {(X^i)^{\tau_t}} have quadratic variation {[(X^i)^{\tau_t}]=A^{\tau_t}\le t}. So, they are {L^2}-bounded martingales, and the limit

\displaystyle  Z^i_t\equiv\lim_{u\rightarrow\infty}X^i_{\tau_t\wedge u}

exists almost surely. By optional sampling,

\displaystyle  {\mathbb E}[Z^i_t\mid\mathcal{F}_{\tau_s}]=\lim_{u\rightarrow\infty}{\mathbb E}[X^i_{\tau_t\wedge u}\mid\mathcal{F}_{\tau_s}]=\lim_{u\rightarrow\infty}X^i_{\tau_s\wedge u}=Z^i_s

for all {t\ge s}, so Z is a martingale. From the definition, {t\mapsto\tau_t} is left-continuous and A is constant, with value t, on the interval {[\tau_t,\tau_{t+}]}. Then, as the intervals of constancy of {X^i} coincide with those of {[X^i]=A}, it follows that {Z^i_{t+}=X^i_{\tau_{t+}}=X^i_{\tau_t}=Z^i_t}, and Z is continuous.

Next, {L^{ij}_t\equiv X^i_tX^j_t-\delta_{ij}A_t} is a local martingale. As {\sup_s\vert X^i_{\tau_t\wedge s}\vert} is square integrable, it follows that {(L^{ij})^{\tau_t}} are uniformly integrable martingales. Applying optional sampling as above and substituting in {A_{\tau_t}=t\wedge A_\infty} shows that

\displaystyle  Z^i_tZ^j_t-\delta_{ij}t\wedge A_\infty=L^{ij}_{\tau_t}

is a martingale with respect to {\mathcal{G}_\cdot}. If it is known that {A_\infty=\infty}, then Lévy’s characterization says that Z is a standard d-dimensional Brownian motion. More generally, enlarging the probability space if necessary, we may suppose that there exists a d-dimensional Brownian motion W independent of {\mathcal{G}_\infty}. Then, {M_t\equiv W_t-W_{t\wedge A_\infty}} is a martingale under its natural filtration, with covariations {[M^i,M^j]_t=\delta_{ij}(t-t\wedge A_\infty)}. So, {M^i_tM^j_t-\delta_{ij}(t-t\wedge A_\infty)} is a local martingale.

Then, {B^i\equiv Z^i+M^i} and {B^i_tB^j_t-\delta_{ij}t} are local martingales under the filtration {\mathcal{G}^\prime_t} jointly generated by {\mathcal{G}_t} and M. So, by Lévy’s characterization, B is a standard d-dimensional Brownian motion. For any {t\ge 0}, A is constant on the interval {[\tau_{A_t},t]}. So, X is also constant on this interval, giving

\displaystyle  B^i_{A_t}=X^i_{\tau_{A_t}}=X^i_t.

It only remains to be shown that {A_t} is a {\mathcal{G}_\cdot}-stopping time. For any times {s,u\ge 0} the definition of {\tau_s} gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \left\{A_t\le s\right\}\cap\left\{\tau_s\le u\right\} &\displaystyle=\left\{A_t\le s\right\}\cap\left\{A_u\le s\right\}\smallskip\\ &\displaystyle=\left\{A_{t\wedge u}\le s\right\}\in\mathcal{F}_u \end{array}

As this holds for all u, {\{A_t\le s\}\in\mathcal{F}_{\tau_s}=\mathcal{G}_s}, and {A_t} is a {\mathcal{G}_\cdot}-stopping time. ⬜

So, all continuous local martingales are continuous time changes of standard Brownian motion. The converse statement is much simpler and, in fact, the local martingale property is preserved under continuous time-changes.

Lemma 3 Let X be a local martingale and {\{\tau_t\}_{t\ge 0}} be finite stopping times such that {t\mapsto\tau_t} is continuous and increasing.

Then, {\tilde X_t\equiv X_{\tau_t}} is a local martingale with respect to the filtration {\mathcal{\tilde F}_t=\mathcal{F}_{\tau_t}}.

Proof: Choose stopping times {\sigma_n\uparrow\infty} such that the stopped processes {Y^n\equiv 1_{\{\sigma_n>0\}}X^{\sigma_n}} are uniformly integrable martingales. Then, set

\displaystyle  \tilde\sigma_n=\inf\left\{t\ge 0\colon\tau_t\ge\sigma_n\right\}.

Continuity of {t\mapsto\tau_t} gives {\tau_{\tilde\sigma_n}=(\sigma_n\vee\tau_0)\wedge\tau_\infty}, and {\tilde\sigma_n\uparrow\infty} as n goes to infinity. For each {t\ge 0},

\displaystyle  \left\{\tilde\sigma_n\le t\right\}=\left\{\tau_t\ge\sigma_n\right\}\in\mathcal{F}_{\tau_t}

implies that {\tilde\sigma_n} is a {\mathcal{\tilde F}_\cdot}-stopping time. Finally, {1_{\{\tilde\sigma_n>0\}}\tilde X^{\tilde\sigma_n}_t=1_{\{\sigma_n>\tau_0\}}Y^n_{\tau_t}} is a uniformly integrable process and optional sampling gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[1_{\{\tilde\sigma_n>0\}}\tilde X^{\tilde\sigma_n}_t\;\middle\vert\;\mathcal{\tilde F}_s\right] &\displaystyle={\mathbb E}\left[1_{\{\sigma_n>\tau_0\}}Y^n_{\tau_t}\;\middle\vert\;\mathcal{F}_{\tau_s}\right]\smallskip\\ &\displaystyle=1_{\{\sigma_n>\tau_0\}}Y^n_{\tau_s}\smallskip\\ &\displaystyle=1_{\{\tilde\sigma_n>0\}}\tilde X^{\tilde\sigma_n}_s. \end{array}

So, {\tilde X} is a local martingale with respect to {\mathcal{\tilde F}_\cdot}. ⬜
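As a numerical illustration of Lemma 3 (a hedged sketch with arbitrary parameters, not part of the proof), take X to be a Brownian motion W and the genuinely random time change {\tau_t=t+t\wedge T_a}, where {T_a} is the first hitting time of a level a. Each {\tau_t} is then a bounded stopping time and {t\mapsto\tau_t} is continuous, increasing and finite. The lemma says that {W_{\tau_t}} is a martingale for the time-changed filtration; so, for instance, its increments should be uncorrelated with its past values and {{\mathbb E}[W_{\tau_t}^2]={\mathbb E}[\tau_t]} by optional sampling.

```python
import numpy as np

# Sketch for Lemma 3: X = W a Brownian motion, tau_t = t + min(t, T_a) with
# T_a the first hitting time of the level a.  The level, times and grid sizes
# are illustrative.  The printed quantities reflect the martingale property
# of the time-changed process.
rng = np.random.default_rng(4)
a, s, t = 0.5, 1.0, 2.0
n_paths, n_steps, t_max = 20_000, 800, 2 * t         # tau_t <= 2t, so simulate up to 2t
dt = t_max / n_steps
grid = np.arange(1, n_steps + 1) * dt                # grid times dt, 2*dt, ..., t_max

W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

hit = W >= a
T_a = np.where(hit.any(axis=1), grid[np.argmax(hit, axis=1)], np.inf)

def time_changed(u):
    # W evaluated at tau_u = u + min(u, T_a), read off the simulation grid
    tau_u = u + np.minimum(u, T_a)
    idx = np.clip(np.round(tau_u / dt).astype(int) - 1, 0, n_steps - 1)
    return W[np.arange(n_paths), idx], tau_u

X_s, _ = time_changed(s)
X_t, tau_t = time_changed(t)

print(np.mean(X_t))                        # ~ 0
print(np.mean((X_t - X_s) * X_s))          # ~ 0: increments uncorrelated with the past
print(np.mean(X_t**2), np.mean(tau_t))     # approximately equal (optional sampling)
```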

Finally for this post, we show that stochastic integration transforms nicely under continuous time-changes. Under a time-change defined by stopping times {\tau_t} the integral transforms as

\displaystyle  \int_0^t\xi_{\tau_s}\,dX_{\tau_s}=\int_{\tau_0}^{\tau_t}\xi_s\,dX_s.

We have to be a bit careful here, as the integral on the left hand side is defined with respect to a different filtration than on the right. The precise statement is as follows.

Lemma 4 Let X be a semimartingale and {\xi} be a predictable, X-integrable process. Suppose that {\{\tau_t\}_{t\ge 0}} are finite stopping times such that {t\mapsto\tau_t} is continuous and increasing. Define the time-changes {\mathcal{\tilde F}_t=\mathcal{F}_{\tau_t}}, {\tilde X_t=X_{\tau_t}} and {\tilde \xi_t=\xi_{\tau_t}}.

With respect to the filtration {\mathcal{\tilde F}_t}, {\tilde X} is a semimartingale, {\tilde\xi} is predictable and {\tilde X}-integrable, and

\displaystyle  \int_0^t\tilde\xi\,d\tilde X=\int_{\tau_0}^{\tau_t}\xi\,dX. (1)

Proof: First, as {t\mapsto\tau_t} is continuous, the time-change takes the set of {\mathcal{F}_\cdot}-adapted and left-continuous (resp. cadlag) processes to the set of {\mathcal{\tilde F}_\cdot}-adapted and left-continuous (resp. cadlag) processes. However, the predictable sigma-algebra is generated by the adapted left-continuous processes, so the time-change takes {\mathcal{F}_\cdot}-predictable processes to {\mathcal{\tilde F}_\cdot}-predictable processes. Therefore, with respect to {\mathcal{\tilde F}_\cdot}, {\tilde\xi} is predictable and {\tilde X} is a cadlag adapted process.

If {\xi} is elementary predictable, then (1) follows immediately from the explicit expression for the integral. Once it is known that {\tilde X} is a semimartingale, dominated convergence for stochastic integration will imply that the set of bounded predictable processes {\xi} for which (1) is satisfied will be closed under pointwise convergence of bounded sequences. So, by the monotone class theorem, the result holds for all bounded predictable {\xi}.

To show that {\tilde X} is a semimartingale, it is necessary to show that it is possible to define stochastic integration for bounded predictable integrands such that the explicit expression for elementary integrals and bounded convergence in probability are satisfied. We can use (1) to define the integral. Let us set

\displaystyle  A_t=\inf\left\{s\ge 0\colon\tau_s\ge t\right\},

which, by the Debut theorem, will be an {\mathcal{\tilde F}_\cdot}-stopping time with {\tau_{A_t}=t\wedge\tau_\infty}. Also, {t\mapsto A_t} is left-continuous. Suppose that we are given a process {\tilde\xi} which is left-continuous and adapted with respect to {\mathcal{\tilde F}_\cdot}. Then, for each {K\ge 0}, {1_{\{A_t\le K\}}\tilde\xi_{A_t}} will be left-continuous and adapted with respect to {\mathcal{F}_\cdot}. Therefore

\displaystyle  \xi_t\equiv 1_{\{A_t<\infty\}}\tilde\xi_{A_t}=\lim_{K\rightarrow\infty}1_{\{A_t\le K\}}\tilde\xi_{A_t}

will be predictable. More generally, this holds for any {\mathcal{\tilde F}_\cdot}-predictable {\tilde\xi}. Also, {A_{\tau_t}=t} and hence {\tilde\xi_t=\xi_{\tau_t}}, except possibly at times t lying in an interval on which {\tau} (and therefore {\tilde X}) is constant, where the discrepancy does not affect the integrals. So, when they are elementary predictable, equation (1) holds with these values of {\xi} and {\tilde\xi}.

We then use (1) to define the stochastic integral {\int\tilde \xi\,d\tilde X} for bounded {\mathcal{\tilde F}_\cdot}-predictable {\tilde\xi}. By the dominated convergence theorem for integration with respect to X, it follows that the integral we have defined with respect to {\tilde X} satisfies bounded convergence in probability, as required. So {\tilde X} is indeed an {\mathcal{\tilde F}}-semimartingale.

Equation (1) can be generalized to arbitrary X-integrable {\xi} by making use of associativity of stochastic integration. Let {\xi} be X-integrable and set {Y=\int\xi\,dX} and {\tilde Y_t=Y_{\tau_t}}. Then, {\tilde Y} is an {\mathcal{\tilde F}_\cdot}-semimartingale and, setting {\alpha=1/(1+\vert\xi\vert)}, {\tilde\alpha_t=\alpha_{\tau_t}} gives

\displaystyle  \int_0^t\tilde\alpha\tilde\xi\,d\tilde X = \int_{\tau_0}^{\tau_t}\alpha\xi\,dX=\int_{\tau_0}^{\tau_t}\alpha\,dY=\int_0^t\tilde\alpha\,d\tilde Y.

Here, (1) has been applied to the bounded integrands {\alpha\xi} and {\alpha}. Integrating {\tilde\alpha^{-1}} with respect to both sides shows that {\tilde\xi} is {\tilde X}-integrable and {\int\tilde\xi\,d\tilde X=\tilde Y-\tilde Y_0} as required. ⬜
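The following single-path sketch (Python/NumPy, purely illustrative) checks equation (1) numerically for the deterministic continuous time change {\tau_t=t^2} and the integrand {\xi=X}, where X is a discretised Brownian motion; both sides then approximate {\int_0^{\tau_t}X\,dX} and should agree up to discretisation error.

```python
import numpy as np

# Single-path check of equation (1) with tau_t = t**2 (so tau_0 = 0) and
# xi = X.  The path, the time change and all grid sizes are illustrative.
rng = np.random.default_rng(3)
T, n_fine = 4.0, 400_000
u = np.linspace(0.0, T, n_fine + 1)                  # fine grid on the original time axis
X = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(T / n_fine), n_fine))])

t = 1.5                                              # evaluate both sides at this time
tau = lambda v: v**2                                 # continuous, increasing, tau_0 = 0

# Right hand side: int_{tau_0}^{tau_t} X dX as an Ito sum on the fine grid.
m = np.searchsorted(u, tau(t))
rhs = np.sum(X[:m] * np.diff(X[:m + 1]))

# Left hand side: int_0^t X_{tau_s} dX_{tau_s}, using a coarser grid in s and
# reading the path at the time-changed points tau(s_i).
s_grid = np.linspace(0.0, t, 50_000 + 1)
idx = np.searchsorted(u, tau(s_grid))
X_tau = X[idx]
lhs = np.sum(X_tau[:-1] * np.diff(X_tau))

print(lhs, rhs)                                      # approximately equal
```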

30 thoughts on “Time-Changed Brownian Motion”

  1. Hi,

    looking for time changes of Brownian motion I stumbled on your site. It’s amazing! =)

    I wonder about the first eqnarray: the last equality being \theta(t)-\theta(s)=\mathbb{E}((B_{\theta(t)}-B_{\theta(s)})^2).

    I cannot see this right away, what am I missing?

    Best regards,

    Konsta

    1. Hm, returning, it looks like applying Ito-Isometry backwards. I think I’ve sorted out my difficulties with that equality.

      1. Hi Konsta, and welcome!

        There’s no need to use anything as advanced as the Ito isometry for this equality. By definition, if B is a Brownian motion and s<t, then B_t-B_s has mean 0 and variance t-s. So, \mathbb{E}[(B_t-B_s)^2]=t-s. I just replaced t by \theta(t) and s by \theta(s).

  2. Hi,

    thank you for this very interesting post!

    I recently had to deal with the following situation, and was wondering if you could shed some light on it?
    Consider a Brownian motion W and two nice random processes (suppose any regularity you want – but they are random) \alpha, H. The process X_t = \int_0^t \alpha_s \, ds + \int_0^t H_s \, dW_s is a semi-martingale, and after a Girsanov-like change of probability we can see that under another probability \mathbb{Q}, the process X_t is a continuous martingale with quadratic variation \int_0^t H_s^2 \, ds. It is very tempting to say that there is a \mathbb{Q}-Brownian motion Z (adapted to the same filtration as W) such that X_t = \int_0^t H_s dZ_s, with the same process H.

    Is this true?

    Best
    Alekk

    1. Welcome Alekk.

      The quick answer to your question is Yes!

      Under an equivalent measure \mathbb{Q}, W will decompose as W=Z+\int\beta_s\,ds for a \mathbb{Q}-Brownian motion Z. Then, X=\int(\alpha_s+H_s\beta_s)\,ds +\int H_s\,dZ_s, where X and \int H\,dZ are both \mathbb{Q}-local martingales. By uniqueness of the decomposition into continuous local martingale and FV components, X=\int H\,dZ.

      Coincidentally, my next post is going to be on Girsanov transforms, which I’ll probably put up tomorrow.

      1. Thank you!
        I did not think to use this uniqueness of the decomposition. I was using this result while trying to re-prove the Girsanov change of probability to go from an SDE dX_t=a(X_t)\,dt + dW_t to dY_t=a(Y_t)\,dt + dW_t.

        Best
        Alekk

    2. Hi Alekk,

      I had a look at your comment and checked Oksendal’s book: I believe Theorem 8.6.4 (The Girsanov theorem II) on page 157 in my version of the book applies directly to a special case of your question (set alpha=0). Does this help?

  3. Hi again,

    since I was interested in the relation of the time-change Ito integral \int\sqrt{\theta}\,dB and the time-changed Brownian motion B_\theta, I tried to prove their mean-square equality and I am quite sure they are not. So, while they seem to have the same distribution, they are not equal in general (in the mean-square sense). Do you agree or disagree?

    1. They are not what? The processes are definitely not equal, but they do have the same distribution.

      btw, I’m not sure what you mean by the “time-change Ito integral”. The integral \int\sqrt{\theta}\,dB does not involve a time change.

      1. ad “They are not what?”:
        They are not equal in mean-square sense. Basically I’d like to write \int\sqrt{\theta}\,dB=B_{\theta} – actually I’m not sure which equality sign this is in case of stochastic calculus – but usually it’s not just equality in distribution.

        ad “time-change Ito integral”:
        I meant the integral \int\sqrt{\theta}\,dB and I called it such because I think of it as \int\sqrt{\theta}\,dB=\hat{B}_{\theta}, where B and \hat{B} are most likely different Brownian motions.

      2. Well X=\int\sqrt{\theta^\prime}\,dB is a local martingale with quadratic variation \theta(t). So, using Theorem 1, \int_0^t\sqrt{\theta^\prime}\,dB = W_{\theta(t)} for some Brownian motion W. However, W is not the same as B. In fact, it is defined with respect to a different filtration \{\mathcal{G}_t\}_{t\ge0}. The point is that the integral has the same distribution as a time-change of a Brownian motion.

        Also, I made a typo. \sqrt{\theta(t)} should be replaced by \sqrt{\theta^\prime(t)} in the post. I’ll fix this.

    1. p.s. I mean in the range, of course. So, this would mean you can express things as the square root, but that isn’t important. What matters is that \theta = \int \xi^2. Because of the square, the range of \xi is irrelevant, right?

      1. Yes it can take negative values. The only reason I restricted to nonnegative values is so that the formula \int\xi\,dB=\int\sqrt{\theta^\prime(t)}\,dB_t a bit further down the paragraph holds. But this is a very minor point, and we don’t really need this identity.

  4. Dear George Lowther,

    We are working with stochastic methods for finance and are very interested in Theorem 2 above. Compared to the results given e.g. in Protter Theorem 42 and Revuz/Yor Theorem V.1.9, Theorem 2 presents an additional result, which is the representation of the local martingale X as a time-changed Brownian motion (X_t = B_{A_t}). In spite of the fact that the proof is presented above, we would like to have more details of it. Could you help us?

    1. Hi. I should be able to help – I’ll have to check again the references you mention, but I think what I write here is fairly standard. I can drop you an email on the address registered with this comment if you prefer.

  5. In several places you require that the time change is “increasing.” I take this to mean strictly increasing, rather than nondecreasing. For what steps in the logic is this necessary? Can you recommend a reference for what can be said if they are nondecreasing? More generally, what’s your favorite reference for these theorems? Thanks!

    1. Generally, I am using increasing to mean nondecreasing. Perhaps I should have been clearer. For references, I think Rogers & Williams or Revuz & Yor have sections on time changes, although I’ll have to double check my other references.

  6. Dear George Lowther, first of all thank you for your enlightening blog posts. They really help me to deepen my understanding of stochastic calculus. I came up with a question that I was not able to find an answer to and thought that maybe you could provide me with one. I’m currently interested in time-changed Lévy models and want to show that an arithmetic Brownian motion time changed by the integral of a CIR process is equivalent to the Heston model. To do so, I need to show the following: Let W^1(t) and W^2(t) be two independent Brownian motions and \mathrm{d}\lambda(t)=\kappa(\theta-\lambda(t))\mathrm{d}t+\eta\sqrt{\lambda(t)}\mathrm{d}W^1(t) the CIR SDE. Now define T(t):=\int_0^t\lambda(s)\mathrm{d}s. I need to show that the processes \left(\int_0^t\lambda(s)\mathrm{d}s, W^1\left(T(t)\right), W^2\left(T(t)\right)\right) and \left(\int_0^t\lambda(s)\mathrm{d}s, \int_0^t\sqrt{\lambda(s)}\mathrm{d}W^1(s), \int_0^t\sqrt{\lambda(s)}\mathrm{d}W^2(s)\right) have the same law. I believe I could use some variant of the Dubins–Schwarz theorem. However, I am not sure if the correlation/independence structure is preserved under that change.

  7. Thank you for the informative web site.
    A question: from Monroe’s Theorem we can represent a cadlag semimartingale as a time-changed Brownian motion (omitting some of the statement). If I understand the theorem, a Poisson process should be representable as a time-changed Brownian motion (as should a compound Poisson process). Has the explicit time change appeared in print?

    1. I didn’t deal with Monroe’s theorem here, and am not familiar with the construction of the time-change in general, but it looks like it shouldn’t be too difficult for the special case of increasing processes.

  8. Hi George, I can’t figure out why in Lemma 3, we have 1_{\tilde{\sigma_n}>0} \tilde{X_t}^{\tilde{\sigma_n}} = 1_{\sigma_n>\tau_0} Y_{\tau_t}^n? From definition, Y_{\tau_t}^n = 1_{\sigma_n>0} X_{\tau_t \wedge \sigma_n} and \tilde{X_t}^{\tilde{\sigma_n}} = X_{\tau_t \wedge \tilde{\sigma_n}}. I can see that 1_{\tilde{\sigma_n}>0} = 1_{\sigma_n>\tau_0}. But 1_{\sigma_n>\tau_0} may not be equal to 1_{\sigma_n>0} and how do we relate X_{\tau_t \wedge \sigma_n} with X_{\tau_t \wedge \tilde{\sigma_n}} here?

  9. Hi George, there are a couple of points I don’t understand in the proof of Lemma 4.

    In the final sentences of the third paragraph, how can we justify that this holds for any \tilde{F}-predictable \tilde{\xi}? I assume you mean we can use the same limiting argument, but is \tilde{\xi}_{A_t} still predictable?

    Also, in the sentences below, you say A_{\tau_t}=t, so \tilde{\xi}_t=\xi_{\tau_t}, except on intervals for which \tau_t (and hence \tilde{X}_t) is constant. But you define \tilde{\xi}_t:=\xi_{\tau_t}, so I don’t understand why they may not be equal on intervals for which \tau_t is constant. And what happens to equation (1) for elementary predictable \xi and \tilde{\xi} if they are not equal on some intervals? I am confused about the statement that equation (1) holds with these values of \xi and \tilde{\xi}, as you say in the first sentence of paragraph 2 that if \xi is elementary predictable, then (1) follows immediately from the explicit expression for the integral.

    Finally, in the last sentence, how are we able to integrate \tilde{\alpha}^{-1} with respect to both sides? We have an equivalence of two integrals \int \tilde{\alpha}\tilde{\xi} d\tilde{X} and \int \tilde{\alpha} d\tilde{Y}.
    I don’t understand how we integrate over an integral here.
