Special Semimartingales

For stochastic processes in discrete time, the Doob decomposition uniquely decomposes any integrable adapted process into the sum of a martingale and a predictable process. If {\{X_n\}_{n=0,1,\ldots}} is an integrable process adapted to a filtration {\{\mathcal{F}_n\}_{n=0,1,\ldots}} then we write {X_n=M_n+A_n}. Here, M is a martingale, so that {M_{n-1}={\mathbb E}[M_n\vert\mathcal{F}_{n-1}]}, and A is predictable with {A_0=0}. By saying that A is predictable, we mean that {A_n} is {\mathcal{F}_{n-1}}-measurable for each {n\ge1}. This implies that

\displaystyle  A_n-A_{n-1}={\mathbb E}[A_n-A_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}].

Then it is possible to write A and M as

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle A_n&\displaystyle=\sum_{k=1}^n{\mathbb E}[X_k-X_{k-1}\vert\mathcal{F}_{k-1}],\smallskip\\ \displaystyle M_n&\displaystyle=X_n-A_n. \end{array} (1)

So, the Doob decomposition is unique and, conversely, the processes A and M constructed according to equation (1) can be seen to be, respectively, a predictable process starting from zero and a martingale. For many purposes, this allows us to reduce problems concerning processes in discrete time to simpler statements about martingales and, separately, about predictable processes. If X is a submartingale then things simplify further since, in that case, A is an increasing process.
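As a concrete illustration of equation (1) (this example is my own addition, not from the original text), take the submartingale {X_n=S_n^2} for a simple random walk S. Then {{\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[2S_{n-1}\epsilon_n+\epsilon_n^2\vert\mathcal{F}_{n-1}]=1}, so the predictable part is {A_n=n} and {M_n=S_n^2-n} is the familiar martingale. A minimal Monte Carlo sketch in Python:

```python
import numpy as np

rng = np.random.default_rng(42)
paths, N = 100_000, 20

# Simple random walk S and the submartingale X = S^2.
eps = rng.choice([-1, 1], size=(paths, N))
S = np.concatenate([np.zeros((paths, 1)), eps.cumsum(axis=1)], axis=1)
X = S**2

# Doob decomposition: A_n = n is predictable and increasing with A_0 = 0,
# and M = X - A should be a martingale.
A = np.arange(N + 1)
M = X - A

print(abs(M[:, -1].mean()))  # E[M_N] = E[M_0] = 0 up to Monte Carlo error
```

The identity {X=M+A} holds exactly path by path here; only the martingale property of M is checked statistically.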

The situation is considerably more complicated when looking at processes in continuous time. The extension of the Doob decomposition to continuous time processes, known as the Doob-Meyer decomposition, was an important result historically in the development of stochastic calculus. First, we would usually restrict attention to sufficiently nice modifications of the processes and, in particular, suppose that X is cadlag. When attempting an analogous decomposition to the one above, it is not immediately clear what should be meant by the predictable component. The continuous time predictable processes are defined to be the set of all processes which are measurable with respect to the predictable sigma algebra, which is the sigma algebra generated by the space of processes which are adapted and continuous (or, equivalently, left-continuous). In particular, all continuous and adapted processes are predictable but, due to the existence of continuous martingales such as Brownian motion, this means that decompositions as sums of martingales and predictable processes are not unique. It is therefore necessary to impose further conditions on the term A in the decomposition. It turns out that we obtain unique decompositions if, in addition to being predictable, A is required to be cadlag with locally finite variation (an FV process). The processes which can be decomposed into a local martingale and a predictable FV process are known as special semimartingales. This is precisely the space of locally integrable semimartingales. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})} and two stochastic processes are considered to be the same if they are equivalent up to evanescence.

Theorem 1 For a process X, the following are equivalent.

  • X is a locally integrable semimartingale.
  • X decomposes as
    \displaystyle  X=M+A (2)

    for a local martingale M and predictable FV process A.

Furthermore, choosing {A_0=0}, decomposition (2) is unique.

Theorem 1 is a general version of the Doob-Meyer decomposition. However, the name `Doob-Meyer decomposition’ is often used to specifically refer to the important special case where X is a submartingale. Historically, the theorem was first stated and proved for that case, and I will look at the decomposition for submartingales in more detail in a later post. Continue reading “Special Semimartingales”

Predictable FV Processes

By definition, an FV process is a cadlag adapted stochastic process which almost surely has finite variation over finite time intervals. These are always semimartingales, because the stochastic integral for bounded integrands can be constructed by taking the Lebesgue-Stieltjes integral along sample paths. Also, from the previous post on continuous semimartingales, we know that the class of continuous FV processes is particularly well behaved under stochastic integration. For one thing, given a continuous FV process X and predictable {\xi}, then {\xi} is X-integrable in the stochastic sense if and only if it is almost surely Lebesgue-Stieltjes integrable along the sample paths of X. In that case the stochastic and Lebesgue-Stieltjes integrals coincide. Furthermore, the stochastic integral preserves the class of continuous FV processes, so that {\int\xi\,dX} is again a continuous FV process. It was also shown that all continuous semimartingales decompose in a unique way as the sum of a local martingale and a continuous FV process, and that the stochastic integral preserves this decomposition.

Moving on to studying non-continuous semimartingales, it would be useful to extend the results just mentioned beyond the class of continuous FV processes. The first thought might be to simply drop the continuity requirement and look at all FV processes. After all, we know that every FV process is a semimartingale and, by the Bichteler-Dellacherie theorem, that every semimartingale decomposes as the sum of a local martingale and an FV process. However, this does not work out very well. The existence of local martingales with finite variation means that the decomposition given by the Bichteler-Dellacherie theorem is not unique, and need not commute with stochastic integration for integrands which are not locally bounded. Also, it is possible for the stochastic integral of a predictable {\xi} with respect to an FV process X to be well-defined even if {\xi} is not Lebesgue-Stieltjes integrable with respect to X along its sample paths. In this case, the integral {\int\xi\,dX} is not itself an FV process. See this post for examples where this happens.

Instead, when we do not want to restrict ourselves to continuous processes, it turns out that the class of predictable FV processes is the correct generalisation to use. By definition, a process is predictable if it is measurable with respect to the sigma-algebra generated by the adapted and left-continuous processes so, in particular, continuous FV processes are predictable. We can show that all predictable FV local martingales are constant (Lemma 2 below), which will imply that decompositions into the sum of local martingales and predictable FV processes are unique (up to constant processes). I do not look at general semimartingales in this post, so will not prove the existence of such decompositions, although they do follow quickly from the results stated here. We can also show that predictable FV processes are very well behaved with respect to stochastic integration. A predictable process {\xi} is integrable with respect to a predictable FV process X in the stochastic sense if and only if it is Lebesgue-Stieltjes integrable along the sample paths, in which case the stochastic and Lebesgue-Stieltjes integrals agree. Also, {\int\xi\,dX} will again be a predictable FV process. See Theorem 6 below.

In the previous post on continuous semimartingales, it was also shown that the continuous FV processes can be characterised in terms of their quadratic variations and covariations. They are precisely the semimartingales with zero quadratic variation. Alternatively, they are continuous semimartingales which have zero quadratic covariation with all local martingales. We start by extending this characterisation to the class of predictable FV processes. As always, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})} and two stochastic processes are considered to be equal if they are equivalent up to evanescence. Recall that, in these notes, the notation {[X]^c_t=[X]_t-\sum_{s\le t}(\Delta X_s)^2} is used to denote the continuous part of the quadratic variation of a semimartingale X.

Theorem 1 For a process X, the following are equivalent.

  1. X is a predictable FV process.
  2. X is a predictable semimartingale with {[X]^c=0}.
  3. X is a semimartingale such that {[X,M]} is a local martingale for all local martingales M.
  4. X is a semimartingale such that {[X,M]} is a local martingale for all uniformly bounded cadlag martingales M.

Continue reading “Predictable FV Processes”

Predictable Stopping Times

Although this post is under the heading of `the general theory of semimartingales’ it is not, strictly speaking, about semimartingales at all. Instead, I will be concerned with a characterization of predictable stopping times. The reason for including this now is twofold. First, the results are too advanced to have been proven in the earlier post on predictable stopping times, and reasonably efficient self-contained proofs can only be given now that we have built up a certain amount of stochastic calculus theory. Second, the results stated here are indispensable to the further study of semimartingales. In particular, standard semimartingale decompositions require some knowledge of predictable processes and predictable stopping times.

Recall that a stopping time {\tau} is said to be predictable if there exists a sequence of stopping times {\tau_n\le\tau} increasing to {\tau} and such that {\tau_n < \tau} whenever {\tau > 0}. Also, the predictable sigma-algebra {\mathcal{P}} is defined as the sigma-algebra generated by the left-continuous and adapted processes. Stated like this, these two concepts can appear quite different. However, as was previously shown, stochastic intervals of the form {[\tau,\infty)} for predictable times {\tau} are all in {\mathcal{P}} and, in fact, generate the predictable sigma-algebra.

The main result (Theorem 1) of this post is to show that a converse statement holds, so that {[\tau,\infty)} is in {\mathcal{P}} if and only if the stopping time {\tau} is predictable. This rather simple sounding result does have many far-reaching consequences. We can use it to show that all cadlag predictable processes are locally bounded, that local martingales are predictable if and only if they are continuous, and also to give a characterization of cadlag predictable processes in terms of their jumps. Some very strong statements about stopping times also follow without much difficulty for certain special stochastic processes. For example, if the underlying filtration is generated by a Brownian motion then every stopping time is predictable. Actually, this is true whenever the filtration is generated by a continuous Feller process. It is also possible to give a surprisingly simple characterization of stopping times for filtrations generated by arbitrary non-continuous Feller processes. Precisely, a stopping time {\tau} is predictable if the underlying Feller process is almost surely continuous at time {\tau}, and is totally inaccessible if the process is almost surely discontinuous at {\tau}.

As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}. I now give a statement and proof of the main result of this post. Note that the equivalence of the four conditions below means that any of them can be used as alternative definitions of predictable stopping times. Often, the first condition below is used instead. Stopping times satisfying the definition used in these notes are sometimes called announceable, with the sequence {\tau_n\uparrow\tau} said to announce {\tau} (this terminology is used by, e.g., Rogers & Williams). Stopping times satisfying property 3 below, which is easily seen to be equivalent to 2, are sometimes called fair. Then, the following theorem says that the sets of predictable, fair and announceable stopping times all coincide.

Theorem 1 Let {\tau} be a stopping time. Then, the following are equivalent.

  1. {[\tau]\in\mathcal{P}}.
  2. {\Delta M_\tau1_{[\tau,\infty)}} is a local martingale for all local martingales M.
  3. {{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0} for all cadlag bounded martingales M.
  4. {\tau} is predictable.

Continue reading “Predictable Stopping Times”

Continuous Semimartingales

A stochastic process is a semimartingale if and only if it can be decomposed as the sum of a local martingale and an FV process. This is stated by the Bichteler-Dellacherie theorem or, alternatively, is often taken as the definition of a semimartingale. For continuous semimartingales, which are the subject of this post, things simplify considerably. The terms in the decomposition can be taken to be continuous, in which case they are also unique. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}, all processes are real-valued, and two processes are considered to be the same if they are indistinguishable.

Theorem 1 A continuous stochastic process X is a semimartingale if and only if it decomposes as

\displaystyle  X=M+A (1)

for a continuous local martingale M and continuous FV process A. Furthermore, assuming that {A_0=0}, decomposition (1) is unique.

Proof: As sums of local martingales and FV processes are semimartingales, X is a semimartingale whenever it satisfies the decomposition (1). Furthermore, if {X=M+A=M^\prime+A^\prime} were two such decompositions with {A_0=A^\prime_0=0} then {M-M^\prime=A^\prime-A} is both a local martingale and a continuous FV process. Therefore, {A^\prime-A} is constant, so {A=A^\prime} and {M=M^\prime}.

It just remains to prove the existence of decomposition (1). However, X is continuous and, hence, is locally square integrable. So, Lemmas 4 and 5 of the previous post say that we can decompose {X=M+A} where M is a local martingale, A is an FV process and the quadratic covariation {[M,A]} is a local martingale. As X is continuous we have {\Delta M=-\Delta A} so that, by the properties of covariations,

\displaystyle  -[M,A]_t=-\sum_{s\le t}\Delta M_s\Delta A_s=\sum_{s\le t}(\Delta A_s)^2. (2)

We have shown that {-[M,A]} is a nonnegative local martingale so, in particular, it is a supermartingale. This gives {\mathbb{E}[-[M,A]_t]\le\mathbb{E}[-[M,A]_0]=0}. Then (2) implies that {\Delta A} is zero and, hence, A and {M=X-A} are continuous. ⬜

Using decomposition (1), it can be shown that a predictable process {\xi} is X-integrable if and only if it is both M-integrable and A-integrable. Then, the integral with respect to X breaks down into the sum of the integrals with respect to M and A. This greatly simplifies the construction of the stochastic integral for continuous semimartingales. The integral with respect to the continuous FV process A is equivalent to Lebesgue-Stieltjes integration along sample paths, and it is possible to construct the integral with respect to the continuous local martingale M for the full set of M-integrable integrands using the Ito isometry. Many introductions to stochastic calculus focus on integration with respect to continuous semimartingales, which is made much easier because of these results.
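As a rough numerical sketch of this splitting (my own illustration; the discretization scheme is an assumption, not part of the text), take {X=W+A} with W a standard Brownian motion and {A_t=t}. The integral against the martingale part is approximated by left-endpoint (hence non-anticipating) Riemann sums, while the integral against A is an ordinary pathwise integral. For {\xi=W}, the Itô part should approach the known value {(W_T^2-T)/2}:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 200_000, 1.0
dt = T / n

# One sample path of W on a uniform grid.
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.concatenate([[0.0], dW.cumsum()])

# Integrand sampled at left endpoints, so the Riemann sums are non-anticipating.
xi = W[:-1]

ito_part = np.sum(xi * dW)         # approximates int_0^T W dW = (W_T^2 - T) / 2
stieltjes_part = np.sum(xi * dt)   # pathwise integral against A_t = t
total = ito_part + stieltjes_part  # approximates int_0^T xi dX for X = W + A

print(ito_part, (W[-1]**2 - T) / 2)
```

The split of `total` into its two parts mirrors decomposition (4): an Itô (local martingale) term plus a Lebesgue-Stieltjes (FV) term.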

Theorem 2 Let {X=M+A} be the decomposition of the continuous semimartingale X into a continuous local martingale M and continuous FV process A. Then, a predictable process {\xi} is X-integrable if and only if

\displaystyle  \int_0^t\xi^2\,d[M]+\int_0^t\vert\xi\vert\,\vert dA\vert < \infty (3)

almost surely, for each time {t\ge0}. In that case, {\xi} is both M-integrable and A-integrable and,

\displaystyle  \int\xi\,dX=\int\xi\,dM+\int\xi\,dA (4)

gives the decomposition of {\int\xi\,dX} into its local martingale and FV terms.

Continue reading “Continuous Semimartingales”

The Bichteler-Dellacherie Theorem

In this post, I will give a statement and proof of the Bichteler-Dellacherie theorem describing the space of semimartingales. A semimartingale, as defined in these notes, is a cadlag adapted stochastic process X such that the stochastic integral {\int\xi\,dX} is well-defined for all bounded predictable integrands {\xi}. More precisely, an integral should exist which agrees with the explicit expression for elementary integrands, and satisfies bounded convergence in the following sense. If {\{\xi^n\}_{n=1,2,\ldots}} is a uniformly bounded sequence of predictable processes tending to a limit {\xi}, then {\int_0^t\xi^n\,dX\rightarrow\int_0^t\xi\,dX} in probability as n goes to infinity. If such an integral exists, then it is uniquely defined up to zero probability sets.

An immediate consequence of bounded convergence is that the set of integrals {\int_0^t\xi\,dX} for a fixed time t and bounded elementary integrands {\vert\xi\vert\le1} is bounded in probability. That is,

\displaystyle  \left\{\int_0^t\xi\,dX\colon\xi{\rm\ is\ elementary},\ \vert\xi\vert\le1\right\} (1)

is bounded in probability, for each {t\ge0}. For cadlag adapted processes, it was shown in a previous post that this is both a necessary and sufficient condition to be a semimartingale. Some authors use the property that (1) is bounded in probability as the definition of semimartingales (e.g., Protter, Stochastic Integration and Differential Equations). The existence of the stochastic integral for arbitrary predictable integrands does not follow particularly easily from this definition, at least, not without using results on extensions of vector valued measures. On the other hand, if you are content to restrict to integrands which are left-continuous with right limits, the integral can be constructed very efficiently and, furthermore, such integrands are sufficient for many uses (integration by parts, Ito’s formula, a large class of stochastic differential equations, etc).
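For reference, the explicit expression for an elementary integrand {\xi=\sum_k Z_k1_{(t_k,t_{k+1}]}}, with each {Z_k} known at time {t_k}, is just a finite sum over grid increments of X. A minimal sketch (the function name is my own):

```python
import numpy as np

def elementary_integral(Z, X):
    """Integral sum_k Z_k * (X_{t_{k+1}} - X_{t_k}) of the elementary
    integrand xi = sum_k Z_k 1_{(t_k, t_{k+1}]} against a path X sampled
    at times t_0 < t_1 < ... < t_n; Z_k must be known at time t_k."""
    Z, X = np.asarray(Z), np.asarray(X)
    return float(np.sum(Z * np.diff(X)))

# Deterministic check: Z = (1, 2) against increments (1, 2) gives 1*1 + 2*2 = 5.
print(elementary_integral([1.0, 2.0], [0.0, 1.0, 3.0]))  # 5.0
```

Boundedness in probability of the set (1) is then a statement about such sums as {\xi} ranges over elementary processes with {\vert\xi\vert\le1}.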

It was previously shown in these notes that, if X can be decomposed as {X=M+V} for a local martingale M and FV process V, then it is possible to construct the stochastic integral, so X is a semimartingale. The importance of the Bichteler-Dellacherie theorem is that it tells us that a process is a semimartingale if and only if it is the sum of a local martingale and an FV process. In fact, this was the historical definition of semimartingales, and it is still probably the most common one.

Throughout, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}, and all processes are real-valued.

Theorem 1 (Bichteler-Dellacherie) For a cadlag adapted process X, the following are equivalent.

  1. X is a semimartingale.
  2. For each {t\ge0}, the set given by (1) is bounded in probability.
  3. X is the sum of a local martingale and an FV process.

Furthermore, the local martingale term in 3 can be taken to be locally bounded.

Continue reading “The Bichteler-Dellacherie Theorem”

The General Theory of Semimartingales

Having completed the series of posts applying the methods of stochastic calculus to various special types of processes, I now return to the development of the theory. The next few posts of these notes will be grouped under the heading `The General Theory of Semimartingales’. Subjects which will be covered include the classification of predictable stopping times, integration with respect to continuous and predictable FV processes, decompositions of special semimartingales, the Bichteler-Dellacherie theorem, the Doob-Meyer decomposition and the theory of quasimartingales.

One of the main results is the Bichteler-Dellacherie theorem describing the class of semimartingales which, in these notes, were defined to be cadlag adapted processes with respect to which the stochastic integral can be defined (that is, they are good integrators). It was shown that these include the sums of local martingales and FV processes. The Bichteler-Dellacherie theorem says that this is the full class of semimartingales. Classically, semimartingales were defined as a sum of a local martingale and an FV process so, an alternative statement of the theorem is that the classical definition agrees with the one used in these notes. Further results, such as the Doob-Meyer decomposition for submartingales and Rao’s decomposition for quasimartingales, will follow quickly from this.

Logically, the structure of these notes will be almost directly opposite to the historical development of the results. Originally, much of the development of the stochastic integral was based on the Doob-Meyer decomposition which, in turn, relied on some advanced ideas such as the predictable and dual predictable projection theorems. However, here, we have already introduced stochastic integration without recourse to such general theory, and can instead make use of this in the theory. The reasons I have taken this approach are as follows. First, stochastic integration is a particularly straightforward and useful technique for many applications, so it is desirable to introduce this early on. Second, although it is possible to use the general theory of processes in the construction of the integral, such an approach seems rather distinct from the intuitive understanding of stochastic integration as well as superfluous to many of its properties. So it seemed more natural from the point of view of these notes to define the integral first, guided by the properties of the (non-stochastic) Lebesgue integral, then show how its elementary properties follow from the definitions, and develop the further theory later. Continue reading “The General Theory of Semimartingales”

Properties of Lévy Processes

Lévy processes, which are defined as having stationary and independent increments, were introduced in the previous post. It was seen that the distribution of a d-dimensional Lévy process X is determined by the characteristics (Σ, b, ν) via the Lévy-Khintchine formula,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle{\mathbb E}\left[e^{ia\cdot (X_t-X_0)}\right] = \exp(t\psi(a)),\smallskip\\ &\displaystyle\psi(a)=ia\cdot b-\frac12a^{\rm T}\Sigma a+\int_{{\mathbb R}^d}\left(e^{ia\cdot x}-1-\frac{ia\cdot x}{1+\Vert x\Vert}\right)\,d\nu(x). \end{array} (1)

The positive semidefinite matrix Σ describes the Brownian motion component of X, b is a drift term, and ν is a measure on {{\mathbb R}^d} such that ν(A) is the rate at which jumps ΔX ∈ A of X occur. Then, equation (1) gives us the characteristic function of the increments of the process.

In the current post, I will investigate some of the properties of such processes, and how they are related to the characteristics. In particular, we will be concerned with pathwise properties of X. It is known that Brownian motion and Cauchy processes have infinite variation in every nonempty time interval, whereas other Lévy processes — such as the Poisson process — are piecewise constant, only jumping at a discrete set of times. There are also purely discontinuous Lévy processes which have infinitely many discontinuities, yet are of finite variation, on every interval (e.g., the gamma process). Continue reading “Properties of Lévy Processes”

Lévy Processes

Figure 1: A Cauchy process sample path

Continuous-time stochastic processes with stationary independent increments are known as Lévy processes. In the previous post, it was seen that processes with independent increments are described by three terms — the covariance structure of the Brownian motion component, a drift term, and a measure describing the rate at which jumps occur. Being a special case of independent increments processes, the situation with Lévy processes is similar. However, stationarity of the increments does simplify things a bit. We start with the definition.

Definition 1 (Lévy process) A d-dimensional Lévy process X is a stochastic process taking values in {{\mathbb R}^d} such that

  • independent increments: {X_t-X_s} is independent of {\{X_u\colon u\le s\}} for any {s<t}.
  • stationary increments: {X_{s+t}-X_s} has the same distribution as {X_t-X_0} for any {s,t>0}.
  • continuity in probability: {X_s\rightarrow X_t} in probability as s tends to t.

More generally, it is possible to define the notion of a Lévy process with respect to a given filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}. In that case, we also require that X is adapted to the filtration and that {X_t-X_s} is independent of {\mathcal{F}_s} for all {s < t}. In particular, if X is a Lévy process according to definition 1 then it is also a Lévy process with respect to its natural filtration {\mathcal{F}_t=\sigma(X_s\colon s\le t)}. Note that slightly different definitions are sometimes used by different authors. It is often required that {X_0} is zero and that X has cadlag sample paths. These are minor points and, as will be shown, any process satisfying the definition above will admit a cadlag modification.

The most common example of a Lévy process is Brownian motion, where {X_t-X_s} is normally distributed with zero mean and variance {t-s} independently of {\mathcal{F}_s}. Other examples include Poisson processes, compound Poisson processes, the Cauchy process, gamma processes and the variance gamma process.

For example, the symmetric Cauchy distribution on the real numbers with scale parameter {\gamma > 0} has probability density function p and characteristic function {\phi} given by,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle p(x)=\frac{\gamma}{\pi(\gamma^2+x^2)},\smallskip\\ &\displaystyle\phi(a)\equiv{\mathbb E}\left[e^{iaX}\right]=e^{-\gamma\vert a\vert}. \end{array} (1)

From the characteristic function it can be seen that if X and Y are independent Cauchy random variables with scale parameters {\gamma_1} and {\gamma_2} respectively then {X+Y} is Cauchy with parameter {\gamma_1+\gamma_2}. We can therefore consistently define a stochastic process {X_t} such that {X_t-X_s} has the symmetric Cauchy distribution with parameter {t-s}, independent of {\{X_u\colon u\le s\}}, for any {s < t}. This is called a Cauchy process, which is a purely discontinuous Lévy process. See Figure 1.
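A quick simulation (my own sketch, not from the post) uses exactly this stability property: increments over an interval of length {\delta t} are symmetric Cauchy with scale {\delta t}, and the empirical characteristic function of {X_T-X_0} can be compared against {e^{-T\vert a\vert}} from (1):

```python
import numpy as np

rng = np.random.default_rng(7)
n, T, paths = 1_000, 1.0, 50_000
dt = T / n

# Each increment is symmetric Cauchy with scale dt, so X_T - X_0, being a sum
# of n independent scale-dt Cauchy variables, is symmetric Cauchy with scale T.
increments = dt * rng.standard_cauchy(size=(paths, n))
X_T = increments.sum(axis=1)

# Compare the empirical characteristic function at a with exp(-T |a|).
a = 2.0
emp = np.exp(1j * a * X_T).mean().real
print(emp, np.exp(-T * abs(a)))
```

Note that sample means of {X_T} itself would not converge (the Cauchy distribution has no mean); the characteristic function is the appropriate object to compare.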

Lévy processes are determined by the triple {(\Sigma,b,\nu)}, where {\Sigma} describes the covariance structure of the Brownian motion component, b is the drift component, and {\nu} describes the rate at which jumps occur. The distribution of the process is given by the Lévy-Khintchine formula, equation (3) below.

Theorem 2 (Lévy-Khintchine) Let X be a d-dimensional Lévy process. Then, there is a unique function {\psi\colon{\mathbb R}^d\rightarrow{\mathbb C}} such that

\displaystyle  {\mathbb E}\left[e^{ia\cdot (X_t-X_0)}\right]=e^{t\psi(a)} (2)

for all {a\in{\mathbb R}^d} and {t\ge0}. Also, {\psi(a)} can be written as

\displaystyle  \psi(a)=ia\cdot b-\frac{1}{2}a^{\rm T}\Sigma a+\int _{{\mathbb R}^d}\left(e^{ia\cdot x}-1-\frac{ia\cdot x}{1+\Vert x\Vert}\right)\,d\nu(x) (3)

where {\Sigma}, b and {\nu} are uniquely determined and satisfy the following,

  1. {\Sigma\in{\mathbb R}^{d^2}} is a positive semidefinite matrix.
  2. {b\in{\mathbb R}^d}.
  3. {\nu} is a Borel measure on {{\mathbb R}^d} with {\nu(\{0\})=0} and,
    \displaystyle  \int_{{\mathbb R}^d}\Vert x\Vert^2\wedge 1\,d\nu(x)<\infty. (4)

Furthermore, {(\Sigma,b,\nu)} uniquely determine all finite distributions of the process {X-X_0}.

Conversely, if {(\Sigma,b,\nu)} is any triple satisfying the three conditions above, then there exists a Lévy process satisfying (2,3).
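As a sanity check of the theorem (my own illustration), consider a compound Poisson process with rate {\lambda} and standard normal jumps J. Here {\nu} is the finite measure {\lambda N(0,1)}, {\Sigma=0} and, after absorbing the truncation term {ia\cdot x/(1+\Vert x\Vert)} into b, formula (3) reduces to {\psi(a)=\lambda({\mathbb E}[e^{iaJ}]-1)=\lambda(e^{-a^2/2}-1)}. Comparing with the empirical characteristic function:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, a, paths = 2.0, 1.5, 1.0, 100_000

# Compound Poisson: N_t ~ Poisson(lam * t) jumps by time t, each jump J ~ N(0,1).
N = rng.poisson(lam * t, size=paths)
X_t = np.array([rng.normal(size=k).sum() for k in N])

# Levy-Khintchine exponent with Sigma = 0 and the truncation term folded into b:
# psi(a) = lam * (E[exp(iaJ)] - 1) = lam * (exp(-a**2 / 2) - 1).
psi = lam * (np.exp(-a**2 / 2) - 1.0)
emp = np.exp(1j * a * X_t).mean()
print(emp.real, np.exp(t * psi))
```

By symmetry of the jump distribution the imaginary part of the empirical characteristic function should vanish up to Monte Carlo error.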

Continue reading “Lévy Processes”

Asymptotic Expansions of a Recurrence Relation

In today’s post, I investigate a simple recurrence relation and show how it is possible to describe its behaviour asymptotically at large times. The relation describing how the series evolves at a time n will depend both on its value at the earlier time n/2 and on whether n is even or odd, which, as we will see, introduces a `period doubling’ behaviour in the asymptotics. The proof will involve defining a Dirichlet generating function for the series, showing that it satisfies a functional equation and has a meromorphic extension to the whole complex plane, and then inverting the generating function with Perron’s formula. Cauchy’s residue theorem relates the terms of the asymptotic expansion to the poles of the meromorphic extension of the Dirichlet series. Such an approach is well-known in analytic number theory, originally used by Riemann to give the explicit formula for the prime counting function in terms of the zeros of the Riemann zeta function. However, Dirichlet generating functions can be applied in many other situations to generate asymptotic expansions (see the references for examples). Although I will concentrate on a specific difference equation here, the techniques described are quite general and also apply to many other kinds of series.

This post grew out of an answer I gave to a question by Byron Schmuland at the math.stackexchange website. Continue reading “Asymptotic Expansions of a Recurrence Relation”

Processes with Independent Increments

In a previous post, it was seen that all continuous processes with independent increments are Gaussian. We move on now to look at a much more general class of independent increments processes which need not have continuous sample paths. Such processes can be completely described by their jump intensities, a Brownian term, and a deterministic drift component. However, this class of processes is large enough to capture the kinds of behaviour that occur for more general jump-diffusion processes. An important subclass is that of Lévy processes, which have independent and stationary increments. Lévy processes will be looked at in more detail in the following post, and includes as special cases, the Cauchy process, gamma processes, the variance gamma process, Poisson processes, compound Poisson processes and Brownian motion.

Recall that a process {\{X_t\}_{t\ge0}} has the independent increments property if {X_t-X_s} is independent of {\{X_u\colon u\le s\}} for all times {0\le s\le t}. More generally, we say that X has the independent increments property with respect to an underlying filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})} if it is adapted and {X_t-X_s} is independent of {\mathcal{F}_s} for all {s < t}. In particular, every process with independent increments also satisfies the independent increments property with respect to its natural filtration. Throughout this post, I will assume the existence of such a filtered probability space, and the independent increments property will be understood to be with regard to this space.

The process X is said to be continuous in probability if {X_s\rightarrow X_t} in probability as s tends to t. As we now state, a d-dimensional independent increments process X is uniquely specified by a triple {(\Sigma,b,\mu)} where {\mu} is a measure describing the jumps of X, {\Sigma} determines the covariance structure of the Brownian motion component of X, and b is an additional deterministic drift term.

Theorem 1 Let X be an {{\mathbb R}^d}-valued process with independent increments and continuous in probability. Then, there is a unique continuous function {{\mathbb R}^d\times{\mathbb R}_+\rightarrow{\mathbb C}}, {(a,t)\mapsto\psi_t(a)} such that {\psi_0(a)=0} and

\displaystyle  {\mathbb E}\left[e^{ia\cdot (X_t-X_0)}\right]=e^{\psi_t(a)} (1)

for all {a\in{\mathbb R}^d} and {t\ge0}. Also, {\psi_t(a)} can be written as

\displaystyle  \psi_t(a)=ia\cdot b_t-\frac{1}{2}a^{\rm T}\Sigma_t a+\int _{{\mathbb R}^d\times[0,t]}\left(e^{ia\cdot x}-1-\frac{ia\cdot x}{1+\Vert x\Vert}\right)\,d\mu(x,s) (2)

where {\Sigma_t}, {b_t} and {\mu} are uniquely determined and satisfy the following,

  1. {t\mapsto\Sigma_t} is a continuous function from {{\mathbb R}_+} to {{\mathbb R}^{d^2}} such that {\Sigma_0=0} and {\Sigma_t-\Sigma_s} is positive semidefinite for all {t\ge s}.
  2. {t\mapsto b_t} is a continuous function from {{\mathbb R}_+} to {{\mathbb R}^d}, with {b_0=0}.
  3. {\mu} is a Borel measure on {{\mathbb R}^d\times{\mathbb R}_+} with {\mu(\{0\}\times{\mathbb R}_+)=0}, {\mu({\mathbb R}^d\times\{t\})=0} for all {t\ge 0} and,
    \displaystyle  \int_{{\mathbb R}^d\times[0,t]}\Vert x\Vert^2\wedge 1\,d\mu(x,s)<\infty. (3)

Furthermore, {(\Sigma,b,\mu)} uniquely determine all finite distributions of the process {X-X_0}.

Conversely, if {(\Sigma,b,\mu)} is any triple satisfying the three conditions above, then there exists a process with independent increments satisfying (1,2).

Continue reading “Processes with Independent Increments”