Semimartingale Completeness

A sequence ${X^n}$ of stochastic processes is said to converge to a process X under the semimartingale topology if the following two conditions hold as n goes to infinity. First, ${X^n_0}$ should tend to ${X_0}$ in probability. Second, for every sequence ${\xi^n}$ of elementary predictable processes with ${\vert\xi^n\vert\le 1}$,

 $\displaystyle \int_0^t\xi^n\,dX^n-\int_0^t\xi^n\,dX\rightarrow 0$

in probability for all times t. For short, this will be denoted by ${X^n\xrightarrow{\rm sm}X}$.

The semimartingale topology is particularly well suited to the class of semimartingales and to stochastic integration. Previously, it was shown that the space of cadlag adapted processes is complete under semimartingale convergence. In this post, it will be shown that the set of semimartingales is also complete. That is, if a sequence ${X^n}$ of semimartingales converges to a limit X under the semimartingale topology, then X is also a semimartingale.

Theorem 1 The space of semimartingales is complete under the semimartingale topology.

The same is true of the space of stochastic integrals defined with respect to any given semimartingale. In fact, for a semimartingale X, the set of all processes which can be expressed as a stochastic integral ${\int\xi\,dX}$ can be characterized as follows: it is precisely the closure, under the semimartingale topology, of the set of elementary integrals of X. This result was originally due to Memin, using a rather different proof from the one given here. The method used in this post relies only on elementary properties of stochastic integrals, such as the dominated convergence theorem.

Theorem 2 Let X be a semimartingale. Then, a process Y is of the form ${Y=\int\xi\,dX}$ for some ${\xi\in L^1(X)}$ if and only if there is a sequence ${\xi^n}$ of bounded elementary processes with ${\int\xi^n\,dX\xrightarrow{\rm sm}Y}$.

Writing S for the set of processes of the form ${\int\xi\,dX}$ for bounded elementary ${\xi}$, and ${\bar S}$ for its closure under the semimartingale topology, the statement of the theorem is equivalent to

 $\displaystyle \bar S=\left\{\int\xi\,dX\colon \xi\in L^1(X)\right\}.$ (1)

Recall that the semimartingale topology is generated by a translation-invariant semimetric ${D^{\rm sm}(X-Y)}$ satisfying the following basic properties.

1. ${D^{\rm sm}(X+Y)\le D^{\rm sm}(X)+D^{\rm sm}(Y)}$.
2. ${D^{\rm sm}(\lambda X)\le D^{\rm sm}(X)}$ for all real numbers ${\vert\lambda\vert\le 1}$.

To verify that such a semimetric generates a vector topology, under which addition of processes and multiplication by scalars are continuous, the only other required property is that ${D^{\rm sm}(\lambda_n X)\rightarrow 0}$ for all sequences of real numbers ${\lambda_n\rightarrow 0}$. In fact, this property precisely picks out the semimartingales.

Theorem 3 A cadlag adapted process X is a semimartingale if and only if ${\lambda_nX\xrightarrow{\rm sm}0}$ for all sequences of real numbers ${\lambda_n\rightarrow 0}$.

Proof: The condition that ${\lambda_nX\xrightarrow{\rm sm}0}$ is equivalent to the property that, for all sequences ${\xi^n}$ of elementary processes with ${\vert\xi^n\vert\le 1}$, the integrals ${\lambda_n\int_0^t\xi^n\,dX}$ tend to zero in probability. By the sequential characterization of boundedness in probability, this is equivalent to the set ${\{\int_0^t\xi\,dX\colon\xi{\rm\ elementary},\,\vert\xi\vert\le 1\}}$ being bounded in probability for each t. As previously shown, this property characterizes semimartingales. ⬜
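For reference, the sequential characterization of boundedness in probability invoked here can be written out explicitly (a standard fact, recorded as a reminder):

```latex
% A set S of random variables is bounded in probability if
%   \sup_{U\in S}\mathbb{P}(\vert U\vert>K)\rightarrow 0
% as K\rightarrow\infty. The sequential characterization states that
% this holds if and only if \lambda_n U_n\rightarrow 0 in probability
% for every sequence U_n\in S and real numbers \lambda_n\rightarrow 0.
\sup_{U\in S}\mathbb{P}\left(\vert U\vert>K\right)
\xrightarrow[K\rightarrow\infty]{}0
\quad\Longleftrightarrow\quad
\lambda_n U_n\xrightarrow{\mathbb{P}}0
\ \text{ whenever }U_n\in S,\ \lambda_n\rightarrow 0.
```

Applying this with S equal to the set of elementary integrals ${\int_0^t\xi\,dX}$, ${\vert\xi\vert\le1}$, translates the condition ${\lambda_nX\xrightarrow{\rm sm}0}$ into the boundedness in probability required of semimartingales.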

It follows that the semimartingales form a topological vector space.

Corollary 4 Semimartingale convergence defines a vector topology on the space of semimartingales.

We now move on to the proofs of the completeness theorems. The starting point is to show that every bounded predictable integrand can be approximated by bounded elementary integrands in the following sense.

Lemma 5 Let X be a semimartingale and ${\xi}$ be a predictable process with ${\vert\xi\vert\le K}$ for some constant K. Then, there is a sequence of elementary processes ${\xi^n}$ with ${\vert\xi^n\vert\le K}$ and such that ${\int\xi^n\,dX\xrightarrow{\rm sm}\int\xi\,dX}$.

Proof: Let S be the set of processes of the form ${\int\xi\,dX}$ for elementary ${\vert\xi\vert\le K}$. The lemma states that ${\int\xi\,dX}$ is in the semimartingale closure ${\bar S}$ for all predictable ${\vert\xi\vert\le K}$.

Let A be the set of predictable processes ${\xi}$ such that ${\int(\xi\wedge K)\vee(-K)\,dX\in\bar S}$. This contains the elementary processes. Furthermore, if ${\xi^n\in A}$ is a sequence converging pointwise to a limit ${\xi}$ then, as the truncated processes are uniformly bounded by K, dominated convergence gives

 $\displaystyle \int(\xi^n\wedge K)\vee(-K)\,dX\xrightarrow{\rm sm}\int(\xi\wedge K)\vee(-K)\,dX.$

So, ${\xi\in A}$. Then, the monotone class theorem says that all predictable processes are in A. ⬜
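The monotone class step can be made slightly more explicit. One standard functional form of the theorem, stated here as a reminder (the precise variant used in these notes may be phrased a little differently), is:

```latex
% Functional monotone class theorem (one common form): if a set A of
% processes contains the elementary predictable processes and is closed
% under pointwise limits of sequences then, as the elementary processes
% generate the predictable sigma-algebra, A contains every predictable
% process.
\left.\begin{array}{l}
{\rm elementary\ processes}\subseteq A,\smallskip\\
\xi^n\in A,\ \xi^n\rightarrow\xi{\rm\ pointwise}\ \Rightarrow\ \xi\in A
\end{array}\right\}
\ \Longrightarrow\ \{{\rm predictable\ processes}\}\subseteq A.
```

In the proof above, closure under pointwise sequential limits is exactly what dominated convergence provides.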

Next, stochastic integration of a bounded predictable process is a Lipschitz continuous map on the semimartingales.

Lemma 6 Let ${K\ge 1}$ be a constant and ${Y=\int\xi\,dX}$ for cadlag X and elementary ${\vert\xi\vert\le K}$. Then ${D^{\rm sm}(Y)\le K D^{\rm sm}(X)}$.

If, furthermore, X is a semimartingale then this holds for all predictable ${\vert\xi\vert\le K}$.

Proof: Recall that ${D^{\rm sm}(X)=\sum_{n=1}^\infty 2^{-n} D^{\rm sm}_n(X)}$ where, for each t, ${D^{\rm sm}_t}$ is defined to be the supremum of

 $\displaystyle {\mathbb E}\left[\left\vert \alpha_0X_0+\int_0^t\alpha\,dX\right\vert\wedge 1\right]$

over all elementary ${\alpha}$ with ${\vert\alpha\vert\le 1}$. In particular, if ${Y=\int\xi\,dX}$ for an elementary process ${\vert\xi\vert\le K}$ then ${K^{-1}\alpha\xi}$ is elementary and bounded by 1, giving

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\left\vert\int_0^t\alpha\,dY\right\vert\wedge 1\right] &\displaystyle= {\mathbb E}\left[\left\vert\int_0^t\alpha\xi\,dX\right\vert\wedge 1\right]\smallskip\\ &\displaystyle\le K{\mathbb E}\left[\left\vert\int_0^t K^{-1}\alpha\xi\,dX\right\vert\wedge 1\right]\smallskip\\ &\displaystyle\le K D^{\rm sm}_t(X). \end{array}$

Taking the supremum over all such ${\alpha}$ gives ${D^{\rm sm}_t(Y)\le K D^{\rm sm}_t(X)}$ and, therefore, ${D^{\rm sm}(Y)\le K D^{\rm sm}(X)}$.
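The middle inequality of the display above uses an elementary bound, valid for any real x and constant ${K\ge1}$, spelled out here for completeness:

```latex
% If \vert x\vert\le K then K(\vert x/K\vert\wedge 1)=\vert x\vert
%   \ge\vert x\vert\wedge 1.
% If \vert x\vert>K\ge 1 then K(\vert x/K\vert\wedge 1)=K\ge 1
%   =\vert x\vert\wedge 1.
% In either case,
\vert x\vert\wedge 1\ \le\ K\left(\vert x/K\vert\wedge 1\right).
```

Applying this under the expectation with ${x=\int_0^t\alpha\xi\,dX}$ gives the stated bound.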

Now suppose that X is a semimartingale and ${\vert\xi\vert\le K}$ is predictable. By Lemma 5 there exist elementary ${\vert\xi^n\vert\le K}$ such that ${Y^n\equiv\int\xi^n\,dX}$ converge to Y under the semimartingale topology. The reverse triangle inequality gives ${\vert D^{\rm sm}(Y)-D^{\rm sm}(Y^n)\vert\le D^{\rm sm}(Y-Y^n)\rightarrow 0}$ and, hence,

 $\displaystyle D^{\rm sm}(Y) = \lim_{n\rightarrow\infty}D^{\rm sm}(Y^n)\le K D^{\rm sm}(X).$ ⬜

The proof of Theorem 1, the completeness of the space of semimartingales, is as follows.

Theorem 7 Let ${X^n}$ be a sequence of semimartingales and ${X^n\xrightarrow{\rm sm}X}$. Then, X is a semimartingale.

Proof: According to the definition of semimartingales used in these notes, it needs to be shown that there is a well-defined stochastic integral with respect to X for bounded integrands which agrees with the explicit expression for elementary integrands and satisfies bounded convergence in probability.

Using the semimartingale topology, Lemma 6 says that, for any bounded elementary ${\xi}$, the map ${Y\mapsto\int\xi\,dY}$ is continuous on the space of cadlag processes Y. So,

 $\displaystyle \int\xi\,dX = \lim_{n\rightarrow\infty}\int\xi\,dX^n.$ (2)

If ${\xi}$ is a bounded predictable process then, again by Lemma 6, ${Y\mapsto\int\xi\,dY}$ is continuous on the space of semimartingales. In particular, ${\int\xi\,dX^n}$ is Cauchy in the semimartingale topology and, by completeness for cadlag processes, converges to a limit. Then, (2) can be used to extend the integral to all bounded predictable processes.

Now suppose that ${\vert\xi^m\vert\le K}$ is a sequence of predictable processes tending to a limit ${\xi}$, for constant ${K\ge 1}$. By dominated convergence, ${\int\xi^m\,dX^n\xrightarrow{\rm sm}\int\xi\,dX^n}$ as m goes to infinity, for each n. Finally, by Lemma 6, convergence in (2) holds uniformly over all ${\vert\xi\vert\le K}$ so it follows that ${\int\xi^m\,dX\xrightarrow{\rm sm}\int\xi\,dX}$ as required. ⬜
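To unpack the final step of the proof, here is a sketch of the implicit three-term estimate, writing ${\epsilon_n}$ for the uniform error in (2), an auxiliary quantity introduced here for illustration:

```latex
% Define the uniform error over integrands bounded by K,
\epsilon_n\equiv\sup_{\vert\zeta\vert\le K}
  D^{\rm sm}\left(\int\zeta\,dX^n-\int\zeta\,dX\right).
% Splitting the difference into three terms,
%   \int\xi^m dX-\int\xi dX
%     =\left(\int\xi^m dX-\int\xi^m dX^n\right)
%     +\int(\xi^m-\xi)\,dX^n
%     +\left(\int\xi\,dX^n-\int\xi\,dX\right),
% the triangle inequality gives
D^{\rm sm}\left(\int\xi^m\,dX-\int\xi\,dX\right)
\ \le\ 2\epsilon_n+D^{\rm sm}\left(\int(\xi^m-\xi)\,dX^n\right).
```

Fixing n and letting m go to infinity, dominated convergence sends the final term to zero, so the limsup is bounded by ${2\epsilon_n}$, which can be made arbitrarily small.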

For the remainder of the post, we turn our attention to the proof of Theorem 2. With S denoting the set of integrals ${\int\xi\,dX}$ for bounded elementary ${\xi}$, equality (1) needs to be established. We start by proving the inclusion

 $\displaystyle \left\{\int\xi\,dX\colon\xi\in L^1(X)\right\}\subseteq\bar S,$ (3)

which is established by the following lemma.

Lemma 8 Let X be a semimartingale and ${\xi\in L^1(X)}$. Then, ${\int\xi\,dX\in\bar S}$.

Proof: This is given by Lemma 5 in the case where ${\xi}$ is bounded. For general X-integrable ${\xi}$, dominated convergence gives

 $\displaystyle \int\xi\,dX = \lim_{n\rightarrow\infty}\int(\xi\wedge n)\vee(-n)\,dX\in\bar S,$

where membership of ${\bar S}$ follows as it is closed under semimartingale convergence. ⬜

Proving the reverse inclusion to (3) is more difficult. By definition, for any ${Y\in\bar S}$, there will be bounded elementary ${\xi^n}$ such that ${\int\xi^n\,dX\xrightarrow{\rm sm}Y}$. Then, the idea is to attempt to write Y as ${\int\xi\,dX}$ for ${\xi=\sum_n(\xi^{n+1}-\xi^n)+\xi^1}$. This sum need not converge, but we will show that it is possible to pass to a subsequence such that convergence holds absolutely outside of a negligible set.

For convenience, the notation ${D_X(\xi)\equiv D^{\rm sm}(\int\xi\,dX)}$ will be used, for semimartingales X and bounded predictable ${\xi}$. If ${\vert\alpha\vert\le\vert\beta\vert}$ are bounded predictable processes then, setting ${Y=\int\beta\,dX}$ and adopting the convention that ${\alpha\beta^{-1}=0}$ wherever ${\beta=0}$ (so that ${\vert\alpha\beta^{-1}\vert\le1}$ and ${\alpha=(\alpha\beta^{-1})\beta}$), Lemma 6 gives the following inequality

 $\displaystyle D_X(\alpha) = D^{\rm sm}\left(\int\alpha\beta^{-1}\,dY\right)\le D^{\rm sm}(Y) = D_X(\beta).$ (4)

The following will be used to show that certain sums of predictable processes converge absolutely to an X-integrable process.

Lemma 9 Let X be a semimartingale and ${\alpha^n}$ be a sequence of bounded predictable processes with ${\sum_n D_X(\alpha^n)<\infty}$. Set ${\alpha=\sum_n\vert\alpha^n\vert}$. Then, for each t,

 $\displaystyle \left\{\int_0^t\zeta\,dX\colon \zeta\in{\rm b}\mathcal{P},\vert\zeta\vert\le\alpha\right\}$ (5)

is bounded in probability. Furthermore, if ${\zeta^n\rightarrow 0}$ is a sequence of bounded predictable processes with ${\vert\zeta^n\vert\le\alpha}$ then ${\int\zeta^n\,dX\xrightarrow{\rm sm}0}$.

Proof: Set ${\beta^n=\sum_{k=1}^n\vert\alpha^k\vert}$. By pointwise subadditivity of the minimum, for any bounded predictable process ${\vert\zeta\vert\le\alpha}$,

 $\displaystyle \left\vert(\zeta\wedge\beta^n)\vee(-\beta^n)\right\vert\le\vert\zeta\vert\wedge\beta^n\le \sum_{k=1}^n\vert\zeta\vert\wedge\vert\alpha^k\vert.$

So, for constant ${\lambda}$, dominated convergence gives

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle D_X(\lambda\zeta) &\displaystyle=\lim_{n\rightarrow\infty}D_X\left(\lambda(\zeta\wedge\beta^n)\vee(-\beta^n)\right)\smallskip\\ &\displaystyle\le \lim_{n\rightarrow\infty}\sum_{k=1}^nD_X\left(\lambda\vert\zeta\vert\wedge\vert\alpha^k\vert\right)\smallskip\\ &\displaystyle=\sum_{n=1}^\infty D_X\left(\lambda\vert\zeta\vert\wedge\vert\alpha^n\vert\right). \end{array}$

The inequality here follows from the triangle inequality combined with (4): any predictable process ${\gamma}$ with ${\vert\gamma\vert\le\sum_{k=1}^n b_k}$, for nonnegative predictable ${b_k}$, decomposes as ${\gamma=\sum_{k=1}^n\gamma_k}$ with ${\vert\gamma_k\vert\le b_k}$. It follows that if ${\vert\lambda_m\vert\le 1}$ are real numbers and ${\vert\zeta^m\vert\le\alpha}$ are bounded predictable processes such that either ${\lambda_m\rightarrow 0}$ or ${\zeta^m\rightarrow 0}$ then,

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\lim_{m\rightarrow\infty}D_X(\lambda_m\zeta^m)&\displaystyle\le\lim_{m\rightarrow\infty}\sum_{n=1}^\infty D_X(\lambda_m\vert\zeta^m\vert\wedge\vert\alpha^n\vert)\smallskip\\ &\displaystyle=\sum_{n=1}^\infty\lim_{m\rightarrow\infty} D_X(\lambda_m\vert\zeta^m\vert\wedge\vert\alpha^n\vert)\smallskip\\ &\displaystyle=0. \end{array}$

Here, the terms inside the summation are bounded by ${D_X(\alpha^n)}$ and, as this has finite sum, the limit ${m\rightarrow\infty}$ can be commuted with the sum over n. Also, ${\vert\lambda_m\vert\,\vert\zeta^m\vert\wedge\vert\alpha^n\vert\le\vert\alpha^n\vert}$ are uniformly bounded and tend to zero as ${m\rightarrow\infty}$. So dominated convergence is used to deduce that ${D_X(\lambda_m\vert\zeta^m\vert\wedge\vert\alpha^n\vert)}$ tends to zero as m goes to infinity. It follows that ${\lambda_m\int\zeta^m\,dX\xrightarrow{\rm sm}0}$ as m goes to infinity. In particular, taking ${\lambda_m=1}$ and ${\zeta^m\rightarrow 0}$ gives ${\int\zeta^m\,dX\xrightarrow{\rm sm}0}$. Similarly, taking ${\lambda_m\rightarrow 0}$ gives ${\lambda_m\int_0^t\zeta^m\,dX\rightarrow 0}$ in probability for each ${t\ge 0}$ and, by the sequential characterization of boundedness, the set in (5) is bounded in probability as required. ⬜
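The first display of the proof relies on pointwise subadditivity of the minimum, a standard inequality recorded here for completeness:

```latex
% For nonnegative reals a and b_1,\dots,b_n,
a\wedge\sum_{k=1}^n b_k\ \le\ \sum_{k=1}^n a\wedge b_k,
% which, applied pointwise with a=\vert\zeta\vert and
% b_k=\vert\alpha^k\vert, gives
\left\vert(\zeta\wedge\beta^n)\vee(-\beta^n)\right\vert
\ \le\ \vert\zeta\vert\wedge\beta^n
\ \le\ \sum_{k=1}^n\vert\zeta\vert\wedge\vert\alpha^k\vert.
```

The scalar inequality follows by induction on n, the case n=1 being trivial.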

Next, it is shown that the sum ${\alpha=\sum_n\vert\alpha^n\vert}$ appearing in the above lemma is finite outside of a set which is negligible for the purposes of integration with respect to X.

Lemma 10 Let X and ${\alpha}$ be as in Lemma 9. Then, the set ${A\equiv\{\alpha=\infty\}\subseteq{\mathbb R}_+\times\Omega}$ is predictable and

 $\displaystyle \int 1_A\xi\,dX = 0$

for all bounded predictable ${\xi}$.

Proof: The fact that A is predictable follows from the condition that ${\alpha^n}$ are predictable. Now choose any bounded predictable ${\xi}$, any positive time t, and set ${U\equiv\int_0^t 1_A\xi\,dX}$. If ${\lambda_n}$ is a sequence of real numbers tending to infinity then, as ${\alpha=\infty}$ on A, the processes ${\vert\lambda_n 1_A\xi\vert}$ are all bounded by ${\alpha}$. Then, Lemma 9 says that the sequence ${\lambda_nU}$ is bounded in probability. However, by construction, ${\lambda_nU\rightarrow\pm\infty}$ as ${n\rightarrow\infty}$ whenever ${U\not=0}$. So, U=0 almost surely, as required. ⬜

Finally, the reverse inclusion to (3) can be shown, completing the proof of Theorem 2.

Lemma 11 Let X be a semimartingale, S be the set of integrals ${\int\xi\,dX}$ for bounded elementary ${\xi}$, and ${\bar S}$ be the semimartingale closure of S. Then,

 $\displaystyle \bar S\subseteq\left\{\int\xi\,dX\colon\xi\in L^1(X)\right\}.$

Proof: By definition, for any ${Y\in\bar S}$ there is a sequence of bounded elementary ${\xi^n}$ such that ${\int\xi^n\,dX\xrightarrow{\rm sm}Y}$. So, ${\int(\xi^m-\xi^n)\,dX\xrightarrow{\rm sm}0}$ as m,n go to infinity. Equivalently, ${D_X(\xi^m-\xi^n)\rightarrow 0}$. Passing to a subsequence if necessary, we may suppose that ${D_X(\xi^{n+1}-\xi^n)\le 2^{-n}}$.

Then, set ${\alpha^n=\xi^{n+1}-\xi^n}$ and ${\alpha=\sum_n\vert\alpha^n\vert}$. The sum

 $\displaystyle \sum_{n=1}^\infty D_X(\alpha^n)\le\sum_{n=1}^\infty 2^{-n}=1$

is finite. So, setting ${A\equiv\{\alpha=\infty\}}$, Lemma 9 shows that all sequences of bounded predictable processes ${\vert\zeta^n\vert\le 1_{A^c}\alpha}$ with ${\zeta^n\rightarrow 0}$ satisfy ${\int\zeta^n\,dX\xrightarrow{\rm sm}0}$. Therefore, ${1_{A^c}\alpha}$ is X-integrable. Furthermore, ${\sum_n\alpha^n}$ is absolutely convergent outside of A. Assuming, without loss of generality, that ${\xi^1}$ is identically zero, define the following limit

 $\displaystyle \xi\equiv \lim_{n\rightarrow\infty}1_{A^c}\xi^n=\sum_{n=1}^\infty1_{A^c}\alpha^n.$

As ${\vert1_{A^c}\xi^n\vert}$ are bounded by the X-integrable process ${1_{A^c}\alpha}$ and converge to ${\xi}$, ${\xi}$ is X-integrable and, by dominated convergence,

 $\displaystyle \int 1_{A^c}\xi^n\,dX\xrightarrow{\rm sm}\int\xi\,dX.$

Also, Lemma 10 gives ${\int 1_A\xi^n\,dX=0}$ and, hence, ${\int\xi^n\,dX=\int 1_{A^c}\xi^n\,dX}$. Therefore,

 $\displaystyle Y = \lim_{n\rightarrow\infty}\int\xi^n\,dX = \lim_{n\rightarrow\infty}\int 1_{A^c}\xi^n\,dX = \int\xi\,dX$

as required. ⬜

Notes

Completeness of the set of stochastic integrals with respect to a semimartingale was originally proved by Memin, in the paper Espaces de semi martingales et changement de probabilité. That paper uses quite different techniques from those employed in this post. In Memin’s approach, stochastic integration is defined with respect to decompositions into local martingale and finite variation terms. Completeness of stochastic integrals for the local martingale and finite variation components can be inferred from completeness of ${L^p}$ spaces. Then, changes of measure are used to prove completeness for arbitrary semimartingales.