Extending Filtered Probability Spaces

In stochastic calculus it is common to work with processes adapted to a filtered probability space { (\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})}. As with probability space extensions, it can sometimes be necessary to enlarge the underlying space to introduce additional events and processes. For example, many diffusions and local martingales can be expressed as integrals with respect to Brownian motion but, if the space does not already carry a Brownian motion, it may need to be enlarged to provide one to work with. Also, in the theory of stochastic differential equations, finding solutions can sometimes require enlarging the space.

Extending a probability space is a relatively straightforward concept, which I covered in an earlier post. Extending a filtered probability space is the same, except that it also involves enlarging the filtration {\{\mathcal F_t\}_{t\ge0}}. It is important to do this in a way which does not destroy properties of existing processes, such as their distributions conditional on the filtration at each time.

Let’s consider a filtered probability space {(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})}. An enlargement

\displaystyle \pi\colon (\Omega',\mathcal F',\{\mathcal F'_t\}_{t\ge0},{\mathbb P}')\rightarrow(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})

is, firstly, an extension of the probability spaces. It is a map from Ω′ to Ω measurable with respect to {\mathcal F'} and {\mathcal F}, and preserving probabilities. So ℙ′(π⁻¹E) = ℙ(E) for all { E\in\mathcal F}. In addition, it is required to be {\mathcal F'_t/\mathcal F_t} measurable for each time t ≥ 0, meaning that {\pi^{-1}(E)\in\mathcal F'_t} for all { E\in\mathcal F_t}. Consequently, any adapted process Xt lifts to an adapted process X*t = π*Xt on the larger space, defined by X*t(ω) = Xt(π(ω)).

As with extensions of probability spaces, this can be considered in two steps. First, we extend to the filtered probability space on Ω′ with induced sigma-algebra {\pi^*\mathcal F} consisting of sets π⁻¹E for { E\in\mathcal F}, and to the filtration {\pi^*\mathcal F_t}. This is essentially a no-op, since events and random variables on the original filtered probability space are in one-to-one correspondence with those on the enlarged space, up to zero probability events. Next, the sigma-algebras are enlarged to {\mathcal F'\supseteq\pi^*\mathcal F} and {\mathcal F'_t\supseteq\pi^*\mathcal F_t}. This is where new random events are added to the event space and filtration.

Such arbitrary extensions are too general for many uses in stochastic calculus where we merely want to add in some additional source of randomness. Consider, for example, a standard Brownian motion B defined on the original space so that, for any times s < t, Bt – Bs is normal and independent of {\mathcal F_s}. Does it necessarily lift to a Brownian motion on the enlarged space? The answer to this is no! It need not be the case that Bt – Bs is independent of {\mathcal F'_s}. For an extreme case, consider the situation where {(\Omega',\mathcal F',{\mathbb P}')=(\Omega,\mathcal F,{\mathbb P})} and π is the identity, so there is no enlargement of the sample space. If the filtration is extended to the maximum, {\mathcal F'_t=\mathcal F}, consider what happens to our Brownian motion. The increment Bt – Bs is {\mathcal F'_s}-measurable, so is not independent of it. In fact, conditioned on {\mathcal F'_0}, the entire path of B is deterministic. It is definitely not a Brownian motion with respect to this new filtration. Similarly, martingales, submartingales and supermartingales will not remain as such if we pass to this enlarged filtration.

The idea is that, if { Y={\mathbb E}[X\vert\mathcal F_t]} for random variables X, Y defined on our original probability space, then this relation should continue to hold in the extension: it is required that { Y^*={\mathbb E}[X^*\vert\mathcal F'_t]}. This is exactly relative independence of {\mathcal F'_t} and {\pi^*\mathcal F} over {\pi^*\mathcal F_t}.

Recall that two sigma-algebras {\mathcal G} and {\mathcal H} are relatively independent over a third {\mathcal K\subseteq\mathcal G\cap\mathcal H} if

\displaystyle {\mathbb P}(A\cap B) = {\mathbb E}\left[{\mathbb P}(A\vert\mathcal K){\mathbb P}(B\vert\mathcal K)\right]

for all { A\in\mathcal G} and { B\in\mathcal H}. The following properties are each equivalent to this definition:

  • {{\mathbb E}[XY\vert\mathcal K]={\mathbb E}[X\vert\mathcal K]{\mathbb E}[Y\vert\mathcal K]} for all bounded {\mathcal G}-measurable random variables X and {\mathcal H}-measurable Y.
  • {{\mathbb E}[X\vert\mathcal G]={\mathbb E}[X\vert\mathcal K]} for all bounded {\mathcal H}-measurable X.
  • {{\mathbb E}[X\vert\mathcal H]={\mathbb E}[X\vert\mathcal K]} for all bounded {\mathcal G}-measurable X.
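These identities are easy to check by hand on a finite sample space. The following Python sketch (the space, measure and events are my own illustrative choices, not from the post) builds two sigma-algebras which are relatively independent over σ(k) by construction, and verifies the defining identity numerically:

```python
from itertools import product

# Finite sample space: omega = (k, x, y).  Conditionally on k, the bits
# x and y are independent Bernoulli(p[k]).  Then G = sigma(k, x) and
# H = sigma(k, y) are relatively independent over K = sigma(k).
p = {0: 0.3, 1: 0.8}

prob = {}
for k, x, y in product((0, 1), repeat=3):
    px = p[k] if x else 1 - p[k]
    py = p[k] if y else 1 - p[k]
    prob[(k, x, y)] = 0.5 * px * py   # P(k) = 1/2

def P(event):
    """Probability of a set of sample points."""
    return sum(prob[w] for w in event)

def P_given_K(event, k):
    """Conditional probability given K; K's atoms are the level sets of k."""
    atom = [w for w in prob if w[0] == k]
    return P([w for w in atom if w in event]) / P(atom)

A = [w for w in prob if w[1] == 1]   # {x = 1}, a G-measurable event
B = [w for w in prob if w[2] == 1]   # {y = 1}, an H-measurable event

lhs = P([w for w in A if w in B])            # P(A n B)
rhs = sum(P([w for w in prob if w[0] == k])  # E[ P(A|K) P(B|K) ]
          * P_given_K(A, k) * P_given_K(B, k)
          for k in (0, 1))
print(abs(lhs - rhs) < 1e-12)   # True: relative independence holds
```

Note that A and B are genuinely dependent here (both lean on k), so plain independence fails; only the relative version over σ(k) holds.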

This leads us to the idea of a standard extension of filtered probability spaces.

Definition 1 An extension of filtered probability spaces

\displaystyle \pi\colon(\Omega',\mathcal F', \{\mathcal F'_t\}_{t\ge0},{\mathbb P}')\rightarrow(\Omega,\mathcal F, \{\mathcal F_t\}_{t\ge0},{\mathbb P})

is standard if, for each time t ≥ 0, the sigma-algebras {\mathcal F'_t} and {\pi^*\mathcal F} are relatively independent over {\pi^*\mathcal F_t}.


Probability Space Extensions and Relative Products

According to Kolmogorov’s axioms, to define a probability space we start with a set Ω and an event space consisting of a sigma-algebra F  on Ω. A probability measure on this gives the probability space (Ω, F , ℙ), on which we can define random variables as measurable maps from Ω to the reals or other measurable space.

However, it is common practice to suppress explicit mention of the underlying sample space Ω. The values of a random variable X: Ω → ℝ are simply written as X, rather than X(ω) for ω ∈ Ω. It is intuitively thought of as a real number which happens to be random, rather than a function. For one thing, we usually do not really care what the sample space is and, instead, only care about events and their probabilities, and about random variables and their expectations. This philosophy has some benefits. Frequently, when performing constructions, it can be useful to introduce new supplementary random variables to work with. It may be necessary to enlarge the sample space and add new events to the sigma-algebra to accommodate these. If the underlying space is not set in stone then this is straightforward to do, and we can continue to work with these new variables as if they were always there from the start.

Definition 1 An extension π of a probability space (Ω, F , ℙ) to a new space (Ω′, F ′, ℙ′),

\displaystyle \pi\colon(\Omega',\mathcal F',{\mathbb P}')\rightarrow(\Omega,\mathcal F,{\mathbb P}),

is a probability-preserving measurable map π: Ω′ → Ω. That is, ℙ′(π⁻¹E) = ℙ(E) for events E ∈ F .

By construction, events E ∈ F  pull back to events π⁻¹E ∈ F ′ with the same probabilities. Random variables X defined on (Ω, F , ℙ) lift to variables π*X with the same distribution defined on (Ω′, F ′, ℙ′), given by π*X(ω) ≡ X(π(ω)). I will use the notation X* in place of π*X for brevity although, in applications, it is common to reuse the same symbol X and simply note that we are now working with respect to an enlarged probability space if necessary.

\displaystyle \arraycolsep=4pt\begin{array}{rcl} \Omega'&\xrightarrow{\displaystyle\ \pi\ }&\Omega\medskip\\ & \hspace{-2em}{}_{{}_{\displaystyle X^*}}\hspace{-0.6em}\searrow&\Big\downarrow X\medskip\\ &&\,{\mathbb R} \end{array}

The extension can be thought of in two steps. First, the enlargement of the sample space, π: Ω′ → Ω, on which we induce the sigma-algebra π*F  consisting of events π⁻¹E for E ∈ F , and the measure ℙ′(π⁻¹E) = ℙ(E). This is essentially a no-op, since events and random variables on the initial space are in one-to-one correspondence with those on the enlarged space (at least, up to zero probability events). Next, we enlarge the sigma-algebra to F ′ ⊇ π*F  and extend the measure ℙ′ to this. It is this second step which introduces new events and random variables.
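On a finite space both steps can be made concrete. The following Python sketch (the coin-and-die setup is my own toy example) extends a fair coin by an independent die, and checks that the projection preserves probabilities and that lifted random variables keep their distribution:

```python
from itertools import product
from fractions import Fraction

# Original space: a fair coin.
Omega = ['H', 'T']
P = {'H': Fraction(1, 2), 'T': Fraction(1, 2)}

# Enlarged space: the coin together with an independent fair die.
# The map pi simply forgets the die.
Omega1 = list(product(Omega, range(1, 7)))
P1 = {w: P[w[0]] * Fraction(1, 6) for w in Omega1}
pi = lambda w: w[0]

# pi preserves probabilities: P'(pi^{-1} E) = P(E) for every event E.
for E in (['H'], ['T'], ['H', 'T'], []):
    pulled = [w for w in Omega1 if pi(w) in E]
    assert sum(P1[w] for w in pulled) == sum(P[e] for e in E)

# A random variable X on Omega lifts to X* = X o pi with the same law.
X = {'H': 1, 'T': 0}
Xstar = {w: X[pi(w)] for w in Omega1}
law = lambda f, pr: {v: sum(pr[w] for w in pr if f[w] == v)
                     for v in set(f.values())}
print(law(X, P) == law(Xstar, P1))   # True
```

The enlargement adds genuinely new random variables, such as the die value itself, which are not measurable with respect to the pulled-back sigma-algebra π*F .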

Since we may want to extend a probability space more than once, I look at how successive extensions combine. Consider an extension π of the original probability space, and then a further extension ρ of this.

\displaystyle (\Omega'',\mathcal F'',{\mathbb P}'')\xrightarrow{\rho} (\Omega',\mathcal F',{\mathbb P}')\xrightarrow{\pi} (\Omega,\mathcal F,{\mathbb P}).

These can be combined into a single extension ϕ = π○ρ of the original space,

\displaystyle \phi\colon(\Omega'',\mathcal F'',{\mathbb P}'')\rightarrow(\Omega,\mathcal F,{\mathbb P}).

Lemma 2 The composition ϕ = π○ρ is itself an extension of the probability space.

Proof: As compositions of measurable maps are measurable, it is sufficient to check that ϕ preserves probabilities. This is straightforward,

\displaystyle {\mathbb P}''(\phi^{-1}E)={\mathbb P}''(\rho^{-1}\pi^{-1}E)={\mathbb P}'(\pi^{-1}E)={\mathbb P}(E)

for all E ∈ F . ⬜
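The chain of equalities in the proof can be traced on finite spaces. A short Python sketch (again with an illustrative coin/die setup of my own): the base space is a coin, the first extension adds a die, the second adds another coin, and the composition still preserves probabilities:

```python
from itertools import product
from fractions import Fraction

# Base space: a fair coin; first extension adds an independent die;
# second extension adds a further independent coin.
P = {'H': Fraction(1, 2), 'T': Fraction(1, 2)}
P1 = {(c, d): P[c] * Fraction(1, 6)
      for c, d in product('HT', range(1, 7))}
P2 = {(w, b): P1[w] * Fraction(1, 2)
      for w, b in product(P1, 'HT')}

pi = lambda w1: w1[0]          # forget the die
rho = lambda w2: w2[0]         # forget the second coin
phi = lambda w2: pi(rho(w2))   # the composed extension

# phi preserves probabilities, exactly as in the lemma's chain
# P''(phi^{-1}E) = P''(rho^{-1}pi^{-1}E) = P'(pi^{-1}E) = P(E).
for e in 'HT':
    assert sum(p for w, p in P2.items() if phi(w) == e) == P[e]
print('phi = pi o rho is an extension')
```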

So far, so simple. The main purpose of this post, however, is to look at the situation with two separate extensions of the same underlying space. Both of these will add in some additional source of randomness, and we would like to combine them into a single extension.

Separate probability spaces can be combined by the product measure, which is the measure on the product space for which the projections onto the original spaces preserve probabilities, and for which the sigma-algebras generated by these projections are independent. Recall that a pair of sigma-algebras F  and G  defined on a probability space are independent if ℙ(A ∩ B) = ℙ(A)ℙ(B) for all sets A ∈ F  and B ∈ G .

Combining extensions of probability spaces will, instead, make use of relative independence.

Definition 3 Let (Ω, F , ℙ) be a probability space. Two sub-sigma-algebras G , H  ⊆ F  are relatively independent over a third sigma-algebra K  ⊆ G  ∩ H  if

\displaystyle {\mathbb P}(A\cap B) = {\mathbb E}\left[{\mathbb P}(A\vert\mathcal K){\mathbb P}(B\vert\mathcal K)\right] (1)

for all A ∈ G  and B ∈ H .

It can be shown that the following properties are each equivalent to this definition:

  • 𝔼[XY|K ] = 𝔼[X|K ]𝔼[Y|K ] for all bounded G -measurable random variables X and H -measurable Y.
  • 𝔼[X|G ] = 𝔼[X|K ] for all bounded H -measurable X.
  • 𝔼[X|H ] = 𝔼[X|K ] for all bounded G -measurable X.

Once a probability measure is specified separately on G  and H  then its extension to the sigma-algebra generated by G  ∪ H , if it exists, is uniquely determined by relative independence. This is a consequence of the pi-system lemma, since (1) defines it on the events {A ∩ B: A ∈ G , B ∈ H }, which is a pi-system generating the same sigma-algebra.

Now consider two separate extensions π1 and π2 of the same underlying probability space,

\displaystyle (\Omega_1,\mathcal F^1,{\mathbb P}_1)\xrightarrow{\pi_1} (\Omega,\mathcal F,{\mathbb P})\xleftarrow{\pi_2} (\Omega_2,\mathcal F^2,{\mathbb P}_2)

As maps between sets, these can both be embedded into a single extension known as the pullback or fiber product. This is the set Ω′= Ω1 ×Ω Ω2 defined by

\displaystyle \Omega_1\times_{\Omega}\Omega_2=\left\{(\omega_1,\omega_2)\in\Omega_1\times\Omega_2\colon\pi_1(\omega_1)=\pi_2(\omega_2)\right\}.

Defining projection maps ρi: Ω′ → Ωi by

\displaystyle \rho_1(\omega_1,\omega_2)=\omega_1,\ \rho_2(\omega_1,\omega_2)=\omega_2

results in a commutative square with ϕ ≡ π1∘ρ1 = π2∘ρ2,

\displaystyle \arraycolsep=1.4pt\begin{array}{rcl} \Omega'\ &\xrightarrow{\displaystyle\ \rho_1\ }&\Omega_1\medskip\\ {\rho_2}\Big\downarrow\,\ &\searrow^{\hspace{-0.3em}\displaystyle\phi}&\,\Big\downarrow{\pi_1}\medskip\\ \Omega_2\,&\xrightarrow{\displaystyle\ \pi_2\ }&\,\Omega \end{array}

In fact, Ω′ is exactly the cartesian product Ω1 × Ω2 restricted to the subset on which π1∘ρ1 and π2∘ρ2 agree.
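As a purely set-theoretic illustration (the sets here are my own toy example, chosen so that the two enlargements have different fibers over the base), the fiber product and its commuting square can be written out directly:

```python
from itertools import product

# Two enlargements of the same base set Omega.
Omega = ['a', 'b']
Omega1 = [('a', 0), ('a', 1), ('b', 0)]        # first enlargement
Omega2 = [('a', 'u'), ('b', 'u'), ('b', 'v')]  # second enlargement
pi1 = lambda w: w[0]
pi2 = lambda w: w[0]

# Omega' = Omega1 x_Omega Omega2: pairs agreeing over the base point.
fiber = [(w1, w2) for w1, w2 in product(Omega1, Omega2)
         if pi1(w1) == pi2(w2)]

rho1 = lambda w: w[0]
rho2 = lambda w: w[1]
# The square commutes: pi1 o rho1 = pi2 o rho2 on all of Omega'.
print(all(pi1(rho1(w)) == pi2(rho2(w)) for w in fiber))  # True
```

Here the fiber over 'a' has 2 × 1 points and the fiber over 'b' has 1 × 2 points, so Ω′ has four elements, strictly fewer than the eight points of the full cartesian product.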

This constructs an extension ϕ of the sample space containing π1 and π2 as sub-extensions. However, it still needs to be made into a probability space. Use the smallest sigma-algebra F ′ on Ω′ making ρ1, ρ2 into measurable maps, which is generated by ρ1*F 1 ∪ ρ2*F 2. The probability measure ℙ′ on (Ω′, F ′) is uniquely determined on each of these sub-sigma-algebras by the requirement that ρi preserve probabilities,

\displaystyle {\mathbb P}'(\rho_i^{-1}A)={\mathbb P}_i(A)

for i = 1, 2 and A ∈ F i. These necessarily agree on ϕ*F  ⊆ ρ1*F 1 ∩ ρ2*F 2,

\displaystyle {\mathbb P}'(\phi^{-1}A)={\mathbb P}'(\rho_i^{-1}\pi_i^{-1}A)={\mathbb P}_i(\pi_i^{-1}A)={\mathbb P}(A)

for A ∈ F . The natural way to extend ℙ′ to all of F ′ is to use relative independence over ϕ*F .

Definition 4 The relative product of the extensions π1 and π2 is the extension

\displaystyle \phi\colon(\Omega',\mathcal F',{\mathbb P}')\rightarrow(\Omega,\mathcal F,{\mathbb P})

with ϕ, Ω′, F ′ constructed as above, and ℙ′ the unique probability measure for which the projections ρ1, ρ2 preserve probabilities, and for which ρ1*F 1 and ρ2*F 2 are relatively independent over ϕ*F .
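On finite spaces where F  separates the points of Ω, the atoms of ϕ*F  are the fibers of ϕ, and relative independence pins ℙ′ down explicitly: ℙ′(ω1, ω2) = ℙ1(ω1)ℙ2(ω2)/ℙ(ω) whenever π1(ω1) = π2(ω2) = ω with ℙ(ω) > 0. A Python sketch with a toy example of my own:

```python
from itertools import product
from fractions import Fraction

# Base: a biased coin {a, b}.  Two extensions of it.
P = {'a': Fraction(1, 3), 'b': Fraction(2, 3)}
P1 = {('a', 0): Fraction(1, 6), ('a', 1): Fraction(1, 6),
      ('b', 0): Fraction(1, 3), ('b', 1): Fraction(1, 3)}
P2 = {('a', 'u'): Fraction(1, 3),
      ('b', 'u'): Fraction(1, 2), ('b', 'v'): Fraction(1, 6)}
pi = lambda w: w[0]

# Relative product measure on the fiber product: relative independence
# over the (atomic) base sigma-algebra forces
#   P'(w1, w2) = P1(w1) P2(w2) / P(pi(w1))  whenever pi(w1) = pi(w2).
Pprime = {(w1, w2): P1[w1] * P2[w2] / P[pi(w1)]
          for w1, w2 in product(P1, P2) if pi(w1) == pi(w2)}

# Sanity checks: total mass 1 and both projections preserve probabilities.
assert sum(Pprime.values()) == 1
for w1 in P1:
    assert sum(p for (u1, _), p in Pprime.items() if u1 == w1) == P1[w1]
for w2 in P2:
    assert sum(p for (_, u2), p in Pprime.items() if u2 == w2) == P2[w2]
print('relative product measure constructed')
```

The division by ℙ(ω) is exactly the conditioning over the base: given the base outcome, the two added sources of randomness are independent.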
