Brownian Bridge Fourier Expansions

Sine series
Figure 1: Sine series approximations to a Brownian bridge

Brownian bridges were described in a previous post, along with various methods by which they can be constructed. Since a Brownian bridge on an interval {[0,T]} is continuous and equal to zero at both endpoints, we can consider extending it to the entire real line by partitioning the real numbers into intervals of length T and replicating the path of the process across each of these. This will result in continuous and periodic sample paths, suggesting another method of representing Brownian bridges. That is, by Fourier expansion. As we will see, the Fourier coefficients turn out to be independent normal random variables, giving a useful alternative method of constructing a Brownian bridge.

There are actually a couple of distinct Fourier expansions that can be used, depending on precisely how we consider extending the sample paths to the real line. A particularly simple result is given by the sine series, which I describe first. This is shown for an example Brownian bridge sample path in figure 1 above, which plots the sequence of approximations formed by truncating the series after a small number of terms. This tends uniformly to the sample path, although it is quite slow to converge, as should be expected when approximating such a rough path by smooth functions. Also plotted is the series after the first 100 terms, by which time the approximation is quite close to the target. For simplicity, I only consider standard Brownian bridges, which are defined on the unit interval {[0,1]}. This does not reduce the generality, since bridges on an interval {[0,T]} can be expressed as scaled versions of standard Brownian bridges.

Theorem 1 A standard Brownian bridge B can be decomposed as

\displaystyle  B_t=\sum_{n=1}^\infty\frac{\sqrt2Z_n}{\pi n}\sin(\pi nt) (1)

over {0\le t\le1}, where {Z_1,Z_2,\ldots} is an IID sequence of standard normals. This series converges uniformly in t, both with probability one and in the {L^p} norm for all {1\le p < \infty}.
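
For illustration, decomposition (1) translates directly into a simulation method: generate finitely many of the {Z_n} and truncate the series. The following is a minimal Python sketch along those lines (the truncation level, grid and variable names are arbitrary choices, not part of the theorem).

    import numpy as np
    import matplotlib.pyplot as plt
    
    rng = np.random.default_rng(1)
    N = 100                                   # number of terms kept from the series
    t = np.linspace(0.0, 1.0, 500)
    n = np.arange(1, N + 1)[:, None]          # n = 1, ..., N as a column
    Z = rng.standard_normal((N, 1))           # Z_1, ..., Z_N, IID standard normals
    
    # truncated series (1): sum_n sqrt(2) Z_n sin(pi n t) / (pi n)
    B = (np.sqrt(2) * Z / (np.pi * n) * np.sin(np.pi * n * t)).sum(axis=0)
    
    plt.plot(t, B)
    plt.show()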

Proof: We extend the Brownian bridge to be of period 2 and odd. That is, it is first extended to the interval {[-1,1]} by setting {B_{-t}=-B_t}, and then to the rest of the real number line by periodicity, {B_{t+2}=B_t}. The Fourier sine series expansion is,

\displaystyle  \begin{aligned} B_t&=\sum_{n=1}^\infty c_n\sin(\pi nt),\\ c_n&=\int_{-1}^1B_t\sin(\pi nt)dt. \end{aligned}

We will deal with the uniform convergence in a moment but, first, let us compute the Fourier coefficients. We may suppose that the Brownian bridge is constructed from a Brownian motion X by {B_t=X_t-tX_1} for {0\le t\le1}. Then, integration by parts gives

\displaystyle  c_n=\frac2{n\pi}\int_0^1\cos(\pi nt)dX_t.

As these are integrals of deterministic functions with respect to Brownian motion, they are joint normal with zero mean and covariances,

\displaystyle  {\mathbb E}[c_mc_n] =\frac{4}{mn\pi^2}\int_0^1\cos(\pi mt)\cos(\pi nt)dt.

For {m\not=n}, this integral is zero and, hence, {c_n} is a sequence of independent normal random variables. Also, taking {m=n} gives {{\mathbb E}[c_n^2]=2/(\pi n)^2} and, hence, {Z_n\equiv n\pi c_n/\sqrt2} is a sequence of independent standard normals. Substituting this into the Fourier series gives (1).
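
The orthogonality computation can also be checked numerically. The sketch below (the sample size, grid resolution and the handful of indices checked are arbitrary choices) approximates the stochastic integrals defining {c_n} on a fine grid and estimates the covariance matrix of {Z_n\equiv n\pi c_n/\sqrt2}, which should be close to the identity.

    import numpy as np
    
    rng = np.random.default_rng(2)
    nsamples, nt = 5000, 1000
    dt = 1.0 / nt
    t = np.arange(nt) * dt                    # left endpoints of the grid intervals
    dX = rng.standard_normal((nsamples, nt)) * np.sqrt(dt)   # Brownian increments
    
    # c_n = (2 / (n pi)) int_0^1 cos(pi n t) dX_t, discretized on the grid
    Z = np.empty((nsamples, 3))
    for k, n in enumerate((1, 2, 3)):
        c_n = (2 / (n * np.pi)) * (np.cos(np.pi * n * t) * dX).sum(axis=1)
        Z[:, k] = n * np.pi * c_n / np.sqrt(2)
    
    print(np.cov(Z, rowvar=False))            # should be close to the 3x3 identity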

Only uniform convergence remains to be shown. Since we know that Brownian motion sample paths are locally Hölder continuous, I make use of a standard result for convergence of Fourier series of Hölder continuous functions. Using {B^N_t} to denote the sum of the first N terms on the right hand side of (1), we need to show that {\sup_{t\in[0,1]}\lvert B^N_t-B_t\rvert} tends to zero both almost surely and in the {L^p} norms as N goes to infinity, for each {1\le p < \infty}.

If the sample path is {\gamma}-Hölder continuous with coefficient {C_\gamma}, then the approximations satisfy the bound

\displaystyle  \sup_t\lvert B^N_t-B_t\rvert\le KC_\gamma N^{-\gamma}\log N

for a constant K independent of both {N\ge2} and of the sample path (Dunham Jackson, The Theory of Approximation, AMS Colloquium Publications, Volume XI, New York, 1930). In particular, so long as we can find {\gamma} and {1\le p < \infty} such that {\gamma p > 1} and {C_\gamma} is {L^p}-integrable, then

\displaystyle  \sum_N{\mathbb E}[\sup_t\lvert B^N_t-B_t\rvert^p]\le K^p{\mathbb E}[C_\gamma^p]\sum_N N^{-\gamma p}\log N < \infty.

This implies that {B^N\rightarrow B} uniformly both with probability one and in the {L^p} norm. To find suitable {\gamma} and p, we use the Kolmogorov continuity theorem. For any times {0\le s\le t\le1} the covariance structure of a Brownian bridge shows that {B_t-B_s} has variance bounded by {t-s} and, hence,

\displaystyle  {\mathbb E}[\lvert B_t-B_s\rvert^p]\le a_p\lvert t-s\rvert^{p/2}

for any {p > 0} and some constant {a_p}. The continuity theorem states that the sample paths are {\gamma}-Hölder continuous for all {\gamma < 1/2 - 1/p} and, furthermore, the {\gamma}-Hölder coefficient is {L^p}-integrable. So long as {p > 4}, we can choose such a {\gamma} with {\gamma p > 1}, proving uniform convergence of the sine series both almost surely and in the {L^p} norm. Finally, as the {L^p} norm is increasing in p, this ensures convergence in the {L^p} norm for all {1\le p < \infty}. ⬜
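
The rate of convergence can also be observed empirically. The following sketch (grid resolution and truncation levels are arbitrary choices) simulates a bridge on a fine grid, computes the coefficients {c_n} by numerical integration, and prints the sup-norm error {\sup_t\lvert B^N_t-B_t\rvert} for a few values of N, which should decay slowly, roughly in line with the {N^{-\gamma}\log N} bound for {\gamma} close to 1/2.

    import numpy as np
    
    rng = np.random.default_rng(3)
    nt = 4000
    dt = 1.0 / nt
    t = np.linspace(0.0, 1.0, nt + 1)
    
    X = np.concatenate([[0.0], np.cumsum(rng.standard_normal(nt) * np.sqrt(dt))])
    B = X - t * X[-1]                         # Brownian bridge B_t = X_t - t X_1
    
    for N in (10, 100, 1000):
        n = np.arange(1, N + 1)[:, None]
        sins = np.sin(np.pi * n * t)
        # c_n = 2 int_0^1 B_t sin(pi n t) dt, approximated by a left Riemann sum
        c = 2 * (B[:-1] * sins[:, :-1]).sum(axis=1) * dt
        BN = (c[:, None] * sins).sum(axis=0)  # truncated sine series B^N
        print(N, np.abs(BN - B).max())        # sup_t |B^N_t - B_t|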

An alternative Fourier expansion can be used, which has both sine and cosine terms and so is a little more complicated. However, it is still the case that the coefficients are independent normals, and this expansion is sometimes preferred to the sine series given above.

Theorem 2 A standard Brownian bridge B can be decomposed as,

\displaystyle  \begin{aligned} B_t &=\bar B+\sum_{n=1}^\infty\frac1{\sqrt2\pi n}(Y_n\cos(2\pi nt)+Z_n\sin(2\pi nt))\\ &=\sum_{n=1}^\infty\frac1{\sqrt2\pi n}(Y_n(\cos(2\pi nt)-1)+Z_n\sin(2\pi nt)) \end{aligned} (2)

over {0\le t\le1}, where {Y_1,Z_1,Y_2,Z_2,\ldots} is an IID sequence of standard normals and {\bar B} is the sample mean of B,

\displaystyle  \bar B = \int_0^1 B_tdt =-\sum_{n=1}^\infty\frac{Y_n}{\sqrt 2\pi n}. (3)

The series converges uniformly in t, both with probability one and in the {L^p} norm for all {1\le p < \infty}.
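
Again, the expansion gives a direct simulation recipe. Below is a minimal Python sketch (the truncation level, grid and names are arbitrary choices) using the second form of (2), which also checks that the sample mean of the generated path approximately agrees with the truncated series (3).

    import numpy as np
    
    rng = np.random.default_rng(4)
    N = 200
    t = np.linspace(0.0, 1.0, 1000)
    n = np.arange(1, N + 1)[:, None]
    coef = 1.0 / (np.sqrt(2) * np.pi * n)
    Y = rng.standard_normal((N, 1))           # Y_1, ..., Y_N
    Z = rng.standard_normal((N, 1))           # Z_1, ..., Z_N
    
    # second form of (2): Y_n multiplies (cos(2 pi n t) - 1), Z_n multiplies sin(2 pi n t)
    B = (coef * (Y * (np.cos(2 * np.pi * n * t) - 1) + Z * np.sin(2 * np.pi * n * t))).sum(axis=0)
    
    print(B.mean())                           # sample mean of the (truncated) path
    print(-(coef * Y).sum())                  # truncated series (3) for the mean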

Proof: We extend the Brownian bridge continuously on the real line to be of period 1, {B_{t+1}=B_t}, which uniquely determines B in terms of its values on the unit interval {[0,1]}. The Fourier series decomposition is,

\displaystyle  \begin{aligned} B_t&=\bar B+\sum_{n=1}^\infty(a_n\cos(2\pi nt)+b_n\sin(2\pi nt)),\\ a_n&=2\int_0^1B_t\cos(2\pi nt)dt,\\ b_n&=2\int_0^1B_t\sin(2\pi nt)dt. \end{aligned}

As above, we suppose that the Brownian bridge is constructed from a standard Brownian motion X by {B_t=X_t-tX_1}. Applying integration by parts,

\displaystyle  \begin{aligned} a_n&=\frac{-1}{n\pi}\int_0^1\sin(2\pi nt)dX_t,\\ b_n&=\frac{1}{n\pi}\int_0^1\cos(2\pi nt)dX_t. \end{aligned}

As they are integrals of deterministic functions with respect to Brownian motion, these coefficients are joint normal with zero mean and covariances given by,

\displaystyle  \begin{aligned} {\mathbb E}[a_ma_n]&=\frac{1}{mn\pi^2}\int_0^1\sin(2\pi mt)\sin(2\pi nt)dt,\\ {\mathbb E}[b_mb_n]&=\frac{1}{mn\pi^2}\int_0^1\cos(2\pi mt)\cos(2\pi nt)dt,\\ {\mathbb E}[a_mb_n]&=\frac{-1}{mn\pi^2}\int_0^1\sin(2\pi mt)\cos(2\pi nt)dt=0. \end{aligned}

Consequently, {{\mathbb E}[a_ma_n]={\mathbb E}[b_mb_n]=0} whenever {m\not=n}. Hence, {\{a_1,b_1,a_2,b_2,\ldots\}} are independent random variables and, by the expression above, they have variances {{\mathbb E}[a_n^2]={\mathbb E}[b_n^2]=1/(2n^2\pi^2)}. Hence, {Y_n\equiv\sqrt2n\pi a_n} and {Z_n\equiv\sqrt2n\pi b_n} are independent standard normals. Substituting these coefficients back into the Fourier expansion gives the first line of (2). Evaluating at {t=0} gives the expression (3) for {\bar B} and, substituting this back into the Fourier expansion gives the second line of (2).

Finally, uniform convergence of the Fourier expansion follows using exactly the same argument as in the proof of theorem 1 above. ⬜

With {Y_n,Z_n} being as in (2), consider the complex-valued random variables {U_n=Y_n-iZ_n}. These are IID, with real and imaginary parts being independent standard normals. It is straightforward to see that {\omega U_n} has the same distribution as {U_n} for any {\omega\in{\mathbb C}} with {\lvert\omega\rvert=1}. Using {\{t\}} to denote the fractional part of a real number t, we can rewrite (2) as

\displaystyle  B_{\{t\}}-\bar B = \sum_{n=1}^\infty\frac1{\sqrt2\pi n}\Re\left[U_ne^{2\pi int}\right]

for all {t\in{\mathbb R}}. Now, for any fixed {T\in{\mathbb R}}, the effect of replacing t by {T+t} on the right hand side is the same as multiplying {U_n} by {e^{2\pi inT}} which, as we already noted, does not affect its distribution. This gives an alternative way to see that the joint distribution of {B_{\{t\}}-\bar B} is translation invariant with respect to the time index.
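
This identity is easy to check numerically. In the sketch below (truncation, grid and the shift T are arbitrary choices), the path evaluated at times shifted by T coincides, term by term, with the path generated from the rotated coefficients {U_ne^{2\pi inT}}.

    import numpy as np
    
    rng = np.random.default_rng(5)
    N, T = 200, 0.3
    t = np.linspace(0.0, 1.0, 1000)
    n = np.arange(1, N + 1)[:, None]
    coef = 1.0 / (np.sqrt(2) * np.pi * n)
    U = rng.standard_normal((N, 1)) - 1j * rng.standard_normal((N, 1))   # U_n = Y_n - i Z_n
    
    def path(u, times):
        # truncated series for B_{t} - Bbar = sum_n Re[u_n e^{2 pi i n t}] / (sqrt(2) pi n)
        return (coef * (u * np.exp(2j * np.pi * n * times)).real).sum(axis=0)
    
    # shifting time by T is the same as rotating each U_n by e^{2 pi i n T}
    print(np.abs(path(U, t + T) - path(U * np.exp(2j * np.pi * n * T), t)).max())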

3 thoughts on “Brownian Bridge Fourier Expansions”

  1. In case it is of use to anyone, the python3 code for generating the plot at the top of the post is below.
    As all of the Brownian bridge samples and terms in the sine series are joint normal, I just compute their covariances and use the numpy multivariate_normal function to generate the samples.

    import numpy as np
    import matplotlib.pyplot as plt
    
    # compute Brownian bridge and sine series
    nt = 400
    nsines = 100
    np.random.seed(6)
    times = np.linspace(0.0, 1.0, nt)
    nrands = nt + nsines
    cov = np.zeros(shape=(nrands, nrands))
    
    for i, t in enumerate(times):
        # Brownian bridge covariance: E[B_s B_t] = s(1 - t) for s <= t
        cov[i, :i+1] = cov[:i+1, i] = times[:i+1] * (1-t)
    
    for i in range(nsines):
        # coefficient variance E[c_n^2] = 2/(pi n)^2 and cross-covariance
        # E[B_t c_n] = 2 sin(pi n t)/(pi n)^2, where n = i + 1
        cov[nt + i, nt + i] = c = 2 / (np.pi * (i+1))**2
        cov[nt + i, :nt] = cov[:nt, nt+i] = np.sin(times * np.pi * (i+1)) * c
    
    # joint sample of the bridge values (first nt entries) and the coefficients
    rands = np.random.multivariate_normal(np.zeros(shape=(nrands,)), cov)
    # partial sums of the sine series, one row per truncation level
    series = np.cumsum([np.sin(times * np.pi * (i+1)) * rands[nt + i] for i in range(nsines)], axis=0)
    
    # plot results
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.plot(times, rands[0:nt], label='Brownian bridge', linewidth=1, color='black')
    for i in range(39):
        alpha = 0.8 * np.exp(-i * 0.1)
        label = 'sine approximations' if i == 0 else None
        ax.plot(times, series[i], label=label, linewidth=1, color='blue', alpha=alpha)
    ax.plot(times, series[-1], label='sine approx., {} terms'.format(nsines), linewidth=1, color='red')
    ax.plot([0, 1], [0, 0], linewidth=0.5, color='black')
    ax.set_xlim(0, 1)
    ax.set_xticks([])
    ax.set_yticks([])
    plt.subplots_adjust(left=0.01, right=0.99, bottom=0.01, top=0.99, hspace=0, wspace=0)
    ax.legend(loc='lower right')
    plt.show()
  2. First, I would like to thank Mr Lowther for the diversity and quality of the probabilistic content we have the pleasure of discovering on his website. As a young probabilist, it's always a pleasure to perfect my knowledge with this kind of resource, and I hope this website can also help some newcomers discover the beauty of probability.

    I am a bit perplexed by the way you derive the Hölder continuity of the Brownian bridge. Specifically, this sentence: “For any times {0\le s\le t\le1} the covariance structure of a Brownian bridge shows that {B_t-B_s} has variance bounded by {t-s} and, hence,

    \displaystyle  {\mathbb E}[\lvert B_t-B_s\rvert^p]\le a_p\lvert t-s\rvert^{p/2} “.

    I know how to obtain this property thanks to the Cameron-Martin theorem, but how do you get this inequality from the previous bound on the second moment? It is probably an inequality like Cauchy-Schwarz or Hölder, but I don't see the right one.

    Thanks again for your answer.

    1. Hi pascalcule,

      This is just a simple consequence of normality. Hölder, Cauchy-Schwarz, etc, are not needed.
      As B_t – B_s is normal with zero mean, it is equal to σX for a standard normal X. So, its pth absolute moment is σ^p E[|X|^p]. As was noted, σ^2 is bounded by a multiple of t – s, and E[|X|^p] is a finite constant (depending only on p), giving the inequality you quote.
