# Brownian Motion and the Riemann Zeta Function

Intriguingly, various constructions related to Brownian motion result in quantities with moments described by the Riemann zeta function. These distributions appear in integral representations used to extend the zeta function to the entire complex plane, as described in an earlier post. Now, I look at how they also arise from processes constructed from Brownian motion such as Brownian bridges, excursions and meanders.

Recall the definition of the Riemann zeta function as an infinite series

 $\displaystyle \zeta(s)=1+2^{-s}+3^{-s}+4^{-s}+\cdots$

which converges for complex argument s with real part greater than one. This has a unique extension to an analytic function on the complex plane, except for a simple pole at s = 1.

Often, it is more convenient to use the Riemann xi function which can be defined as zeta multiplied by a prefactor involving the gamma function,

 $\displaystyle \xi(s)=\frac12s(s-1)\pi^{-s/2}\Gamma(s/2)\zeta(s).$

This is an entire function on the complex plane satisfying the functional equation ξ(1 - s) = ξ(s).
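As a quick numerical sanity check, both the definition of ξ and its functional equation can be verified using only the Python standard library. Since the series for ζ does not converge inside the critical strip, the sketch below evaluates ζ there through the Dirichlet eta function, accelerated with the Cohen–Rodriguez Villegas–Zagier algorithm; this is an illustrative sketch, not part of the mathematics above.

```python
import math

def zeta(s, n=40):
    """Riemann zeta for real s > 0, s != 1, via the Dirichlet eta function
    eta(s) = sum_{k>=0} (-1)^k (k+1)^(-s), accelerated with the
    Cohen-Rodriguez Villegas-Zagier alternating series algorithm."""
    d = (3 + math.sqrt(8)) ** n
    d = (d + 1 / d) / 2
    b, c, eta = -1.0, -d, 0.0
    for k in range(n):
        c = b - c
        eta += c * (k + 1) ** (-s)
        b *= (k + n) * (k - n) / ((k + 0.5) * (k + 1))
    eta /= d
    return eta / (1 - 2 ** (1 - s))  # zeta(s) = eta(s) / (1 - 2^(1-s))

def xi(s):
    """Riemann xi, xi(s) = (1/2) s (s-1) pi^(-s/2) Gamma(s/2) zeta(s)."""
    return 0.5 * s * (s - 1) * math.pi ** (-s / 2) * math.gamma(s / 2) * zeta(s)

print(zeta(2), math.pi ** 2 / 6)  # zeta(2) = pi^2/6
print(xi(0.3), xi(0.7))           # functional equation: xi(s) = xi(1 - s)
```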

It turns out that ξ describes the moments of a probability distribution: there exists a positive random variable X with moments

 $\displaystyle {\mathbb E}[X^s]=2\xi(s),$ (1)

which is well-defined for all complex s. In the post titled The Riemann Zeta Function and Probability Distributions, I denoted this distribution by Ψ, which is a little arbitrary but was the symbol used for its probability density. A related distribution on the positive reals, which we will denote by Φ, is given by the moments

 $\displaystyle {\mathbb E}[X^s]=\frac{1-2^{1-s}}{s-1}2\xi(s)$ (2)

which, again, is defined for all complex s.

As standard, complex powers of a positive real x are defined by $x^s=e^{s\log x}$, so (1) and (2) are equivalent to the moment generating functions of log X, which uniquely determine the distributions. The probability densities and cumulative distribution functions can be given, although I will not do that here since they are already explicitly written out in the earlier post. I will write X ∼ Φ or X ∼ Ψ to mean that the random variable X has the respective distribution. As we previously explained, these are closely connected:

• If X ∼ Ψ and, independently, Y is uniform on [1, 2], then X/Y ∼ Φ.
• If X, Y ∼ Φ are independent then $\sqrt{X^2+Y^2}\sim\Psi$.
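The first of these relations is just a computation with moments: if Y is uniform on [1, 2] and independent of X, then E[(X/Y)^s] = E[X^s]E[Y^{-s}], and $\int_1^2y^{-s}\,dy=(1-2^{1-s})/(s-1)$ is exactly the prefactor appearing in (2). A minimal Monte Carlo sketch of this identity, in plain Python with my own helper names:

```python
import math
import random

def prefactor(s):
    """(1 - 2^(1-s)) / (s - 1): the factor relating the moments (1) and (2)."""
    return (1 - 2 ** (1 - s)) / (s - 1)

def mc_moment(s, n=200_000, seed=1):
    """Monte Carlo estimate of E[Y^(-s)] for Y uniform on [1, 2]."""
    rng = random.Random(seed)
    return sum(rng.uniform(1, 2) ** (-s) for _ in range(n)) / n

for s in (0.5, 2.0, 3.7):
    print(s, prefactor(s), mc_moment(s))  # agree to Monte Carlo accuracy
```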

The purpose of this post is to describe some constructions involving Brownian bridges, excursions and meanders which naturally involve the Φ and Ψ distributions.

Theorem 1 The following have distribution Φ:

1. $\sqrt{2/\pi}\,Z$ where $Z=\sup_t\lvert B_t\rvert$ is the absolute maximum of a standard Brownian bridge B.
2. $Z/\sqrt{2\pi}$ where $Z=\sup_tB_t$ is the maximum of a Brownian meander B.
3. $\sqrt{2\pi}\,Z$ where Z is the sample standard deviation of a Brownian bridge B,

 $\displaystyle Z=\left(\int_0^1(B_t-\bar B)^2\,dt\right)^{\frac12}$

with sample mean $\bar B=\int_0^1B_t\,dt$.

4. $\sqrt{\pi/2}\,Z$ where Z is the pathwise Euclidean norm of a 2-dimensional Brownian bridge $B=(B^1,B^2)$,

 $\displaystyle Z=\left(\int_0^1\lVert B_t\rVert^2\,dt\right)^{\frac12}$
5. $\sqrt{\tau\pi/2}$ where $\tau=\inf\{t\ge0\colon\lVert B_t\rVert=1\}$ is the first time at which the norm of a 3-dimensional standard Brownian motion $B=(B^1,B^2,B^3)$ hits 1.
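The first statement can be sanity-checked at the level of means. Letting s → 1 in (2) gives mean $2\xi(1)\log 2=\log 2$ for a Φ-distributed variable (using ξ(1) = 1/2), while the absolute maximum Z of a Brownian bridge has the classical tail series $\mathbb P(Z>x)=2\sum_{k\ge1}(-1)^{k-1}e^{-2k^2x^2}$, which integrates term by term to $\mathbb E[Z]=\sqrt{\pi/2}\log 2$. A short check in plain Python:

```python
import math

# E[Z] = int_0^inf P(Z > x) dx with P(Z > x) = 2 sum_{k>=1} (-1)^(k-1) e^(-2 k^2 x^2);
# each term integrates to sqrt(pi/8)/k, giving E[Z] = 2 sqrt(pi/8) log(2).
mean_Z = 2 * math.sqrt(math.pi / 8) * sum(
    (-1) ** (k - 1) / k for k in range(1, 200_001))

print(mean_Z, math.sqrt(math.pi / 2) * math.log(2))  # both ~0.8687

# Rescaling by sqrt(2/pi) recovers the Phi mean, log 2:
print(math.sqrt(2 / math.pi) * mean_Z, math.log(2))  # both ~0.6931
```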

The Kolmogorov distribution is, by definition, the distribution of the absolute maximum of a Brownian bridge. So, the first statement of theorem 1 is saying that Φ is just the Kolmogorov distribution scaled by the constant factor $\sqrt{2/\pi}$. Moving on to Ψ:

Theorem 2 The following have distribution Ψ:

1. $\sqrt{2/\pi}\,Z$ where $Z=\sup_tB_t-\inf_tB_t$ is the range of a standard Brownian bridge B.
2. $\sqrt{2/\pi}\,Z$ where $Z=\sup_tB_t$ is the maximum of a (normalized) Brownian excursion B.
3. $\sqrt{\pi/2}\,Z$ where Z is the pathwise Euclidean norm of a 4-dimensional Brownian bridge $B=(B^1,B^2,B^3,B^4)$,

 $\displaystyle Z=\left(\int_0^1\lVert B_t\rVert^2\,dt\right)^{\frac12}.$
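These identities are easy to probe by simulation. The sketch below (plain Python, using a simple random-walk discretization, so a small downward bias in the extremes is expected) estimates the mean range of a standard Brownian bridge, which by the first statement should equal $\sqrt{\pi/2}\cdot2\xi(1)=\sqrt{\pi/2}\approx1.2533$:

```python
import math
import random

def bridge_range(n, rng):
    """Range sup - inf of a standard Brownian bridge on [0, 1], using the
    representation B_t = W_t - t W_1 on an n-step grid."""
    sd = math.sqrt(1.0 / n)
    w, path = 0.0, [0.0]
    for _ in range(n):
        w += rng.gauss(0.0, sd)
        path.append(w)
    w1 = path[-1]
    b = [path[i] - (i / n) * w1 for i in range(n + 1)]
    return max(b) - min(b)

rng = random.Random(7)
m = 8_000
est = sum(bridge_range(500, rng) for _ in range(m)) / m
# The grid slightly underestimates the true continuous-time range.
print(est, math.sqrt(math.pi / 2))
```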

# Brownian Meanders

Having previously looked at Brownian bridges and excursions, I now turn to a third kind of process which can be constructed either as a conditioned Brownian motion or by extracting a segment from Brownian motion sample paths: the Brownian meander, which is a Brownian motion conditioned to be positive over a unit time interval. Since this involves conditioning on a zero probability event, care must be taken. Instead, it is cleaner to start with an alternative definition, by appropriately scaling a segment of a Brownian motion path.

For a fixed positive time T, consider the last time σ before T at which a Brownian motion X is equal to zero,

 $\displaystyle \sigma=\sup\left\{t\le T\colon X_t=0\right\}.$ (1)

On the interval [σ, T], the path of X starts from 0 and is then either strictly positive or strictly negative, and we may as well restrict to the positive case by taking absolute values. Scaling invariance says that $c^{-1/2}X_{ct}$ is itself a standard Brownian motion for any positive constant c. So, scaling the path of X on [σ, T] to the unit interval defines a process

 $\displaystyle B_t=(T-\sigma)^{-1/2}\lvert X_{\sigma+t(T-\sigma)}\rvert.$ (2)

over 0 ≤ t ≤ 1. This starts from zero and is strictly positive at all other times.

The only remaining ambiguity in the construction is the choice of the fixed time T which, by scaling invariance, does not affect the distribution of B.

Lemma 1 The distribution of B defined by (2) does not depend on the choice of the time T > 0.

Proof: Consider any other fixed positive time $\tilde T$, and use the construction above with $\tilde T$, $\tilde\sigma$, $\tilde B$ in place of T, σ, B respectively. We need to show that $\tilde B$ and B have the same distribution. Using the scaling factor $S=\tilde T/T$, then $X'_t=S^{-1/2}X_{tS}$ is a standard Brownian motion. Also, $\sigma'=\tilde\sigma/S$ is the last time before T at which X′ is zero. So,

 $\displaystyle \tilde B_t=(T-\sigma')^{-1/2}\lvert X'_{\sigma'+t(T-\sigma')}\rvert$

has the same distribution as B. ⬜

This leads to the definition used here for Brownian meanders.

Definition 2 A continuous process {Bt}t ∈ [0, 1] is a Brownian meander if and only if it has the same distribution as (2) for a standard Brownian motion X and fixed time T > 0.
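Definition 2 is straightforward to simulate: generate a discrete Brownian path, locate the last zero crossing before the terminal time, and rescale. The sketch below (plain Python; the function name and grid sizes are my own choices) also estimates the mean of the meander's maximum, which should come out near the known value $\sqrt{2\pi}\log 2\approx1.74$, up to discretization bias from paths whose last zero falls close to the terminal time:

```python
import math
import random

def meander_path(n, rng):
    """One grid path of construction (2) with T = 1: run Brownian motion X
    on [0, 1], take the last zero crossing sigma before time 1, and rescale
    |X| on [sigma, 1] to the unit interval."""
    sd = math.sqrt(1.0 / n)
    x = [0.0]
    for _ in range(n):
        x.append(x[-1] + rng.gauss(0.0, sd))
    i0 = max(i for i in range(n) if x[i] * x[i + 1] <= 0)  # last crossing
    scale = math.sqrt(1.0 - i0 / n)                        # (T - sigma)^(1/2)
    b = [abs(v) / scale for v in x[i0:]]
    b[0] = 0.0  # X_sigma = 0, up to grid resolution
    return b

rng = random.Random(3)
sups = [max(meander_path(2000, rng)) for _ in range(1500)]
est = sum(sups) / len(sups)
# Mean of the meander's maximum is sqrt(2*pi)*log(2) ~ 1.74; grid paths
# with few points after the last zero bias the estimate slightly downwards.
print(est)
```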

In fact, there are various alternative, but equivalent, ways in which Brownian meanders can be defined and constructed.

• As a scaled segment of a Brownian motion before a time T and after it last hits 0. This is definition 2.
• As a Brownian motion conditioned on being positive. See theorem 4 below.
• As a segment of a Brownian excursion. See lemma 5.
• As the path of a standard Brownian motion starting from its minimum, in either the forwards or backwards direction. See theorem 6.
• As a Markov process with specified transition probabilities. See theorem 9 below.
• As a solution to an SDE. See theorem 12 below.