Like what you see? Follow Adrian on Twitter to be notified of new content.

The moments of a statistical distribution are a family of quantities that describe the distribution’s shape. Moments include the familiar quantities mean, variance, skewness and kurtosis.

Moments are broken down into raw, central and standardised moments, and each moment has an order; for example, variance is the second central moment. The first four orders are:

| Order | Raw moment | Central moment | Standardised moment |
|-------|------------|----------------|---------------------|
| 1 | $E[X] = \mu$ | $E[X - \mu] = 0$ | $E[\frac{X - \mu}{\sigma}] = 0$ |
| 2 | $E[X^2]$ | $E[(X - \mu)^2] = \sigma^2$ | $E[(\frac{X - \mu}{\sigma})^2] = 1$ |
| 3 | $E[X^3]$ | $E[(X - \mu)^3]$ | $E[(\frac{X - \mu}{\sigma})^3] = s$ |
| 4 | $E[X^4]$ | $E[(X - \mu)^4]$ | $E[(\frac{X - \mu}{\sigma})^4] = \kappa$ |

where $\mu$ is the mean, $\sigma^2$ is the variance, $s$ is the skewness, and $\kappa$ is the kurtosis.
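As a quick numerical illustration (my sketch, not part of the original article), NumPy and SciPy can estimate these moments from a sample; note that `stats.kurtosis(..., fisher=False)` returns the standardised fourth moment itself rather than excess kurtosis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=1_000_000)  # mu = 2, sigma = 3

mean = x.mean()                              # first raw moment
variance = x.var()                           # second central moment
skewness = stats.skew(x)                     # third standardised moment
kurtosis = stats.kurtosis(x, fisher=False)   # fourth standardised moment

print(mean, variance, skewness, kurtosis)    # ≈ 2.0, 9.0, 0.0, 3.0
```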

Often, you can calculate the moments of a function of a random variable from the moments of that random variable. I frequently find myself calculating the moments of functions of Gaussian random variables. Here, I list out the moments of the Gaussian distribution for reference and describe a method for calculating co-moments.

Gaussian moments

If $X$ is a Gaussian random variable, $X \sim \mathcal{N}(\mu, \sigma^2)$, then its moments are [1][2]:

| Raw moments | Central moments | Standardised moments |
|-------------|-----------------|----------------------|
| $E[X] = \mu$ | $E[X - \mu] = 0$ | $E[\frac{X - \mu}{\sigma}] = 0$ |
| $E[X^2] = \mu^2 + \sigma^2$ | $E[(X - \mu)^2] = \sigma^2$ | $E[(\frac{X - \mu}{\sigma})^2] = 1$ |
| $E[X^3] = \mu^3 + 3\mu\sigma^2$ | $E[(X - \mu)^3] = 0$ | $E[(\frac{X - \mu}{\sigma})^3] = 0 = s$ |
| $E[X^4] = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4$ | $E[(X - \mu)^4] = 3\sigma^4$ | $E[(\frac{X - \mu}{\sigma})^4] = 3 = \kappa$ |
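These closed forms can be checked symbolically. A minimal sketch using SymPy's `sympy.stats` module (my addition, assuming SymPy is available):

```python
import sympy as sp
from sympy.stats import Normal, E

mu = sp.Symbol('mu', real=True)
sigma = sp.Symbol('sigma', positive=True)
X = Normal('X', mu, sigma)

# Raw moments of a Gaussian, computed symbolically
m2 = sp.expand(E(X**2))  # equals mu**2 + sigma**2
m3 = sp.expand(E(X**3))  # equals mu**3 + 3*mu*sigma**2
m4 = sp.expand(E(X**4))  # equals mu**4 + 6*mu**2*sigma**2 + 3*sigma**4
print(m2, m3, m4, sep='\n')
```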

Gaussian co-moments

Co-moments involve multiple random variables. For example, $E[XY]$ is a co-moment of $X$ and $Y$. These can be fairly tedious to derive and require a trick.

How to derive

The trick to deriving Gaussian co-moments is to write the two variables as linear combinations of three standard normal random variables ($\hat{X},\hat{Y},\hat{Z} \sim \mathcal{N}(0, 1)$). Then, expand out the expected value into combinations of $E[\hat{X}^i\hat{Y}^j\hat{Z}^k]$, which resolve to $E[\hat{X}^i]E[\hat{Y}^j]E[\hat{Z}^k]$ [3].

Say we have two Gaussian random variables $X$ and $Y$ with means $\mu_X$ and $\mu_Y$, variances $\sigma^2_X$ and $\sigma^2_Y$, and covariance $\sigma^2_{XY}$. We can write these two variables as functions of three uncorrelated standard normals $\hat{X}$, $\hat{Y}$ and $\hat{Z}$:

$$
\begin{aligned}
\hat{\sigma}_X &= \sqrt{\sigma^2_X - \sigma^2_{XY}} \\
\hat{\sigma}_Y &= \sqrt{\sigma^2_Y - \sigma^2_{XY}} \\
\\
X &= \mu_X + \hat{\sigma}_X \hat{X} + \sigma_{XY} \hat{Z} \\
Y &= \mu_Y + \hat{\sigma}_Y \hat{Y} + \sigma_{XY} \hat{Z}
\end{aligned}
$$

Note that this construction assumes the covariance is non-negative and no larger than either variance; for a negative covariance, negate the $\sigma_{XY} \hat{Z}$ term in $Y$ and use the absolute covariance under the square roots.
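A small simulation (my sketch, with arbitrary example parameters) confirms that this construction produces the intended variances and covariance:

```python
import numpy as np

rng = np.random.default_rng(42)
mu_x, mu_y = 1.0, -2.0
var_x, var_y, cov_xy = 2.0, 3.0, 0.8   # cov_xy plays the role of sigma^2_XY

n = 1_000_000
x_hat, y_hat, z_hat = rng.standard_normal((3, n))  # three uncorrelated standard normals

s_x = np.sqrt(var_x - cov_xy)  # sigma-hat_X
s_y = np.sqrt(var_y - cov_xy)  # sigma-hat_Y
s_z = np.sqrt(cov_xy)          # sigma_XY, the shared coefficient on Z-hat

X = mu_x + s_x * x_hat + s_z * z_hat
Y = mu_y + s_y * y_hat + s_z * z_hat

print(X.var(), Y.var(), np.cov(X, Y)[0, 1])  # ≈ 2.0, 3.0, 0.8
```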

We can check these formulas by verifying that the variance and covariance resolve to $\sigma^2_X$, $\sigma^2_Y$ and $\sigma^2_{XY}$. The variance of a sum of independent Gaussians is the sum of the variances, so:

$$
\text{var}[X] = \hat{\sigma}_X^2 + \sigma_{XY}^2 = \sigma^2_X
$$

$Y$ resolves the same way. And the covariance:

$$
\begin{aligned}
\text{cov}[X,Y] &= E[(\hat{\sigma}_X \hat{X} + \sigma_{XY} \hat{Z})(\hat{\sigma}_Y \hat{Y} + \sigma_{XY} \hat{Z})] \\
&= E[\hat{\sigma}_X \hat{\sigma}_Y \hat{X} \hat{Y} + \hat{\sigma}_X \sigma_{XY} \hat{X} \hat{Z} + \sigma_{XY}\hat{\sigma}_Y \hat{Y} \hat{Z} + \sigma_{XY}^2 \hat{Z}^2] \\
&= \hat{\sigma}_X \hat{\sigma}_Y E[\hat{X} \hat{Y}] + \hat{\sigma}_X \sigma_{XY} E[\hat{X} \hat{Z}] + \sigma_{XY}\hat{\sigma}_Y E[\hat{Y} \hat{Z}] + \sigma_{XY}^2 E[\hat{Z}^2] \\
&= \sigma_{XY}^2
\end{aligned}
$$

Derivations

We can use this method of rewriting into a combination of three standard normals to derive various co-moments. First, expand out the two Gaussians:

$$
\begin{aligned}
E[XY] &= E[(\mu_X + \hat{\sigma}_X \hat{X} + \sigma_{XY} \hat{Z})(\mu_Y + \hat{\sigma}_Y \hat{Y} + \sigma_{XY} \hat{Z})] \\
&= \mu_X\mu_Y + \mu_X\hat{\sigma}_Y E[\hat{Y}] + \mu_X\sigma_{XY} E[\hat{Z}] \\
&\quad + \hat{\sigma}_X\mu_Y E[\hat{X}] + \hat{\sigma}_X \hat{\sigma}_Y E[\hat{X}\hat{Y}] + \hat{\sigma}_X \sigma_{XY} E[\hat{X}\hat{Z}] \\
&\quad + \sigma_{XY}\mu_Y E[\hat{Z}] + \sigma_{XY} \hat{\sigma}_Y E[\hat{Y}\hat{Z}] + \sigma_{XY}^2 E[\hat{Z}^2]
\end{aligned}
$$

Then, expand the monomials from $E[\hat{X}^i\hat{Y}^j\hat{Z}^k]$ to $E[\hat{X}^i]E[\hat{Y}^j]E[\hat{Z}^k]$:

$$
\begin{aligned}
&= \mu_X\mu_Y + \mu_X\hat{\sigma}_Y E[\hat{Y}] + \mu_X\sigma_{XY} E[\hat{Z}] \\
&\quad + \hat{\sigma}_X\mu_Y E[\hat{X}] + \hat{\sigma}_X \hat{\sigma}_Y E[\hat{X}]E[\hat{Y}] + \hat{\sigma}_X \sigma_{XY} E[\hat{X}]E[\hat{Z}] \\
&\quad + \sigma_{XY}\mu_Y E[\hat{Z}] + \sigma_{XY} \hat{\sigma}_Y E[\hat{Y}]E[\hat{Z}] + \sigma_{XY}^2 E[\hat{Z}^2]
\end{aligned}
$$

Finally, replace all the Gaussian moments with their values from the table above (for example, $E[\hat{Y}] = 0$):

$$
E[XY] = \mu_X\mu_Y + \sigma_{XY}^2 \tag{1}
$$
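Equation (1) can be sanity-checked with a quick Monte Carlo simulation (my sketch, with arbitrary example parameters):

```python
import numpy as np

rng = np.random.default_rng(7)
mu = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.8],   # the off-diagonal entry is the covariance sigma^2_XY
                [0.8, 3.0]])

samples = rng.multivariate_normal(mu, cov, size=1_000_000)
X, Y = samples[:, 0], samples[:, 1]

lhs = (X * Y).mean()             # empirical E[XY]
rhs = mu[0] * mu[1] + cov[0, 1]  # mu_X * mu_Y + sigma^2_XY
print(lhs, rhs)                  # both ≈ -1.2
```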

Two other co-moments that come up often are:

$$
\begin{aligned}
E[X^2Y] &= E[(\mu_X + \hat{\sigma}_X \hat{X} + \sigma_{XY} \hat{Z})^2(\mu_Y + \hat{\sigma}_Y \hat{Y} + \sigma_{XY} \hat{Z})] \\
&= \mu_X^2\mu_Y + 2 \mu_X\sigma_{XY}^2 + \mu_Y\sigma^2_X
\end{aligned}
\tag{2}
$$

$$
\begin{aligned}
E[X^2Y^2] &= E[(\mu_X + \hat{\sigma}_X \hat{X} + \sigma_{XY} \hat{Z})^2(\mu_Y + \hat{\sigma}_Y \hat{Y} + \sigma_{XY} \hat{Z})^2] \\
&= \sigma^2_X\sigma^2_Y + \sigma^2_X \mu_{Y}^{2} + \sigma^2_Y \mu_{X}^{2} + 2 \sigma_{XY}^{4} + 4 \mu_{X} \mu_{Y} \sigma_{XY}^{2} + \mu_{X}^{2} \mu_{Y}^{2}
\end{aligned}
\tag{3}
$$
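The monomial-expansion trick itself can be automated. Below is a sketch (my addition; the symbol names and the small `expect` helper are my own) that expands the products in SymPy, applies the standard-normal moments $E[\hat{W}] = 0$, $E[\hat{W}^2] = 1$, $E[\hat{W}^3] = 0$, $E[\hat{W}^4] = 3$ term by term, and recovers equations (2) and (3):

```python
import sympy as sp

# s**2 plays the role of the covariance sigma^2_XY;
# vx, vy stand for the variances sigma^2_X, sigma^2_Y
mu_x, mu_y, shx, shy, s, vx, vy = sp.symbols(
    'mu_X mu_Y sigmahat_X sigmahat_Y s sigma2_X sigma2_Y', real=True)
Xh, Yh, Zh = sp.symbols('Xhat Yhat Zhat')

X = mu_x + shx * Xh + s * Zh
Y = mu_y + shy * Yh + s * Zh

def expect(expr, normals=(Xh, Yh, Zh)):
    """E[.] of a polynomial in independent standard normals,
    using E[W^k] = 0, 1, 0, 3 for k = 1..4."""
    std = {0: 1, 1: 0, 2: 1, 3: 0, 4: 3}
    out = sp.Integer(0)
    for exps, coeff in sp.Poly(sp.expand(expr), *normals).terms():
        weight = sp.Integer(1)
        for e in exps:
            weight *= std[e]
        out += coeff * weight
    return sp.expand(out)

# Rewrite the sigma-hats back in terms of the original variances
subs = {shx**2: vx - s**2, shy**2: vy - s**2}

e_x2y = sp.expand(expect(X**2 * Y).subs(subs))    # equals equation (2)
e_x2y2 = sp.expand(expect(X**2 * Y**2).subs(subs))  # equals equation (3)
print(e_x2y)
print(e_x2y2)
```

The substitution step uses $\hat{\sigma}_X^2 = \sigma^2_X - \sigma^2_{XY}$ and $\hat{\sigma}_Y^2 = \sigma^2_Y - \sigma^2_{XY}$ from the construction above, so the results come out in terms of the original means, variances and covariance.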


References & Notes


1. Raw Gaussian moments. Answer on Stack Exchange.
2. Normal distribution, moments. Wikipedia.
3. Standard normal monomials. Answer on Math Overflow.

Corrections

If you see any mistakes or room for improvement, please reach out to me on Twitter @DrAdrian.

Citation

Please cite this work as:

Letchford (2023), "Moments of the Gaussian distribution", OS Quant.

In BibTeX please use:

@article{Letchford2023,
    author = {Letchford, Adrian},
    title = {Moments of the Gaussian distribution},
    journal = {OS Quant},
    year = {2023},
    note = {https://osquant.com/papers/moments-of-the-guassian-distribution/},
}