Supplement 1: Correlated Random Draws with Gaussian PDF

1-D Processes

updated February 24, 2024

It can be useful to generate random draws of correlated processes. These can provide an input with known statistics for sensor models, estimation algorithms, control systems, etc. This article provides an introduction to the process, which also serves as background for generating phase screens, as described in Section 9.3. In particular, this article focuses on one-dimensional (1-D) processes with a Gaussian probability density function (PDF).

Real-Valued Gaussian Random Variables

Independent Gaussian Random Variables

Let \(u_i\) be one variable from a set of \(N\) independent random variables with a Gaussian PDF given by $$ \begin{equation} p\left(u_i\right) = \frac{1}{\sqrt{2 \pi \sigma_{ui}^2}} \exp\left[-\frac{\left(u_i-\mu_{ui}\right)^2}{2\sigma_{ui}^2}\right], \end{equation} $$ where \(\mu_{ui}\) and \(\sigma_{ui}^2\) are the mean and variance of \(u_i\), respectively. Because the variables are independent, the \(N^{th}\)-order joint PDF of all of the \(u_i\) is the product of the individual PDFs, given by $$ \begin{equation} p\left(u_1, u_2, \ldots, u_N\right) = \prod\limits_{i=1}^N p\left(u_i\right). \end{equation} $$

You can generate independent Gaussian random numbers with a mean of 0 and a variance of 1 in Matlab using the randn function. To adjust the mean and variance, let \(u\) be a Gaussian random number with a mean of 0 and a variance of 1. Then, the number $$ \begin{equation} v = \sigma_v u + \mu_v \end{equation} $$ has a Gaussian PDF with a mean of \(\mu_v\) and a variance of \(\sigma_v^2\).
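The shift-and-scale step above can be sketched in Python with NumPy, whose standard_normal plays the same role as Matlab's randn (the NumPy equivalent, seed, and sample size are illustrative assumptions, not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Draw N independent samples with zero mean and unit variance,
# analogous to Matlab's randn.
N = 100_000
u = rng.standard_normal(N)

# Shift and scale: v = sigma_v * u + mu_v has mean mu_v and
# variance sigma_v^2.
mu_v = 2.0
sigma_v = 3.0
v = sigma_v * u + mu_v

# Sample statistics approach mu_v = 2 and sigma_v^2 = 9 as N grows.
print(v.mean(), v.var())
```

For large \(N\), the sample mean and variance converge to \(\mu_v\) and \(\sigma_v^2\).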

Correlated Gaussian Random Variables

Let \(v_i\) be a set of \(N\) correlated random variables, and denote all of the \(v_i\) as a column vector \(\mathbf{v}\). It has a vector of mean values \(\mathbf{\mu}_{v}\). The elements of the covariance matrix are given by $$ \begin{equation} C_{ij} = \left\langle \left(v_i - \mu_{vi}\right)\left(v_j - \mu_{vj}\right)\right\rangle. \end{equation} $$ With these definitions, the \(N^{th}\)-order joint PDF of all of the \(v_i\) (or the vector \(\mathbf{v}\)) is $$ \begin{equation} p\left(\mathbf{v}\right) = \frac{1}{\left(2\pi\right)^{N/2} \left\vert \mathbf{C} \right\vert^{1/2}} \exp\left[-\frac{1}{2}\left(\mathbf{v}-\mathbf{\mu}_v\right)^t \mathbf{C}^{-1}\left(\mathbf{v}-\mathbf{\mu}_v\right)\right]. \end{equation} $$
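As a quick sanity check on the joint PDF, the sketch below (an illustrative NumPy implementation, not from the original text) evaluates \(p(\mathbf{v})\) directly; for a diagonal covariance matrix it must reduce to the product of 1-D Gaussian PDFs, consistent with the independent case above:

```python
import numpy as np

def gaussian_joint_pdf(v, mu, C):
    """Evaluate the N-th-order joint Gaussian PDF p(v)."""
    N = len(v)
    d = v - mu
    norm = (2.0 * np.pi) ** (N / 2.0) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * d @ np.linalg.solve(C, d)) / norm

# For a diagonal covariance, the variables are independent, so the
# joint PDF equals the product of the individual 1-D PDFs.
mu = np.array([1.0, -2.0])
sig = np.array([0.5, 2.0])
C = np.diag(sig**2)
v = np.array([0.8, -1.0])

p_joint = gaussian_joint_pdf(v, mu, C)
p_prod = np.prod(
    np.exp(-((v - mu) ** 2) / (2 * sig**2)) / np.sqrt(2 * np.pi * sig**2)
)
```

Here `p_joint` and `p_prod` agree to numerical precision.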

Complex-Valued Gaussian Random Variables

General Mean and Covariance

In this case, each random variable \(z_i\) is complex-valued such that $$ \begin{equation} z_i = u_i + \text{i}v_i. \end{equation} $$ Readers should note the difference between the integer index \(i\) and the imaginary unit \(\text{i} = \sqrt{-1}\). Joint PDFs of complex numbers are written in terms of a vector of the real and imaginary parts stacked on top of each other, like $$ \begin{equation} \mathbf{z} = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_N \\ v_1 \\ v_2 \\ \vdots \\ v_N \end{pmatrix}. \end{equation} $$ This is a vector of length \(2 N\). For a complex Gaussian random vector, the form of the PDF does not change. Namely, it is still written as $$ \begin{equation} p\left(\mathbf{z}\right) = \frac{1}{\left(2\pi\right)^{N} \left\vert \mathbf{C} \right\vert^{1/2}} \exp\left[-\frac{1}{2}\left(\mathbf{z}-\mathbf{\mu}_z\right)^t \mathbf{C}^{-1}\left(\mathbf{z}-\mathbf{\mu}_z\right)\right], \end{equation} $$ where \(\mathbf{\mu}_z = \left\langle \mathbf{z} \right\rangle\) is the mean vector, and \(\mathbf{C}\) is the \(2 N \times 2 N\) covariance matrix of \(\mathbf{z}\). Note that the exponent on \(2\pi\) is \(N = 2N/2\) because \(\mathbf{z}\) has \(2N\) elements.

Circular Complex Gaussian Random Variables

We can define a special class of Gaussian random variables called circular complex Gaussian (CCG). To state the conditions for CCG, we define two separate vectors from the real and imaginary parts according to $$ \begin{align} \mathbf{u} &= \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_N \end{pmatrix}, & \mathbf{v} &= \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_N \end{pmatrix}. \end{align} $$ Additionally, we define their covariance matrices by $$ \begin{align} \mathbf{C}^{uu} & = \left\langle \left(\mathbf{u}-\mathbf{\mu}_u\right) \left(\mathbf{u}-\mathbf{\mu}_u\right)^t \right\rangle & \mathbf{C}^{vv} & = \left\langle \left(\mathbf{v}-\mathbf{\mu}_v\right) \left(\mathbf{v}-\mathbf{\mu}_v\right)^t \right\rangle \\ \mathbf{C}^{uv} & = \left\langle \left(\mathbf{u}-\mathbf{\mu}_u\right) \left(\mathbf{v}-\mathbf{\mu}_v\right)^t \right\rangle & \mathbf{C}^{vu} & = \left\langle \left(\mathbf{v}-\mathbf{\mu}_v\right) \left(\mathbf{u}-\mathbf{\mu}_u\right)^t \right\rangle. \end{align} $$ The special case of CCG is defined by zero-valued mean vectors $$ \begin{equation} \mathbf{\mu}_u = \mathbf{\mu}_v = \mathbf{0} \end{equation} $$ and symmetry in the covariance matrices such that $$ \begin{align} \mathbf{C}^{uu} &= \mathbf{C}^{vv}, & \mathbf{C}^{uv} &= -\mathbf{C}^{vu}. \end{align} $$
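A simple way to satisfy the CCG conditions is to draw the real and imaginary parts independently, with zero mean and equal variance. The NumPy sketch below (the construction, seed, and sample sizes are illustrative assumptions) builds such draws and checks the covariance symmetry conditions from sample statistics:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Draw M realizations of an N-element circular complex Gaussian vector.
# Independent real and imaginary parts with equal variance (1/2 each)
# give unit total variance per complex element: <|z_i|^2> = 1.
N, M = 4, 200_000
u = rng.standard_normal((N, M)) / np.sqrt(2.0)  # real parts
v = rng.standard_normal((N, M)) / np.sqrt(2.0)  # imaginary parts
z = u + 1j * v

# Sample covariance matrices of the real and imaginary parts.
Cuu = u @ u.T / M
Cvv = v @ v.T / M
Cuv = u @ v.T / M
Cvu = v @ u.T / M
```

With many realizations, `Cuu` and `Cvv` both approach \(\tfrac{1}{2}\mathbf{I}\), and `Cuv` and `-Cvu` agree (here both approach zero), consistent with the CCG conditions.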

Sums of Gaussian Random Variables

Gaussian random variables have the property that weighted sums of Gaussian variables still have a Gaussian PDF. For example, let \(a_i\) be a set of \(N\) fixed numbers. Then, the sum $$ \begin{equation} w = \sum\limits_{i=1}^N a_i u_i, \end{equation} $$ also has a Gaussian PDF given by $$ \begin{equation} \label{eq:GaussSumPDF} p\left(w\right) = \frac{1}{\sqrt{2 \pi \sigma_w^2}} \exp\left[-\frac{\left(w-\mu_w\right)^2}{2\sigma_w^2}\right]. \end{equation} $$ Because the \(u_i\) are independent, the \(w\) parameters are related to the \(u_i\) parameters according to $$ \begin{align} \mu_w = \sum\limits_{i=1}^N a_i \mu_{ui} \\ \sigma_w^2 = \sum\limits_{i=1}^N a_i^2 \sigma_{ui}^2. \end{align} $$ Equation \(\eqref{eq:GaussSumPDF}\) is an important property when working with Gaussian random numbers. Random variables with other PDFs have different relationships. For more information on these relationships, see the Wikipedia citations below.
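The mean and variance formulas for the weighted sum can be checked numerically. The NumPy sketch below (the particular weights, means, and variances are illustrative assumptions) draws independent Gaussian variables and compares the sample statistics of \(w\) to the predicted values:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# N independent Gaussian variables with given means and variances,
# drawn M times each.
N, M = 3, 500_000
mu_u = np.array([0.0, 1.0, -2.0])
sig_u = np.array([1.0, 0.5, 2.0])
a = np.array([0.5, -1.0, 0.25])  # fixed weights

u = mu_u[:, None] + sig_u[:, None] * rng.standard_normal((N, M))
w = a @ u  # one draw of the weighted sum per column

mu_w = np.sum(a * mu_u)          # predicted mean
var_w = np.sum(a**2 * sig_u**2)  # predicted variance (independent u_i)
```

The sample mean and variance of `w` approach `mu_w` and `var_w` as the number of draws grows.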

The sum of random variables above can be extended to a vector of Gaussian random variables in the following way. Let \(\mathbf{u}\) be a column vector of \(N\) independent random variables \(u_j\) with the same Gaussian PDF that has a mean of 0 and variance of 1. Also, let \(\mathbf{A}\) be a matrix with \(M\) rows and \(N\) columns of fixed values \(a_{ij}\). Then, the random column vector $$ \begin{equation} \mathbf{w} = \mathbf{A} \mathbf{u} \end{equation} $$ has \(M\) values. The entries in \(\mathbf{w}\) are Gaussian random variables that have a mean of zero and a covariance given by \(\mathbf{C} = \mathbf{A} \mathbf{A}^t\).
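This relationship is the basis of a standard recipe for generating correlated Gaussian draws: pick a target covariance \(\mathbf{C}\), factor it as \(\mathbf{C} = \mathbf{A}\mathbf{A}^t\) (the Cholesky factorization is one convenient choice of \(\mathbf{A}\)), and apply \(\mathbf{A}\) to independent unit-variance draws. A NumPy sketch (the particular \(\mathbf{C}\), seed, and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Target covariance matrix (must be symmetric positive definite).
C = np.array([[2.0, 0.8],
              [0.8, 1.0]])

# Factor C = A A^t; the Cholesky factor is one convenient choice of A.
A = np.linalg.cholesky(C)

# Each column of u is an independent draw of unit-variance Gaussians;
# each column of w = A u is then a correlated draw with covariance C.
M = 300_000
u = rng.standard_normal((2, M))
w = A @ u

C_sample = w @ w.T / M  # sample covariance, approaches C for large M
```

Any factorization satisfying \(\mathbf{A}\mathbf{A}^t = \mathbf{C}\) works; an eigendecomposition of \(\mathbf{C}\) gives another valid choice of \(\mathbf{A}\).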

References

  1. Joseph W. Goodman, Statistical Optics Ch. 3, Wiley, New York, NY (1985)
  2. Relationships among probability distributions, In Wikipedia.
  3. Algebra of random variables, In Wikipedia.
  4. List of convolutions of probability distributions, In Wikipedia.