# Simulation and Analysis of White Noise in Matlab


If you are looking for a way to generate AWGN noise in Matlab, go here.

Check out this ebook: Simulation of Digital Communications using Matlab – by Mathuranathan Viswanathan

## White Noise Process

A random process (or a signal, if that helps you visualize it) with a constant power spectral density (PSD) function is a white noise process.

## Power Spectral Density

The power spectral density function shows how much power is contained in each spectral component. For example, for a sine wave of fixed frequency, the PSD plot will contain only one spectral component, at that frequency. The PSD is an even function of frequency, so the frequency components are mirrored across the Y-axis when plotted. Thus, for a sine wave of fixed frequency, the double-sided PSD plot will have two components – one at the positive frequency and another at the negative frequency of the sine wave. (How to plot PSD/FFT in Matlab)

## Gaussian and Uniform White Noise:

A white noise signal (process) is constituted by a set of independent and identically distributed (i.i.d) random variables. In the discrete sense, a white noise signal is a series of samples that are independent and generated from the same probability distribution. For example, you can generate a white noise signal using a random number generator in which all the samples follow a given Gaussian distribution – this is called White Gaussian Noise (WGN) or Gaussian White Noise. Similarly, a white noise signal generated from a uniform distribution is called Uniform White Noise.

White Gaussian Noise and Uniform White Noise are frequently used in system modelling. In modelling/simulation, white noise can be generated using an appropriate random number generator. White Gaussian Noise can be generated using the “randn” function in Matlab, which generates random numbers that follow a Gaussian distribution. Similarly, the “rand” function can be used to generate Uniform White Noise in Matlab; its output follows a uniform distribution. Each call to these generators produces a series of random numbers from the given distribution. Let’s take the example of generating White Gaussian Noise of length 10 using the “randn” function in Matlab – with zero mean and standard deviation = 1.

noise = randn(1,10)

noise =
   -1.5121    0.7321   -0.1621    0.4651    1.4284    1.0955   -0.5586    1.4362   -0.8026    0.0949
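The same generation step can be sketched in Python/NumPy (a hedged equivalent, not the article's own code: `standard_normal` and `uniform` here play the roles of Matlab's `randn` and `rand`, and the seed is an arbitrary choice for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(seed=1)  # seeded only for reproducibility

# White Gaussian Noise: 10 i.i.d. samples, zero mean, unit standard deviation
gaussian_noise = rng.standard_normal(10)   # analogous to Matlab's randn(1,10)

# Uniform White Noise: 10 i.i.d. samples drawn uniformly from [0, 1)
uniform_noise = rng.uniform(0.0, 1.0, 10)  # analogous to Matlab's rand(1,10)

print(gaussian_noise)
print(uniform_noise)
```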

## What is i.i.d ?

The call above simply generates 10 random numbers from the standard normal distribution. As we know, a white noise process is a random process composed of several random variables following the same probability distribution function (PDF). The 10 random numbers above are generated from the same PDF (the standard normal distribution); this is the “identically distributed” condition. The individual samples are also “independent” of each other. Furthermore, each sample can be viewed as a realization of one random variable. In effect, we have generated a random process composed of realizations of 10 random variables. Thus, the process above is constituted from “independent and identically distributed” (i.i.d) random variables.

## Strictly and weakly defined White noise:

Since the white noise process is constructed from i.i.d random variables/samples, all the samples follow the same underlying probability distribution function (PDF). Thus, the joint probability distribution function of the process will not change with any shift in time; such a process is called a stationary process, and white noise is therefore a stationary process. Just as stationary processes are classified into Strict Sense Stationary (SSS) and Wide Sense Stationary (WSS) processes, we can have white noise that is SSS and white noise that is WSS. Correspondingly, they are called “strictly white noise” and “weakly white noise” signals.

## What’s with Covariance Function/Matrix ?

A white noise signal, denoted by x(t), defined in the weak sense is the more practical condition. Here, the samples are statistically uncorrelated and identically distributed with some variance equal to σ². This condition is specified using a covariance function as

$$C_{xx}(i,j) = E\left[\left(x_i - \mu_{x_i}\right)\left(x_j - \mu_{x_j}\right)\right] = \begin{cases} \sigma^2, & i = j \\ 0, & i \neq j \end{cases}$$

Why do we need a covariance function? Because we are dealing with a random process composed of “n” random variables (10 variables in the modelling example above). Such a process is viewed as a multivariate random vector or multivariate random variable. For multivariate random variables, the covariance function specifies how each of the “n” variables in the given random process behaves with respect to the others; it generalizes the notion of variance to multiple dimensions.

The above equation, when represented in matrix form, gives the covariance matrix of the white noise random process:

$$C_{xx} = \sigma^2 I = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{bmatrix}$$

Since the random variables in the white noise process are statistically uncorrelated, the covariance function contains values only along the diagonal. The matrix above indicates that only the auto-correlation function exists for each random variable; the cross-correlation values are zero (the samples/variables are statistically uncorrelated with respect to each other). The diagonal elements are equal to the variance, and all other elements in the matrix are zero.

The ensemble auto-correlation function of the weakly defined white noise is given by

$$R_{xx}(\tau) = E\left[x(t)\,x(t+\tau)\right] = \sigma^2 \delta(\tau)$$

This indicates that the auto-correlation function of the weakly defined white noise process is zero everywhere except at lag $$\tau=0$$.
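The diagonal structure of the covariance matrix can be checked numerically. The sketch below (Python/NumPy rather than the article's Matlab; the choices of 5 variables, 200,000 realizations and the seed are illustrative) estimates the covariance matrix of a white Gaussian process with σ = 2 and confirms it approaches σ²·I:

```python
import numpy as np

rng = np.random.default_rng(seed=7)
sigma = 2.0       # standard deviation of the white noise process
n_vars = 5        # number of random variables (samples per realization)
n_real = 200_000  # realizations used to estimate the covariance

# Each row is one realization of the process; each column is one random variable
X = sigma * rng.standard_normal((n_real, n_vars))

# Sample covariance matrix: approaches sigma^2 * I for white noise
C = np.cov(X, rowvar=False)
print(np.round(C, 2))  # diagonal near 4, off-diagonal near 0
```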
Related topic: Constructing the auto-correlation matrix in Matlab

## Frequency Domain Characteristics:

The Wiener–Khinchin theorem states that for a Wide Sense Stationary (WSS) process, the power spectral density function $$S_{xx}(f)$$ of the random process can be obtained as the Fourier transform of the auto-correlation function of the random process. In the continuous time domain, this is represented as

$$S_{xx}(f) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j 2 \pi f \tau}\, d\tau$$

For the weakly defined white noise process, the mean is constant and the covariance does not vary with respect to time shift. This is a sufficient condition for a WSS process, so the Wiener–Khinchin theorem applies. Therefore, the power spectral density of the weakly defined white noise process is constant (flat) across the entire frequency spectrum:

$$S_{xx}(f) = \sigma^2, \quad -\infty < f < \infty$$

The value of the constant is equal to the variance (power) of the white noise.
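This relationship can be sanity-checked numerically. The sketch below (Python/NumPy; the length N and the seed are arbitrary choices) takes the biased autocorrelation estimate of a unit-variance white Gaussian sequence and confirms that its Fourier transform hovers around σ² = 1 on average:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
N = 4096
x = rng.standard_normal(N)  # unit-variance white Gaussian sequence

# Biased autocorrelation estimate over lags -(N-1)..(N-1)
acf = np.correlate(x, x, mode='full') / N

# Wiener-Khinchin: the PSD is the Fourier transform of the autocorrelation
psd = np.abs(np.fft.fft(acf))

# For white noise the PSD fluctuates around the variance sigma^2 = 1
print(psd.mean())
```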

## Testing the characteristics of White Gaussian Noise in Matlab:

Generate a Gaussian white noise signal of length L = 100,000 using the randn function in Matlab and plot it. Here the underlying pdf is a Gaussian pdf with mean μ = 0 and standard deviation σ = 2. Thus the variance of the Gaussian pdf is σ² = 4.
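An equivalent NumPy sketch of this generation step (the plotting is omitted so the snippet stays self-contained; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=100)
L = 100_000           # number of noise samples
mu, sigma = 0.0, 2.0  # mean and standard deviation of the Gaussian pdf

noise = mu + sigma * rng.standard_normal(L)  # analogous to mu + sigma*randn(1,L)

print(noise.mean())  # close to mu = 0
print(noise.var())   # close to sigma^2 = 4
```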

Plot the histogram of the generated white noise and verify it by plotting against the theoretical pdf of the Gaussian random variable.
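A numerical version of this check (Python/NumPy sketch; the bin count and range are arbitrary choices) compares the normalized histogram against the theoretical Gaussian pdf instead of plotting:

```python
import numpy as np

rng = np.random.default_rng(seed=200)
mu, sigma, L = 0.0, 2.0, 100_000
noise = mu + sigma * rng.standard_normal(L)

# Normalized histogram of the generated noise (density=True yields a pdf estimate)
counts, edges = np.histogram(noise, bins=100, range=(-10, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Theoretical Gaussian pdf evaluated at the bin centers
pdf = np.exp(-(centers - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# The histogram should track the theoretical curve closely
print(np.max(np.abs(counts - pdf)))
```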

Compute the auto-correlation function of the white noise. The computed auto-correlation function has to be scaled properly. If the ‘xcorr’ function (inbuilt in Matlab) is used to compute it, pass the ‘biased’ argument to scale it properly.
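The same computation sketched in NumPy (`np.correlate` plays the role of `xcorr`, and dividing by the full length L reproduces the ‘biased’ scaling; a shorter L is used here only to keep the direct correlation fast):

```python
import numpy as np

rng = np.random.default_rng(seed=300)
sigma, L = 2.0, 5000
noise = sigma * rng.standard_normal(L)

# Biased autocorrelation estimate, mirroring Matlab's xcorr(noise, 'biased'):
# dividing the raw correlation by L makes the zero-lag value estimate sigma^2
acf = np.correlate(noise, noise, mode='full') / L
lags = np.arange(-(L - 1), L)

zero_lag = acf[lags == 0][0]
print(zero_lag)  # close to sigma^2 = 4; all other lags stay near zero
```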

## Simulating the PSD of the white noise:

Simulating the power spectral density (PSD) of white noise is a little tricky. There are two issues here:

1) The generated samples are of finite length. This is equivalent to truncating an infinite series of random samples, which implies that the lags are defined over a fixed range. (FFT and spectral leakage – an additional resource on this topic can be found here)

2) The random number generators used in simulations are pseudo-random generators.

Due to these two reasons, you will not get a flat PSD when you apply the Fourier transform over the generated auto-correlation values. The wavering effect of the PSD can be minimized by generating a sufficiently long random signal and averaging the PSD over several realizations of the random signal.
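The effect of averaging can be demonstrated directly. In this Python/NumPy sketch (N, M and the seed are illustrative choices), a single realization's periodogram wavers wildly around σ² = 4, while the average over M realizations is nearly flat:

```python
import numpy as np

rng = np.random.default_rng(seed=400)
sigma = 2.0
N = 1024   # samples per realization
M = 500    # number of realizations to average over

X = sigma * rng.standard_normal((M, N))
periodograms = np.abs(np.fft.fft(X, axis=1)) ** 2 / N  # one periodogram per row

# A single realization wavers heavily around sigma^2 = 4
single_ripple = periodograms[0].std() / periodograms[0].mean()

# Averaging over M realizations flattens the spectrum
avg_psd = periodograms.mean(axis=0)
avg_ripple = avg_psd.std() / avg_psd.mean()

print(single_ripple)  # on the order of 1
print(avg_ripple)     # roughly 1/sqrt(M) of the single-realization ripple
```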

### Simulating Gaussian White Noise as a Multivariate Gaussian Random Vector:

To verify the power spectral density of the white noise, we envisage the white noise as a composite of ‘N’ Gaussian random variables and average the PSD over L such realizations. Since there are ‘N’ Gaussian random variables (N individual samples) per realization, the covariance matrix Cxx will be of dimension N×N, and the mean vector for this multivariate case will be of dimension 1×N. The Cholesky decomposition of the covariance matrix gives the equivalent of the standard deviation for the multivariate case – it can be viewed as a matrix square-root operation. Matlab’s randn function is used here to generate the multi-dimensional Gaussian random process with the given mean vector and covariance matrix.
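A sketch of this construction in Python/NumPy (the dimensions N and L, the variable names, and the seed are illustrative; `np.linalg.cholesky` supplies the matrix square root):

```python
import numpy as np

rng = np.random.default_rng(seed=500)
sigma = 2.0
N = 256    # dimension of the multivariate Gaussian (samples per realization)
L = 1000   # number of realizations

mu = np.zeros(N)             # 1 x N mean vector
Cxx = sigma**2 * np.eye(N)   # N x N covariance matrix of the white noise

# Cholesky factor A satisfies Cxx = A @ A.T, so x = mu + A @ z has
# covariance Cxx when z is standard normal -- a matrix "square root"
A = np.linalg.cholesky(Cxx)

Z = rng.standard_normal((N, L))
X = (mu[:, None] + A @ Z).T  # L realizations, each of length N

print(X.shape)  # (1000, 256)
print(X.var())  # close to sigma^2 = 4
```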

Compute the PSD of the multi-dimensional process generated above and average it to get a smooth plot.

The PSD plot of white noise shows almost constant power at all frequencies. In other words, for a white noise signal, the PSD is constant (flat) across all frequencies (−∞ to +∞). The y-axis in the above plot is expressed in dB/Hz units. We can see from the plot that the constant power level is 10·log10(σ²) = 10·log10(4) ≈ 6 dB.
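That 6 dB figure can be reproduced numerically. In this Python/NumPy sketch (periodogram averaging over M realizations, with illustrative N, M and seed), the averaged PSD in dB settles near 10·log10(4):

```python
import numpy as np

rng = np.random.default_rng(seed=600)
sigma = 2.0
N, M = 1024, 1000  # samples per realization, number of realizations

X = sigma * rng.standard_normal((M, N))

# Average the periodogram across M realizations, then convert to dB
psd = np.mean(np.abs(np.fft.fft(X, axis=1)) ** 2 / N, axis=0)
psd_db = 10 * np.log10(psd)

print(psd_db.mean())  # close to 10*log10(4) ~ 6.02 dB
```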

## Comments

• Isa

please explain how to use trapz..

• Fatih

Perfect, thank you

• Welcome

I am trying to find the time delay between two signals in Matlab. The two signals have Gaussian white noise added to them. I then found the covariance matrix; the determinant of the covariance matrix is always zero at the delay point. I used joint-entropy-based time delay estimation, which gave me an infinite value at the delay point. How do I get a finite value from this?

• When the correlation is zero, it means that no correlation exists between the signals. For time delay estimation to work, there must be a significant amount of correlation between the shifted versions of the signal.

Check if the noise added to the signal is too severe. In that case the correlation may go to zero at the delay point (the autocorrelation of noise is zero at all time lags except the zero lag).

Try if you can get reasonable value using the inbuilt Matlab function – “D = finddelay(X,Y)”

• Jyothi VBN

Dear Sir,
Do you have any idea about the R language for statistics? I have some doubts; may I ask you, sir?

• I do not know R. But you can always post the question here

• Jyothi VBN

In the given link, he takes three noisy sensor readings and estimates the best value from the three available readings using a Kalman filter.

• zahi

Dear sir, do you have any idea about regression functions to calculate the coefficient and lambda using white Gaussian noise? Please help me.

thx

• Can you rephrase your question ? I am unable to understand it…

• Innocent LeMaitre

Hey there, do you have a way to demodulate a digital signal into bits using Matlab? If so, can you elaborate on how to do it?

• Sotiris No

hello,
Does the book have more code about generating white noise and random walk processes? I am interested in simulating an inertial measurement unit's errors in Matlab. Will the book be useful for me?

• Discussions on random walks are not available in the ebook. Thanks for your understanding

• Nikhil Karmude

Hello Sir, Does the book have implementation for OFDM receiver? I mainly need to refer the packet detection code, cross correlation code and carrier frequency output code for it. Thanks!

• It has only a conceptual-level implementation of the OFDM transmitter & receiver (cyclic prefix, FFT/IFFT, parallel-serial & serial-parallel converters). Code for packet detection/cross-correlation/carrier frequency is not available. However, the concept of cross-correlation is applied in another chapter dealing with spread spectrum.

Hello sir, is there any book or online material that focuses on how the statistical properties of samples change when Gaussian white noise is added to them? For instance, white noise has a weighted identity covariance matrix, but when it is added to data and we then compute the covariance matrix of the distorted data, what form should it take?

• Nivedita negi

@humourmind:disqus Can you please tell me how to plot a power spectral density bar graph? Using FFT gives a PSD line graph. In my problem I have to calculate the PSD of a signal from 0 to 9 Hz with 0.25 Hz bandwidth and then plot its bar graph. I would really appreciate it if you could provide Matlab code for the same. Thanks.

• yanhui li

OK, thanks. Could you please tell me the definition of the power in the frequency domain of white Gaussian noise?

• The power spectrum of white noise is flat across all frequencies extending from -infinity to +infinity Hertz, as illustrated in the noise power spectral density graph on the right (First Figure).

• yanhui li

Thanks, but what is the expression for the power in the frequency domain? Is it as follows: $$P = \frac{1}{N^2}\sum_{n=1}^{N}\left|X(n)\right|^2$$, where N is the FFT length and X(n) is the Fourier transform of x(n)?

• Yes, that is correct. In Matlab it can be computed as follows:

Z = 1/sqrt(N)*fft(z,[],2); % scaling by sqrt(N)
Pzavg = mean(Z.*conj(Z));  % computing the mean power from the FFT

• Jyothi VBN

Hi Sir,
Can you help me with first-order Gauss-Markov process modeling?

• Yamuna

Sir,
When I compute cov(A), where A is randn(m,n), it is supposed to give a diagonal matrix, but that is not happening. Can you please explain why?

• If you want to test the covariance of white noise sequence, you need to take two realizations of the noise process and find the covariance matrix.

A1 = randn(1,10000); %realization 1 of zero mean, unit variance white noise process
A2 = randn(1,10000); %realization 2 of zero mean, unit variance white noise process

cov(A1,A2)

ans =
0.9909 0.0045
-0.0045 0.9999

The diagonal elements will approach unity as the lengths of the sequences are increased further. This indicates that the variance of the underlying process is close to unity.

• Yamuna

Thank you Sir. That was a piece of information.

• Michael Schwager

Hi, I’m trying to figure out how to convert the last example to an actual noise density with units of V/sqrt(Hz), assuming your random variable is volts sampled at 1kHz (for sake of discussion), to tie together the idea that the standard deviation = RMS = the integration of noise density (ND) across the bandwidth of interest (BW) as is commonly done in opamp or sensor noise analysis. So I’d like to be able to say RMS(volts) = ND * sqrt(BW). I think the value you show (6dBV = 4.0 V), actually has V^2 units, and you haven’t divided by any spectral bandwidth, but since you’re normalizing the BW to 1.0Hz (from -0.5Hz to 0.5Hz), no division is actually necessary in this particular case. So I *think* to convert to the type of noise density I’m looking for, one must divide the mean of the output array by something, then take the square root, to arrive at a single “noise density” figure which is valid for the whole spectrum of interest. For the example you show, I cheat and directly calculate the noise density ND as 2.0/sqrt(500 Hz) = 0.08944 V/sqrt(Hz). Thus 0.08944 * sqrt(500) = 2.0. But this is cheating because I know the answer I want, so I’m not exactly sure how to get there from the PSD array (which is essentially the RMS^2 times some factor). If this makes any sense, can you please shed some light on this? Thanks very much for the great post.