### Abstract

This report studies the characteristics of the noise received by a Software Defined Radio (SDR) and the characteristics of a signal transmitted from one SDR to another using Intermediate Frequency (IF) and Radio Frequency (RF).


## Noise and its characteristics:

### 1.1 Introduction:

Noise may be defined as any unwanted signal that interferes with the communication, measurement or processing of an information-bearing signal. Noise is present to varying degrees in almost all environments. Noise can cause transmission errors and may even disrupt a communication process; hence noise processing is an important part of modern telecommunication and signal processing systems. Depending on its frequency or time characteristics, a noise process can be classified into one of several categories as follows:

1. Narrowband noise: A noise process with a narrow bandwidth such as a 50/60 Hz hum from the electricity supply.

2. White noise: Purely random noise that has a flat power spectrum. White noise theoretically contains all frequencies in equal intensity.

3. Band-limited white noise: A noise with a flat spectrum and a limited bandwidth that usually covers the limited spectrum of the device or the signal of interest.

4. Coloured noise: Non-white noise, or any wideband noise whose spectrum has a non-flat shape; examples are pink noise, brown noise and autoregressive noise.

5. Impulsive noise: Consists of short-duration pulses of random amplitude and random duration.

The noise received by the SDR in our lab is coloured noise.

### 1.2 Coloured Noise

Although the concept of white noise provides a reasonably realistic, mathematically convenient and useful approximation to some predominant noise processes encountered in telecommunication systems, many other noise processes are non-white. The term coloured noise refers to any broadband noise with a non-white spectrum. This characteristic is studied by computing the autocorrelation of the noise received by the SDR and its Power Spectral Density (PSD), which is the Discrete Fourier Transform of the autocorrelated samples.
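As an illustration of the difference between white and coloured noise, the NumPy sketch below (a hypothetical example, not the lab's recorded data) colours white noise with a simple 16-tap moving-average filter and compares low- and high-frequency power; the filter choice is ours, purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# White noise: flat spectrum, uncorrelated samples.
white = rng.standard_normal(8192)

# One simple way to "colour" noise is to low-pass filter white noise;
# a 16-tap moving average is an illustrative choice, not the lab's filter.
h = np.ones(16) / 16
coloured = np.convolve(white, h, mode="same")

# Compare spectral flatness: the coloured noise concentrates power at low
# frequencies, so its high-frequency bins are much weaker.
W = np.abs(np.fft.rfft(white)) ** 2
C = np.abs(np.fft.rfft(coloured)) ** 2
print(W[: len(W) // 8].mean() / W[-len(W) // 8 :].mean())  # near 1 (flat)
print(C[: len(C) // 8].mean() / C[-len(C) // 8 :].mean())  # much greater than 1
```

For white noise the low-band/high-band power ratio hovers around 1; after filtering it grows by orders of magnitude, which is exactly the non-flat spectrum that defines coloured noise.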

### 1.3 Power Spectral Density

The power spectral density (PSD) function shows the strength of the variations (energy) of a signal as a function of frequency. In other words, it shows at which frequencies variations are strong and at which frequencies they are weak. The unit of PSD is energy per unit frequency, and the energy within a specific frequency range is obtained by integrating the PSD over that range. The PSD can be computed either directly from the FFT of the signal or by computing the autocorrelation function and then transforming it. The PSD is a very useful tool for identifying oscillatory signals in time-series data and determining their amplitude.

For example, suppose we are operating a factory with many machines, some of which have motors inside, and we detect unwanted vibrations from somewhere. We might get a clue to locating the offending machines by looking at the PSD, which would give us the frequencies of the vibrations.

### 1.3.1 Calculating Power Spectral Density

For communications signals, the energy is effectively infinite (the signals are of unlimited duration), so we usually work with power quantities. So, we find the average power by averaging over time:

\[ P = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} |x(t)|^2 \, dt = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_x(\omega) \, d\omega \]

where \(S_x(\omega)\) is the Power Spectral Density (PSD).
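In discrete time, the time average above can be approximated by averaging periodograms of successive signal segments (a basic Welch-style estimate). The sketch below is a minimal illustration under our own assumptions; the segment length and rectangular window are our choices, not the report's:

```python
import numpy as np

def psd_estimate(x, fs, nfft=256):
    """Estimate the PSD by averaging periodograms of successive
    non-overlapping segments (rectangular window for simplicity)."""
    segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
    # |FFT|^2 / (fs * N) gives power per unit frequency for one segment;
    # averaging over segments approximates the time average.
    periodograms = [np.abs(np.fft.fft(s)) ** 2 / (fs * nfft) for s in segs]
    Sx = np.mean(periodograms, axis=0)
    f = np.fft.fftfreq(nfft, d=1 / fs)
    return f, Sx

# Sanity check on unit-variance white noise: integrating the PSD over all
# frequencies (sum over bins times the bin width fs/N) should recover the
# average power, which is close to 1 here.
rng = np.random.default_rng(1)
x = rng.standard_normal(65536)
f, Sx = psd_estimate(x, fs=1.0)
print(Sx.sum() * (1.0 / 256))  # total power, close to 1
```

The sanity check is Parseval's relation in disguise: summing the PSD over all bins times the bin width returns the mean-square value of the signal.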

### 1.4 Fast Fourier Transform

The Discrete Fourier Transform (DFT) is computed using the Fast Fourier Transform (FFT). In Matlab, vector indices for an N-point vector are numbered from 1 to N; starting at 0, this corresponds to the range 0 to N-1. The frequency resolution may be seen from the fact that the N points span the frequencies from 0 to \(F_s\), where \(F_s\) is the sampling frequency, so the resolution of each component is \(\frac{F_s}{N}\).

The first component, number 1, is actually the zero-frequency or “DC” component.

The true frequency of component 2 is \(1 \times \frac{F_s}{N} \)

The true frequency of component 3 is \(2 \times \frac{F_s}{N} \)

…

The true frequency of component \(\frac{N}{2}+1 \) is \(\frac{N}{2} \times \frac{F_s}{N}\)
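The bin-to-frequency mapping above can be checked numerically. The sampling rate and test tone below are arbitrary illustrative values chosen so the tone lands exactly on a bin:

```python
import numpy as np

fs = 1000.0  # assumed sampling rate (Hz)
N = 64       # FFT length
n = np.arange(N)

# A 125 Hz test tone: 125 = 8 * fs/N, so it should land exactly in bin 8
# (Matlab component 9, since Matlab indexes from 1).
x = np.cos(2 * np.pi * 125.0 * n / fs)
X = np.abs(np.fft.fft(x))

k = int(np.argmax(X[: N // 2 + 1]))  # strongest bin among 0 .. N/2
print(k, k * fs / N)                 # prints: 8 125.0
```

Multiplying the zero-based bin index by \(F_s/N\) recovers the tone's true frequency, matching the component table above.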

### 1.5 Auto-Correlation

The term correlation means, in general, the similarity between two sets of data. Autocorrelation is the cross-correlation of a signal with itself. It is generally used to detect the presence of a periodic signal that has been buried under noise.

### 1.5.1 Calculating Auto-correlation

The autocorrelation of a sampled signal \(x[n]\) of length \(N\) is defined as the averaged product of the signal with a delayed version of itself:

\[ R_{xx}(k) = \frac{1}{N} \sum_{n=0}^{N-1-k} x[n] \, x[n+k] \tag{1.4} \]

where \(k\) is the lag or delay. The correlation may be normalised by the mean square of the signal, \(R_{xx}(0)\), giving:

\[ \rho_{xx}(k) = \frac{R_{xx}(k)}{R_{xx}(0)} \tag{1.5} \]

The subscript \(xx\) denotes that the signal is multiplied by a delayed version of itself. Evaluating the correlation for a number of values of \(k\) gives a set of correlation values, i.e. an autocorrelation vector.
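A minimal NumPy sketch of equations (1.4) and (1.5), applied to a hypothetical sinusoid buried in noise (our own test signal, not the SDR capture), shows how the periodicity survives in the autocorrelation:

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased sample autocorrelation R_xx(k) (eq. 1.4) and its
    normalised form rho_xx(k) = R_xx(k) / R_xx(0) (eq. 1.5)."""
    N = len(x)
    R = np.array([np.sum(x[: N - k] * x[k:]) / N for k in range(max_lag + 1)])
    return R, R / R[0]

# A period-32 sinusoid buried in strong noise: the periodicity is hard to
# see in the raw samples but reappears as a peak at lag 32.
rng = np.random.default_rng(2)
n = np.arange(4096)
x = np.sin(2 * np.pi * n / 32) + 2.0 * rng.standard_normal(n.size)
R, rho = autocorr(x, 64)
print(rho[0])              # 1.0 by construction
print(rho[32] > rho[16])   # autocorrelation peaks at one full period
```

At lag 16 (half a period) the sinusoid's contribution is negative, while at lag 32 (a full period) it is positive, which is why the comparison holds despite the noise.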

### 1.6 Autocorrelation of noise received by SDR

The noise received by the SDR using RF was autocorrelated using equations (1.4) and (1.5). The noise and its autocorrelation are shown in Fig. 1.1.

The first part of the plot shows the samples of noise received by the SDR, with the sample number on the X-axis and the amplitude of the samples on the Y-axis. The noise was expected to be random, but we can see a certain periodicity in the samples.

The second part of the plot is the autocorrelation computed using (1.4) and (1.5) on the samples shown in the first part. For coloured noise, this autocorrelation plot was generally expected to have a sinc-like shape. The last part of the plot is the PSD of the noise, which gives the power of the signal (noise) as a function of frequency. The two spikes in the plot are due to the somewhat periodic nature of the noise.

These plots are just like those discussed above, except that this noise was received using IF.