How to calculate standard deviation?


I have 2 signals:



signal2 contains 100 measurements; each measurement is cross-correlated with signal1 (using xcorr) to get a time delay estimate.

So I end up with 100 delays, which I put in the vector delay (1×100), and then calculate the standard deviation.

The expected standard deviation is a few nanoseconds, but I keep getting microseconds, and I cannot figure out what the issue is.

The 2nd plot shows that the delays change every time I calculate them, and they keep increasing. How can I calculate the standard deviation from these delays?
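On the second question, here is a minimal sketch (assuming `delay` is your 1×100 vector of delay estimates in seconds — that name comes from your description, not from posted code). If the delays increase steadily from one measurement to the next (e.g. a clock-rate offset between TX and RX), then std(delay) is dominated by that drift rather than by the estimator's jitter, so it can help to remove the trend first:

```matlab
% delay: 1x100 vector of xcorr-based delay estimates, in seconds (assumed)
sigma_raw    = std(delay);           % includes any systematic drift
residual     = detrend(delay);       % subtract the best-fit straight line
sigma_jitter = std(residual);        % spread around the trend only
```

If sigma_jitter is much smaller than sigma_raw, the microsecond-scale number you are seeing is the drift, not the measurement noise.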

I am also confused about the theoretical difference between the sampled waveform and its continuous-time equivalent. The sampled waveform is not the same as the continuous-time waveform.

The method I posted is sample-based; I believe that if we could work in the continuous-time domain, we could resolve delays down to a few nanoseconds. But I am still thinking about how, and I hope someone can show me.
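One note on why the sample-based method is stuck at microseconds: at Fs = 8 MHz, one sample is 1/8e6 = 125 ns, so an integer-lag xcorr estimate is quantized to 125 ns steps, and the resulting spread can easily look microsecond-scale. A common way to get below one sample is to interpolate around the correlation peak, e.g. with a three-point parabolic fit. A sketch under those assumptions (names follow the script below; this is not your exact code, and it assumes the peak is not at the very first or last lag):

```matlab
% Sub-sample delay estimate via parabolic interpolation of the xcorr peak
[r, lags] = xcorr(x_rx, x_tx);
[~, k] = max(abs(r));                       % integer-lag peak index
y1 = abs(r(k-1)); y2 = abs(r(k)); y3 = abs(r(k+1));
frac = 0.5*(y1 - y3) / (y1 - 2*y2 + y3);    % fractional offset in (-0.5, 0.5)
tau = (lags(k) + frac) / Fs;                % delay in seconds
```

With enough SNR, this kind of peak interpolation can push the resolution well below one sample period, which is what "doing it the continuous way" effectively amounts to here.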

I am not sure I did this right and need help from experienced people; if you see anything wrong or that does not make sense to you, please let me know.

Thank you.

clear all;
close all;
format long;
%% initial values
nsamps = inf;   % read all samples
nstart = 0;     % start at the beginning of the file
Fs = 8e6;       % sample rate (Hz)
F_0 = 520e6;    % carrier frequency (Hz)
c = 3e8;        % speed of light (m/s)
%% input data
file_tx = 'TX.dat';
file_rx = 'RX.dat';
x_tx = readcplx(file_tx, nsamps, nstart);
x_rx = readcplx(file_rx, nsamps, nstart);
%% condition for selected gain