
What is preventing me from achieving delay accuracy better than a microsecond?

Nate Duong 49 Rep.

I have two perfect (noise-free) data sets, one from the transmitter and one from the receiver. From these two data sets I can estimate the delay:

Fs = 8e6;                                  % sample rate [Hz]

for i = 1:2
    % cross-correlate the i-th received signal with the transmitted signal
    [cc_correlation, lag] = xcorr(signal2(i), signal1);

    % index of the largest correlation magnitude
    [cc_maximum, cc_time] = max(abs(cc_correlation));

    % convert the peak index to a lag in samples
    % (zero lag sits at index length(signal1) when the signals have equal length)
    cc_estimation = abs(length(signal1) - cc_time);

    % lag in samples -> delay in seconds
    delay(i) = cc_estimation / Fs;
end

This gives me two delays: 11 microseconds and 13.875 microseconds.
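In case it helps, here is a minimal self-contained version of the same pipeline with synthetic data (a chirp as a stand-in transmit waveform, and made-up whole-sample delays of 88 and 111 samples, which reproduce the numbers above):

% Synthetic illustration only: the waveform and delays are made up.
Fs = 8e6;                                  % sample rate [Hz]
N  = 1000;
t  = (0:N-1) / Fs;
signal1 = chirp(t, 0, t(end), 1e6);        % stand-in transmit waveform
trueDelaySamples = [88, 111];              % 11 us and 13.875 us at 8 MHz

delay = zeros(1, 2);
for i = 1:2
    % received copy = transmit waveform shifted by a whole number of samples
    signal2_i = [zeros(1, trueDelaySamples(i)), signal1(1:end - trueDelaySamples(i))];
    [cc_correlation, lag] = xcorr(signal2_i, signal1);
    [cc_maximum, cc_time] = max(abs(cc_correlation));
    cc_estimation = abs(length(signal1) - cc_time);
    delay(i) = cc_estimation / Fs;
end
disp(delay * 1e6)                          % prints 11.000 and 13.875 (microseconds)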

I expected nanosecond resolution from this function, because from the sampling rate I can see that the sampling period is T = 1/Fs = 125 ns. Therefore I thought the delay should be resolved in nanoseconds, not in microseconds as I obtained.
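Checking how my numbers relate to the sampling period:

Ts = 1 / 8e6;          % sampling period = 125 ns
11e-6 / Ts             % = 88 samples
13.875e-6 / Ts         % = 111 samples

so both measured delays are whole multiples of the 125 ns sampling period.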

When I call the MATLAB function above:

[cc_maximum, cc_time] = max(abs(cc_correlation));
it returns two values, cc_maximum and cc_time, where cc_time is expressed in samples (it is an index into cc_correlation).
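If I read the documentation correctly, cc_time is just an index into cc_correlation, and the lag output converts that index to a lag in samples, so for equal-length signals these two lines should give the same value:

cc_estimation = abs(length(signal1) - cc_time);   % what I do now
cc_estimation = abs(lag(cc_time));                % same value, via the lag output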

What did I do wrong in this algorithm?

My professor also said: "You don't have the function, you have a sampled version of the function. The cross-correlation is the waveform of a continuous function, and it has a maximum at some instant in continuous time. When you work with samples, you don't have the entire waveform; you only know the value of the function at discrete instants of time."

and I still do not understand what he means.
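Here is a small test I wrote while trying to understand his comment (I am not sure this is what he means): I delay a signal by a fractional number of samples using an FFT phase shift, and the cross-correlation peak still lands on a whole sample.

% My own test, not my real data: chirp test signal, circular fractional delay.
Fs = 8e6;
N  = 1000;
t  = (0:N-1) / Fs;
x  = chirp(t, 0, t(end), 1e6);
d  = 88.4;                                   % "true" delay in samples (fractional)
X  = fft(x);
k  = [0:N/2, -N/2+1:-1];                     % FFT bin indices for even N
y  = real(ifft(X .* exp(-1j*2*pi*k*d/N)));   % x delayed (circularly) by d samples

[cc, lag] = xcorr(y, x);
[~, idx]  = max(abs(cc));
lag(idx)                                     % always a whole number of samples, never 88.4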

I hope someone can help me out.

Thank you.
