
This example shows how to use the cross-correlation sequence to detect the time delay in a noise-corrupted sequence. The output sequence is a delayed version of the input sequence with additive white Gaussian noise. Create two sequences, one of which is a delayed version of the other. The delay is 3 samples. Add an N(0, 0.3²) white Gaussian noise sequence to the delayed signal. Use the sample cross-correlation sequence to detect the lag.

Create and plot the signals. Set the random number generator to the default settings for reproducible results.

```matlab
rng default                 % default seed for reproducible results
x = triang(20);             % 20-sample triangular pulse
% Delay x by 3 samples and add N(0,0.3^2) white Gaussian noise
y = [zeros(3,1); x] + 0.3*randn(length(x)+3,1);

subplot(211)
stem(x,'markerfacecolor',[0 0 1])
axis([0 22 -1 2])
subplot(212)
stem(y,'markerfacecolor',[0 0 1])
axis([0 22 -1 2])
```
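The same signals can be sketched outside MATLAB. This is a hedged Python/NumPy equivalent, assuming SciPy is available; `scipy.signal.windows.triang` plays the role of MATLAB's `triang`, and the seed value 0 is an arbitrary choice standing in for `rng default`.

```python
import numpy as np
from scipy.signal.windows import triang  # SciPy's counterpart of MATLAB's triang

rng = np.random.default_rng(0)           # fixed seed (arbitrary), for reproducibility

x = triang(20)                           # 20-sample triangular pulse
delay = 3
# Delay x by 3 samples, then add N(0, 0.3^2) white Gaussian noise.
y = np.concatenate([np.zeros(delay), x]) + 0.3 * rng.standard_normal(len(x) + delay)
```

Plotting is omitted here; `matplotlib.pyplot.stem` would reproduce the stem plots above.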

Obtain the sample cross-correlation sequence and use the maximum absolute value to estimate the lag. Plot the sample cross-correlation sequence.

```matlab
[xc,lags] = xcorr(y,x);     % sample cross-correlation of y with x
[~,I] = max(abs(xc));       % index of the maximum absolute value
fprintf('Maximum cross-correlation sequence value occurs at lag %d\n',lags(I))

figure
stem(lags,xc,'markerfacecolor',[0 0 1])
```

The maximum of the cross-correlation sequence occurs at lag 3, as expected.
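The lag-estimation step can likewise be sketched in Python with SciPy, where `scipy.signal.correlate` plus `scipy.signal.correlation_lags` together mirror MATLAB's `[xc,lags] = xcorr(y,x)`. The helper name `estimate_delay` is hypothetical, and the demo below uses a noise-free delayed pulse so the recovered lag is exact.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

def estimate_delay(y, x):
    """Lag at which the cross-correlation of y with x has maximum absolute value."""
    xc = correlate(y, x, mode='full')                     # like xcorr(y, x)
    lags = correlation_lags(len(y), len(x), mode='full')  # lag axis for xc
    return int(lags[np.argmax(np.abs(xc))])

# Noise-free check: a pulse delayed by 3 samples peaks at lag 3.
x = np.bartlett(20)
y = np.concatenate([np.zeros(3), x])
print(estimate_delay(y, x))  # 3
```

With noise added, as in the example above, the peak can in principle be perturbed, but for this pulse and noise level the estimate is reliably the true delay.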
