## Signal Enhancement Using LMS and Normalized LMS

This example illustrates one way to use a few of the adaptive
filter algorithms provided in the toolbox, using a signal enhancement
application as an illustration. While there are about 30 different
adaptive filtering algorithms included with the toolbox, this example
demonstrates two of them for adaptation: least mean squares (LMS),
using `adaptfilt.lms`, and normalized LMS (NLMS), using `adaptfilt.nlms`.

The goal is to use an adaptive filter to extract a desired signal from a noise-corrupted signal by filtering out the noise. The desired signal (the output from the process) is a sinusoid with 1000 samples.

    n = (1:1000)';
    s = sin(0.075*pi*n);

To perform the adaptation, you need two signals:

- a reference signal
- a noisy signal that contains both the desired signal and an added noise component

To create a noise signal, assume that the noise `v1` is
autoregressive, meaning that the value of the noise at time *t* depends
only on its previous values and on a random disturbance.

    v = 0.8*randn(1000,1);  % Random noise part
    ar = [1,1/2];           % Autoregression coefficients
    v1 = filter(1,ar,v);    % Noise signal: v1(n) = v(n) - 0.5*v1(n-1)

To generate the noisy signal that contains both the desired
signal and the noise, add the noise signal `v1` to
the desired signal `s`. The noise-corrupted sinusoid `x` is

    x = s + v1;

where `s` is the desired signal and `v1` is the noise. Adaptive
filter processing seeks to recover `s` from `x` by removing `v1`.
To complete the set of signals needed for adaptive filtering, the
adaptation process also requires a reference signal.

Define a moving average signal `v2` that is
correlated with `v1`. This `v2` is
the reference signal for the examples.

    ma = [1, -0.8, 0.4, -0.2];  % Moving-average coefficients
    v2 = filter(ma,1,v);        % Reference signal correlated with v1

Two similar adaptive filters, LMS and NLMS, form the basis of this
example; both are sixth order (seven coefficients). Set the filter
length as a variable in MATLAB® and create the filters.

    L = 7;                   % Filter length (sixth-order FIR)
    hlms = adaptfilt.lms(L);
    hnlms = adaptfilt.nlms(L);

LMS-like algorithms have a step size that determines the amount of correction applied as the filter adapts from one iteration to the next. Choosing the appropriate step size is not always easy, usually requiring experience in adaptive filter design.

A step size that is too small increases the time for the filter to converge on a set of coefficients. This becomes an issue of speed and accuracy.

One that is too large may cause the adapting filter to diverge and never reach convergence. In this case, the issue is stability: the resulting filter might not be stable.

As a rule of thumb, smaller step sizes improve the accuracy with which the filter converges to match the characteristics of the unknown system, at the expense of the time it takes to adapt.
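To see the tradeoff concretely, here is a minimal sketch that compares two LMS filters differing only in step size. The values 0.005 and 0.2 are arbitrary illustrations, not tuned recommendations.

    % Illustrative comparison only; the step sizes are arbitrary choices.
    hslow = adaptfilt.lms(7,0.005);  % Small step size: slow convergence
    hfast = adaptfilt.lms(7,0.2);    % Large step size: fast, possibly erratic
    [~,eslow] = filter(hslow,v2,x);
    [~,efast] = filter(hfast,v2,x);
    plot(n,[eslow,efast])
    legend('Small step size','Large step size')
    xlabel('Time index (n)'); ylabel('Error amplitude');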

The toolbox includes a method, `maxstep`, that determines the
maximum step size suitable for each LMS adaptive filter algorithm
that still ensures that the filter converges to a solution. The step
size is often denoted by µ.

    [mumaxlms,mumaxmselms] = maxstep(hlms,x)
    [mumaxnlms,mumaxmsenlms] = maxstep(hnlms);

    Warning: Step size is not in the range 0 < mu < mumaxmse/2:
    Erratic behavior might result.
    > In adaptfilt.lms.maxstep at 32

    mumaxlms =

        0.2096

    mumaxmselms =

        0.1261

The first output of `maxstep` is the value needed for the mean
of the coefficients to converge, while the second is the value needed
for the mean squared coefficients to converge. Choosing a large step
size often causes large variations from the convergence values, so
it is generally better to choose a smaller step size.

    hlms.StepSize = mumaxmselms/30;    % This can also be set graphically: inspect(hlms)
    hnlms.StepSize = mumaxmsenlms/20;  % This can also be set graphically: inspect(hnlms)

If you know the step size to use, you can set the step size
value with the `step` input argument when you create
your filter.

    hlms = adaptfilt.lms(N,step);  % Adds the step input argument
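For instance, a sketch that reuses the length and the step size derived earlier in this example (the call form follows the signature above; the specific values are just for illustration):

    % Illustrative only: construct the LMS filter with an explicit step size.
    hlms = adaptfilt.lms(7, mumaxmselms/30);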

Now that you have set up the parameters of the adaptive filters,
you are ready to filter the noisy signal. The reference signal, `v2`,
is the input to the adaptive filters, and `x` is the desired signal
in this configuration.

Through adaptation, `y`, the output of the
filters, tries to emulate `x` as closely as possible.

Since `v2` is correlated only with the noise
component `v1` of `x`, it can only
really emulate `v1`. The error signal, that is, the desired
signal `x` minus the actual output `y`, constitutes an estimate
of the part of `x` that is not correlated with `v2`: `s`,
the signal to extract from `x`.

    [ylms,elms] = filter(hlms,v2,x);
    [ynlms,enlms] = filter(hnlms,v2,x);

For comparison, compute the optimal FIR Wiener filter.

    bw = firwiener(L-1,v2,x);  % Optimal FIR Wiener filter
    yw = filter(bw,1,v2);      % Estimate of x using Wiener filter
    ew = x - yw;               % Estimate of actual sinusoid

Plot the resulting denoised sinusoid for each filter (the Wiener filter, the LMS adaptive filter, and the NLMS adaptive filter) to compare the performance of the various techniques.

    plot(n(900:end),[ew(900:end), elms(900:end), enlms(900:end)]);
    legend('Wiener filter denoised sinusoid',...
        'LMS denoised sinusoid','NLMS denoised sinusoid');
    xlabel('Time index (n)');
    ylabel('Amplitude');

As a reference point, include the noisy signal as a dotted line in the plot.

    hold on
    plot(n(900:end),x(900:end),'k:')
    xlabel('Time index (n)');
    ylabel('Amplitude');
    hold off

Finally, compare the Wiener filter coefficients with the coefficients of the adaptive filters. While adapting, the adaptive filters try to converge to the Wiener coefficients.

    [bw.' hlms.Coefficients.' hnlms.Coefficients.']

    ans =

        1.0317    0.8879    1.0712
        0.3555    0.1359    0.4070
        0.1500    0.0036    0.1539
        0.0848    0.0023    0.0549
        0.1624    0.0810    0.1098
        0.1079    0.0184    0.0521
        0.0492   -0.0001    0.0041

Adaptive filters have a `PersistentMemory` property
that you can use to reproduce experiments exactly. By default, `PersistentMemory` is `false`:
the states and the coefficients of the filter are reset before filtering,
and the filter does not remember the results from previous times you
use it.

For instance, the following successive calls produce the same
output when `PersistentMemory` is `false`.

    [ylms,elms] = filter(hlms,v2,x);
    [ylms2,elms2] = filter(hlms,v2,x);

To keep the history of the filter when filtering a new set of
data, enable persistent memory for the filter by setting the `PersistentMemory` property
to `true`. In this configuration, the filter uses
the final states and coefficients from the previous run as the initial
conditions for the next run and set of data.

    [ylms,elms] = filter(hlms,v2,x);
    hlms.PersistentMemory = true;
    [ylms2,elms2] = filter(hlms,v2,x);  % No longer the same

Setting the property value to `true` is useful
when you are filtering large amounts of data that you partition into
smaller sets and then feed into the filter using a for-loop construction,
as sketched below.
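A minimal sketch of that pattern, assuming the 1000-sample signals above and a hypothetical block size `blockLen` that divides the signal length evenly:

    % Illustrative block-processing loop; blockLen is a hypothetical choice.
    hlms.PersistentMemory = true;
    reset(hlms);                   % Start from known initial conditions
    blockLen = 500;
    e = zeros(size(x));
    for k = 1:blockLen:length(x)
        idx = k:k+blockLen-1;      % Samples in the current block
        [~,e(idx)] = filter(hlms,v2(idx),x(idx));
    end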

To analyze the convergence of the adaptive filters, look at the learning curves. The toolbox provides methods to generate the learning curves, but you need more than one iteration of the experiment to obtain significant results.

This demonstration uses 25 sample realizations of the noisy sinusoids.

    n = (1:5000)';
    s = sin(0.075*pi*n);
    nr = 25;                   % Number of realizations
    v = 0.8*randn(5000,nr);
    v1 = filter(1,ar,v);
    x = repmat(s,1,nr) + v1;
    v2 = filter(ma,1,v);

Now compute the mean-square error. To speed things up, compute the error every 10 samples.

First, reset the adaptive filters to avoid using the coefficients they have already computed and the states they have stored.

    reset(hlms);
    reset(hnlms);
    M = 10;  % Decimation factor
    mselms = msesim(hlms,v2,x,M);
    msenlms = msesim(hnlms,v2,x,M);
    plot(1:M:n(end),[mselms,msenlms])
    legend('LMS learning curve','NLMS learning curve')
    xlabel('Time index (n)');
    ylabel('MSE');

In the next plot you see the calculated learning curves for the LMS and NLMS adaptive filters.

For the LMS and NLMS algorithms, functions in the toolbox help you compute the theoretical learning curves, along with the minimum mean-square error (MMSE), the excess mean-square error (EMSE), and the mean value of the coefficients.

MATLAB may take some time to calculate the curves. The figure shown after the code plots the predicted and actual LMS curves.

    reset(hlms);
    [mmselms,emselms,meanwlms,pmselms] = msepred(hlms,v2,x,M);
    plot(1:M:n(end),[mmselms*ones(500,1),emselms*ones(500,1),...
        pmselms,mselms])
    legend('MMSE','EMSE','predicted LMS learning curve',...
        'LMS learning curve')
    xlabel('Time index (n)');
    ylabel('MSE');
