You specify the algorithm using the Adaptation Method drop-down list in the Function Block Parameters dialog box of an adaptive lookup table block. This section discusses the details of these algorithms.

### Sample Mean

`Sample mean` provides the average value of n output data samples and is defined as:

`$\stackrel{^}{y}\left(n\right)=\frac{1}{n}\sum _{i=1}^{n}y\left(i\right)$`

where y(i) is the ith measurement collected within a particular cell. For each input data point, u, the sample mean at the corresponding cell is updated using the output data measurement, y. Instead of accumulating n samples of data for each cell, a recursive relation is used to calculate the sample mean. The recursive expression is obtained as follows:

`$\begin{array}{l}\stackrel{^}{y}\left(n\right)=\frac{1}{n}\left[\sum _{i=1}^{n-1}y\left(i\right)+y\left(n\right)\right]=\frac{n-1}{n}\left[\frac{1}{n-1}\sum _{i=1}^{n-1}y\left(i\right)\right]+\frac{1}{n}y\left(n\right)=\frac{n-1}{n}\stackrel{^}{y}\left(n-1\right)+\frac{1}{n}y\left(n\right)\\ \end{array}$`

where y(n) is the nth data sample.

Defining a priori estimation error as $e\left(n\right)=y\left(n\right)-\stackrel{^}{y}\left(n-1\right)$, the recursive relation can be written as:

`$\stackrel{^}{y}\left(n\right)=\stackrel{^}{y}\left(n-1\right)+\frac{1}{n}e\left(n\right)$`

where $n\ge 1$ and the initial estimate $\stackrel{^}{y}\left(0\right)$ is arbitrary.

In this expression, only the number of samples, n, for each cell, rather than the n data samples themselves, is stored in memory.
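The recursion above can be sketched as follows. This is a minimal illustration, not the block's implementation; the function and variable names are chosen here for clarity.

```python
def update_sample_mean(y_hat_prev, n, y_n):
    """Recursive sample mean: y_hat(n) = y_hat(n-1) + e(n)/n."""
    e_n = y_n - y_hat_prev           # a priori estimation error e(n)
    return y_hat_prev + e_n / n      # updated estimate y_hat(n)

# Per cell, only the current estimate and the sample count are stored.
y_hat, n = 0.0, 0                    # initial estimate y_hat(0) is arbitrary
for y in [2.0, 4.0, 6.0]:
    n += 1
    y_hat = update_sample_mean(y_hat, n, y)
# y_hat is now the ordinary average, (2 + 4 + 6) / 3 = 4.0
```

Note that the result is independent of the arbitrary initial estimate once the first sample arrives, because the first update reduces to $\stackrel{^}{y}\left(1\right)=y\left(1\right)$.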

### Sample Mean with Forgetting

The `Sample mean` adaptation method has infinite memory: past data samples carry the same weight as the most recent sample in calculating the sample mean. `Sample mean (with forgetting)` uses an algorithm with a forgetting factor, or Adaptation gain, that puts more weight on the more recent samples. This algorithm provides robustness against initial response transients of the plant and an adjustable speed of adaptation. `Sample mean (with forgetting)` is defined as:

`$\begin{array}{c}\stackrel{^}{y}\left(n\right)=\frac{1}{{\sum }_{i=1}^{n}{\lambda }^{n-i}}\sum _{i=1}^{n}{\lambda }^{n-i}y\left(i\right)\\ =\frac{1}{{\sum }_{i=1}^{n}{\lambda }^{n-i}}\left[\sum _{i=1}^{n-1}{\lambda }^{n-i}y\left(i\right)+y\left(n\right)\right]=\frac{s\left(n-1\right)}{s\left(n\right)}\stackrel{^}{y}\left(n-1\right)+\frac{1}{s\left(n\right)}y\left(n\right)\\ \end{array}$`

where $\lambda \in \left[0,1\right]$ is the Adaptation gain and $s\left(k\right)={\sum }_{i=1}^{k}{\lambda }^{k-i}$.

Defining a priori estimation error as $e\left(n\right)=y\left(n\right)-\stackrel{^}{y}\left(n-1\right)$, where $n\ge 1$ and the initial estimate $\stackrel{^}{y}\left(0\right)$ is arbitrary, the recursive relation can be written as:

`$\stackrel{^}{y}\left(n\right)=\stackrel{^}{y}\left(n-1\right)+\frac{1}{s\left(n\right)}e\left(n\right)=\stackrel{^}{y}\left(n-1\right)+\frac{1-\lambda }{1-{\lambda }^{n}}e\left(n\right)$`

A small value of λ results in faster adaptation. A value of `0` indicates short memory (the last data sample becomes the table value), and a value of `1` indicates long memory (all data received in a cell are averaged).
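The recursion lends itself to an update that stores only the current estimate and the running weight sum $s\left(n\right)$, which satisfies $s\left(n\right)=\lambda s\left(n-1\right)+1$. The sketch below is illustrative and the names are chosen here, not taken from the block:

```python
def update_forgetting_mean(y_hat_prev, s_prev, y_n, lam):
    """One step of sample mean with forgetting factor lam in [0, 1]."""
    s_n = lam * s_prev + 1.0         # s(n) = sum_{i=1..n} lam**(n-i)
    e_n = y_n - y_hat_prev           # a priori estimation error e(n)
    return y_hat_prev + e_n / s_n, s_n

# lam = 0 keeps only the latest sample; lam = 1 recovers the plain mean.
y_hat, s = 0.0, 0.0                  # initial estimate y_hat(0) is arbitrary
for y in [2.0, 4.0, 6.0]:
    y_hat, s = update_forgetting_mean(y_hat, s, y, lam=0.5)
```

With λ = 0.5 the three samples get weights 0.25, 0.5, and 1, so the result equals the weighted average (0.25·2 + 0.5·4 + 1·6) / 1.75, confirming that the recursion reproduces the defining sum.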