MATLAB Examples

# Demo file for batchpleas.m

batchpleas is a wrapper for lsqnonlin that solves many small problems (all sharing the same parameterization) in one batched, partitioned nonlinear least squares estimation. Batching exploits economies of scale to achieve higher overall throughput, and the gain can be dramatic.

John D'Errico woodchips@rochester.rr.com 4/20/2010

## Generate some fake data to test out the code

I will pretend here that I have 10000 separate models to estimate, where the generic model will have the form:

`y = a1 + a2*exp(-a3*x)`

Here a1, a2, a3 are parameters to be estimated using nonlinear techniques. n = 10000 separate problems, with m = 10 data points in each problem.
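The efficiency of batchpleas comes from the fact that this model is separable: for any fixed value of the nonlinear parameter a3, the model is linear in a1 and a2, so those two can be recovered by a cheap linear solve. A minimal sketch of that idea for a single curve (the variable names here are illustrative, not part of the demo):

```matlab
% For a fixed guess at the nonlinear parameter a3, the model
% y = a1 + a2*exp(-a3*x) is linear in [a1; a2], so a single
% backslash solve recovers them in a least squares sense.
x = rand(10,1);
y = 1.5 + 2*exp(-0.7*x);        % data from known coefficients
a3 = 0.7;                       % pretend the nonlinear parameter is known
A = [ones(size(x)), exp(-a3*x)];
ab = A\y;                       % ab(1) near 1.5, ab(2) near 2
```

With the linear parameters eliminated this way, the nonlinear optimizer only has to search over a3, which is what makes the batched solve cheap.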

```
m = 10;
n = 10000;
Xdata = rand(n,m);
```

To generate the data so we can compare our estimates with known values we will select some numbers for these coefficients in advance.

```
knowncoefs = 1 + rand(n,3);
```

Evaluate the model to get the target values, then add just a wee bit of random noise. bsxfun is a good way to evaluate all of these functions in one (rather long) expression.

```
Ydata = bsxfun(@plus,knowncoefs(:,1), ...
  bsxfun(@times,knowncoefs(:,2), ...
  exp(-bsxfun(@times,knowncoefs(:,3),Xdata))));
Ydata = Ydata + randn(n,m)/100;
Xdata = reshape(Xdata',[],1);
Ydata = reshape(Ydata',[],1);
```

And generate a sub-problem identifier to distinguish the separate problems. This index must be a list of integers from the set [1,2,3,...,np], that distinguish data points between the different sub-problems. There is no requirement that each sub-problem have the same number of data points as the rest of the sub-problems.

```
batchindex = kron(1:n,ones(1,m));
```
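Here every sub-problem happens to have exactly m points, but as noted above that is not required. For example (an illustrative index, not used anywhere below), a batchindex like this mixes sub-problems of different sizes:

```matlab
% Sub-problem 1 has 3 points, sub-problem 2 has 1, and
% sub-problem 3 has 2. The values need not appear in sorted order.
unevenindex = [1 1 3 2 1 3];
```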

## First, solve using a loop around lsqnonlin as a comparison

```
tic
modelfun = @(a,Xk,Yk) a(1) + a(2)*exp(-a(3)*Xk) - Yk;
estcoefs = zeros(n,3);
param0 = [1 1 1];
opts = optimset('lsqnonlin');
opts.Display = 'off';
for i = 1:n
  k = (batchindex == i);
  Xk = Xdata(k);
  Yk = Ydata(k);
  estcoefs(i,:) = lsqnonlin(@(C) modelfun(C,Xk,Yk),param0,[],[],opts);
end
toc
```
```
Elapsed time is 143.142561 seconds.
```

## Next, solve using batchpleas

Note that we need to provide starting values for the nonlinear parameter, i.e., a3. A batch size of 250 was used here; the optimal batch size will vary with the problem. (A roughly 10 to 1 speedup over the loop is common.)

```
tic
INLPstart = 1;
funlist = {1, @(c,X) exp(-c*X)};
batchsize = 250;
[INLP,ILP] = batchpleas(Xdata,Ydata,batchindex,funlist,INLPstart,batchsize);
bpcoefs = [ILP',INLP'];
toc
```
```
Elapsed time is 11.339750 seconds.
```
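Since the data were generated from knowncoefs, we can also sanity-check the batched fit against the true coefficients. This check is an addition to the demo; note the column order follows bpcoefs = [ILP',INLP'], i.e., [a1 a2 a3]:

```matlab
% Error of each batched estimate relative to the generating
% coefficients (columns: [a1 a2 a3]). Most rows are tiny, but a
% few unconstrained fits can drift far from the true values.
errs = abs(bpcoefs - knowncoefs);
median(errs)   % typical accuracy per coefficient
max(errs)      % worst case over all 10000 curves
```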

## Bound constraints on the parameters

We can see that the estimated parameters varied over quite a wide range here.

```
min(bpcoefs)
max(bpcoefs)
```
```
ans =
  -1.51751064039364   0.875201384784349   0.215052328982282

ans =
   2.21908409109594   4.78801703187211    2.95596454541221
```

Remember that the model was `y = a1 + a2*exp(-a3*x)`

For example, look at how each of the parameters varied using histograms.

```
hist(bpcoefs(:,1),100)
title 'a1 parameter'
figure
hist(bpcoefs(:,2),100)
title 'a2 parameter'
figure
hist(bpcoefs(:,3),100)
title 'a3 parameter'
```

We can be confident that a negative rate parameter would make no sense. Likewise, the other parameters in this model must always be non-negative.

Sadly, the speedup is not as great for this solve, because batchpleas is forced to use lsqlin as the linear sub-solver when there are constraints on the linear parameters.

```
tic
INLPlb = 0;
INLPub = [];
ILPlb = [0 0];
ILPub = [];
batchsize = 1000;
[INLP,ILP] = batchpleas(Xdata,Ydata,batchindex,funlist,INLPstart, ...
  batchsize,[],INLPlb,INLPub,ILPlb,ILPub);
bpcoefs2 = [ILP',INLP'];
toc
```
```
Elapsed time is 42.784164 seconds.
```
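One way to see why the bounds matter (a check added here, not part of the original demo) is to count how many of the unconstrained estimates violated them:

```matlab
% How many of the unconstrained fits produced a parameter that
% the bounds would have disallowed (any coefficient < 0)?
sum(any(bpcoefs < 0, 2))
```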

The bounded parameters are now much better behaved.

```
hist(bpcoefs2(:,1),100)
title 'a1 parameter (bounded)'
figure
hist(bpcoefs2(:,2),100)
title 'a2 parameter (bounded)'
figure
hist(bpcoefs2(:,3),100)
title 'a3 parameter (bounded)'
```

## A problem with 2 nonlinear parameters, a sum of exponentials

Make up some random data.

```
xdata = rand(100000,1);
ydata = exp(xdata*0.4) + exp(-xdata/2) + randn(size(xdata))/1000;
```

Randomly assign those points to 1000 different curves, so there are 100 points per curve on average. Sums of exponential models are notoriously difficult to estimate well.

```
batchindex = ceil(rand(size(xdata))*1000);
```

Create a sum of exponentials model, and fit all of the curves at once.

```
funlist = {@(c,X) exp(-c(1)*X),@(c,X) exp(-c(2)*X)};
[INLP,ILP] = batchpleas(xdata,ydata,batchindex,funlist,[-.1,.1]);
figure
hist(INLP(1,:),20)
title 'Nonlinear parameter #1 - nominally -0.4'
figure
hist(INLP(2,:),20)
title 'Nonlinear parameter #2 - nominally 0.5'
```

The true nonlinear parameters were [-0.4, 0.5] for every curve.

```
mean(INLP,2)
```
```
ans =
  -0.401047416658448
   0.503252965493004
```

The true linear parameters were [1, 1].

```
mean(ILP,2)
```
```
ans =
   1.00050370239409
   0.999501552491479
```