I have some data (x, y) (no uncertainties on either x or y) that very nicely follows an exponentially decaying function. I use "nlinfit" to fit my model to the data, and when I visually inspect the fit, there is very good agreement between the fitted curve and the data.
However, the uncertainties on my parameters are huge (on the order of 1e7, whereas the average y-value is around 7-8), and I don't know why. Here is the command I use:
opts = statset('MaxIter', 600, 'Display', 'iter', 'TolFun', 1e-10);
[b, r, J, COVB, mse] = nlinfit(data(M:end-N, 1), data(M:end-N, 2), model, guess, opts);
Nothing out of the ordinary. Does anyone have a good idea of what might be wrong?
I wanted to upload my data and m-file for you to test, but apparently that functionality is not present.
The only reason I can think of for the large parameter variances (with all parameter estimates significantly different from zero) together with a small MSE is that you have a large number of parameters relative to the size of the data set. If that is the problem, creating a second data set by interpolating the first to generate more points and then fitting that should give approximately the same parameter estimates and MSE, but with smaller variances; see the sketch below.
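If you want to try that quickly, something like this should work (a minimal sketch reusing the variables from your call; the ten-fold densification factor and the 'pchip' interpolation method are just illustrative choices):

x = data(M:end-N, 1);
y = data(M:end-N, 2);
xi = linspace(min(x), max(x), 10*numel(x))';   % denser grid over the same range
yi = interp1(x, y, xi, 'pchip');               % shape-preserving interpolation
opts = statset('MaxIter', 600, 'TolFun', 1e-10);
[b2, r2, J2, COVB2, mse2] = nlinfit(xi, yi, model, guess, opts);
se2 = sqrt(diag(COVB2));                       % standard errors after the refit

If b2 and mse2 stay about the same while se2 shrinks, the small sample size is the likely culprit.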
You can upload your data set and code by pasting them into your original post (‘Edit’ at this point) or adding them as a comment. (If your data set is large, please provide a representative sample of it, including the start and end.)
EDIT: The other reason for large variances in the situation you describe, especially with parameter estimates significantly different from zero, is that the estimated parameter values themselves are large.
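A quick way to check whether this is what you are seeing is to compare the standard errors with the magnitudes of the estimates (a minimal sketch using the outputs of your existing nlinfit call):

se  = sqrt(diag(COVB));              % absolute standard errors of b
rel = se ./ abs(b(:));               % standard errors relative to the estimates
ci  = nlparci(b, r, 'covar', COVB);  % 95% confidence intervals

If rel is small even though se is large, the parameters are actually well determined; the variances only look alarming because the parameter values themselves are large.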