Hi John,
Thank you for your reply. Your posts have helped me before and you consistently give outstanding advice to this newsgroup. Many thanks for sharing your wisdom.
Let me clarify and give you the "whole story" behind my original post. Perhaps you could kindly provide input...
I gave a degenerate example in my post here just to understand the interface for fminsearch. I absolutely agree with everything you said with regard to the technical content (or lack thereof) of my post; I was using a very simple numerical example just to figure out how to use fminsearch. I thought more people would reply to a simpler example!
The real problem I am trying to solve is fitting a linear regression model (multiple linear regression, in my case) to some real data. Instead of minimizing the mean square error, as is traditionally done and which gives a very simple solution, I wish to minimize the dispersion component of the RMSE (i.e., to minimize phase errors), given by
RMSE_disp = sqrt(2*std(p_hat)*std(p)*(1-cc(p_hat,p))),
where ccm = corrcoef(p_hat,p) and cc = ccm(2,1). I parameterize p_hat = X*beta, where p (m x 1) and X (m x n) are observed and beta (n x 1) is the vector of coefficients over which the minimization is performed.
I did a quick analysis using a surface plot of RMSE_disp as a function of std(p_hat)*std(p) and cc(p_hat,p), and it looks to be convex. In lieu of trudging through an analytical solution using the derivative of RMSE_disp w.r.t. beta, I thought to use fminsearch to minimize it numerically.
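To make the setup concrete, here is a minimal sketch of the minimization. It is written in Python/NumPy (scipy.optimize.fmin is a Nelder-Mead simplex search analogous to fminsearch); the data are synthetic stand-ins for the real X and p, and all names are illustrative:

```python
import numpy as np
from scipy.optimize import fmin  # Nelder-Mead simplex search, analogous to fminsearch

rng = np.random.default_rng(0)

# Synthetic stand-ins for the observed data: X is (m x n), p is (m x 1)
m, n = 200, 3
X = rng.normal(size=(m, n))
p = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=m)

def rmse_disp(beta, X, p):
    # Dispersion component of the RMSE: sqrt(2*std(p_hat)*std(p)*(1 - cc))
    p_hat = X @ beta
    cc = np.corrcoef(p_hat, p)[0, 1]
    return np.sqrt(2.0 * np.std(p_hat) * np.std(p) * (1.0 - cc))

# Ordinary least-squares fit used only as a starting guess for the simplex search
beta0 = np.linalg.lstsq(X, p, rcond=None)[0]
beta_opt = fmin(rmse_disp, beta0, args=(X, p), disp=False)
```

Starting the simplex search from the least-squares solution keeps it in a sensible region of coefficient space; the search then adjusts beta to reduce the dispersion term rather than the full MSE.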
What do you think of such an approach?
Best regards,
Evan
"John D'Errico" <woodchips@rochester.rr.com> wrote in message <i1isd1$cgh$1@fred.mathworks.com>...
> "Evan Ruzanski" <ruzanski.02@engr.colostate.edu> wrote in message <i1hcqf$gph$1@fred.mathworks.com>...
> > Hello,
> >
> > I'm trying to find linear regression coefficients based on maximizing the correlation coefficient between a set of observed predictors (X) and observations (y) (instead of minimizing the LSE as is done with standard linear regression, i.e., b = [(X^T X)^-1]X^T y). In other words, I'm trying to find an optimum set of predictor coefficients based on minimizing a different cost function.
>
>
> DON'T do this.
>
> To start with, you do not even know how to
> solve the regression problem. This is a TERRIBLE
> line of code:
>
> b = [(X'*X)^-1]*X'*y;
>
> It uses a matrix inverse instead of backslash. It
> squares the condition number, making the
> problem more ill-conditioned than it should be.
>
> The correct way to solve that problem is:
>
> b = X\y;
>
> If that fails, then the use of fminsearch to solve
> your problem is (I'm sorry to say this) laughable.
> Fminsearch will not be able to resolve a poorly
> conditioned least squares problem more accurately
> than backslash.
>
> If you still have problems, then the next thing
> to do is to learn to use rescaling or another
> transformation of your problem to improve the
> conditioning. Better, recognize when the problem
> is simply a result of terribly generated data, that
> will never be adequate for estimation as it is.
>
> John
