Hi all! Since polyfit and corrcoef do not remove NaNs, I am trying to remove the NaNs first and then do the correlation.
The situation is the following: I have two variables, one observed and one forecasted. Both have 17 columns (one per forecast time) and approximately 9,500 rows.
What I know is that I need to check for NaNs in either of them and then remove that row.
I tried this (my variables are rh_media_3 and hr_ref2):
tmp = [rh_media_3(:,i) hr_ref2(:,i)];
rowsWithNaN = any(isnan(tmp), 2);
tmp(rowsWithNaN, :) = [];               % remove rows with a NaN in either column
p = polyfit(tmp(:,1), tmp(:,2), 1);
pendiente_hr(i) = p(1);
temp = corrcoef(tmp(:,1), tmp(:,2));
coef_lineal_hr(i) = temp(2,1);
end
No matter what I do I get coefficients greater than 1, which is not real. Can anyone help please?
To compute the R-square you have to compute the two SS terms, SSe and SSt (the SS-error and SS-total, respectively).
If you have a fit y = p(x), then from
yhat = polyval(p,x);           % fit results
ye   = y - yhat;               % residuals (error)
SSe  = sum(ye.^2);             % SS error
SSt  = (length(y)-1)*var(y);   % SS total
Rsq  = 1 - SSe/SSt;            % R-square
Try that instead of comparing the slope and expecting it to be some specific value. (Substitute your appropriate variables for x,y and p, obviously)
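A minimal sketch putting the NaN removal and the R-square computation together, per column. It reuses the asker's variable names; the assumption that rh_media_3 (observed) and hr_ref2 (forecast) are same-size matrices comes from the question:

```
% Sketch: per-column linear fit with NaN rows removed, then R-square.
% Assumes rh_media_3 and hr_ref2 are equally sized (approx. 9500 x 17).
nCols = size(rh_media_3, 2);
Rsq = nan(1, nCols);
for i = 1:nCols
    x = rh_media_3(:, i);
    y = hr_ref2(:, i);
    keep = ~any(isnan([x y]), 2);    % rows with no NaN in either column
    x = x(keep);
    y = y(keep);
    p = polyfit(x, y, 1);            % linear fit on the cleaned data
    yhat = polyval(p, x);            % fitted values
    SSe = sum((y - yhat).^2);        % SS error
    SSt = (length(y) - 1) * var(y);  % SS total
    Rsq(i) = 1 - SSe/SSt;            % R-square, bounded by 1
end
```

Unlike the slope p(1), Rsq here cannot exceed 1, which is the quantity the question seems to be after.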
If the concern is over the slopes of the linear fit: nothing limits their magnitude; the slope depends wholly on how the response variable (hr_ref2) varies with the independent variable (rh_media_3).
If the forecast response is preferentially biased high against the observed value, that's exactly what one would expect.
Plotting the data and the fitted line would probably reveal a lot.
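For example, a quick look at one column (column 1 chosen arbitrarily here) might be sketched as:

```
% Sketch: scatter of one column plus its fitted line, NaN rows removed.
i = 1;                               % column to inspect (arbitrary choice)
x = rh_media_3(:, i);
y = hr_ref2(:, i);
keep = ~any(isnan([x y]), 2);
x = x(keep);
y = y(keep);
p = polyfit(x, y, 1);
plot(x, y, '.', x, polyval(p, x), 'r-')
xlabel('Observed (rh\_media\_3)')
ylabel('Forecast (hr\_ref2)')
title(sprintf('Column %d: slope = %.3f', i, p(1)))
```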