Model II regression should be used when the two variables in the regression equation are random and subject to error, i.e. not controlled by the researcher. Model I regression using ordinary least squares underestimates the slope of the linear relationship between the variables when they both contain error. According to Sokal and Rohlf (1995), the subject of Model II regression is one on which research and controversy are continuing and definitive recommendations are difficult to make.
GMREGRESS is a Model II procedure. It standardizes the variables before the slope is computed: each of the two variables is transformed to have a mean of zero and a standard deviation of one. The resulting slope is the geometric mean of the slope of the regression of Y on X and the reciprocal of the slope of the regression of X on Y. Ricker (1973) coined this term and gives an extensive review of Model II regression.
[B,BINTR,BINTJM] = GMREGRESS(X,Y,ALPHA) returns the vector B of regression coefficients in the linear Model II and the matrices BINTR and BINTJM of the given confidence intervals for B, computed by the Ricker (1973) procedure and the Jolicoeur and Mosimann (1968)/McArdle (1988) procedure, respectively.
GMREGRESS treats NaNs in X or Y as missing values, and removes them.
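The core computation described above (standardize, take the geometric-mean slope, drop NaN pairs) can be sketched outside MATLAB as well. The following is a minimal Python/NumPy illustration of the same idea, not a transcription of gmregress itself; it uses the equivalent closed form slope = sign(r) * std(y)/std(x):

```python
import numpy as np

def gm_slope_intercept(x, y):
    """Sketch of geometric mean (reduced major axis) regression.

    The slope is the geometric mean of the OLS slope of y on x and the
    reciprocal of the OLS slope of x on y, signed by the correlation;
    equivalently sign(r) * std(y) / std(x).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Drop pairs where either value is NaN, mirroring the missing-value
    # handling described above.
    keep = ~(np.isnan(x) | np.isnan(y))
    x, y = x[keep], y[keep]
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = y.mean() - slope * x.mean()
    return intercept, slope
```

For example, for data lying exactly on y = 2x the sketch returns an intercept of 0 and a slope of 2, and a pair containing a NaN is simply ignored.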
Syntax: function [b,bintr,bintjm] = gmregress(x,y,alpha)
Antonio Trujillo-Ortiz (2020). gmregress (https://www.mathworks.com/matlabcentral/fileexchange/27918-gmregress), MATLAB Central File Exchange. Retrieved .
1.7.0.0  Code was improved. 

1.6.0.0  A bug was fixed. We are grateful for the valuable suggestions given by Holger Goerlitz and Joel E. Cohen. 

1.4.0.0  Text was improved. 

1.3.0.0  Text was improved. 

1.1.0.0  An appropriate format to cite this file was added. 
Hi Michael. I have to tell you that I think there is no way to fit the model with the intercept forced to zero. We must bear in mind that reduced major axis regression is meant to describe the symmetric relationship between two variables, not for predictive use of the variable x with respect to y or y with respect to x (McArdle, 2003; Smith, 2009). On the other hand, I have to tell you that I retired and withdrew from all academic activity a few years ago. Good luck in your inquiry, Antonio Trujillo-Ortiz.
McArdle, B.H. (2003). Lines, models, and errors: Regression in the field. Limnology and Oceanography, 48(3):1363-1366.
Smith, R.J. (2009). Use and Misuse of the Reduced Major Axis for Line-Fitting. American Journal of Physical Anthropology, 140(3):476-486.
Hello,
is there a way to force the function to fit a special intercept (e.g. 0)?
Hello Liang Zhang. I do not know what your knowledge is of this Type II regression model [Geometric Mean Regression (Reduced Major Axis Regression)]. I have to tell you that the calculation of the coefficient of determination in this model is different from the Type I model with which you intend to estimate it. I recommend you review the statistical procedure in the literature that is cited in the MATLAB gmregress file. On the other hand, I tell you that it has been three years since I retired from any academic activity. Good luck in your inquiry. Antonio Trujillo-Ortiz.
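One way to see why the two R^2 values disagree: the RMA line does not minimize the sum of squared vertical residuals, so 1 - SSE/TSS computed from the RMA line is always less than the squared correlation r^2 (which is what is usually reported for Model II fits, and which equals 1 - SSE/TSS only for the OLS line). A small Python/NumPy illustration of this, using the sign-corrected RMA slope sign(r)*std(y)/std(x) as an assumed stand-in for the MATLAB function:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 0.8 * x + rng.normal(scale=0.5, size=50)

r = np.corrcoef(x, y)[0, 1]
slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)  # RMA slope
intercept = y.mean() - slope * x.mean()

resid = y - (intercept + slope * x)       # vertical residuals from the RMA line
sse = np.sum(resid ** 2)
tss = np.sum((y - y.mean()) ** 2)

r_squared = r ** 2                 # correlation-based R^2, the usual report
r2_from_sse = 1 - sse / tss        # smaller: the RMA line does not minimize SSE
```

Since OLS attains the minimum SSE and achieves 1 - SSE/TSS = r^2 exactly, any other line (including the RMA line) gives a strictly smaller value whenever |r| < 1.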
Dear Antonio TrujilloOrtiz,
I want to know how to calculate R^2, because the number calculated for R^2 using the R language is different from the one I get in MATLAB. (The method for calculating R^2 is R^2 = 1 - SSE/TSS.) Can you solve the problem?
Thanks so much for writing this function - it has been very useful!
Hi Aish,
Thanks for your interest in our m-functions. In the MATLAB environment, go to the drive where you have saved this function and write 'type gmregress'. There you can find the mathematical algorithm for the CI's; this explains how it is possible to calculate them.
In this m-function two procedures were implemented to estimate the intercept and slope, as well as their confidence intervals (CI's), for a Model II regression: the Ricker and the Jolicoeur and Mosimann procedures. The first uses the t-distribution and the second the F-distribution.
Recall that to perform a hypothesis test you can use (1) a comparison of the observed statistic with its estimate, or (2) the CI's; the latter for a two-tailed test. In the case of the CI's, if the hypothesized intercept or slope value falls inside the CI, the result can be considered non-significant; otherwise (outside the CI) it is significant.
In a two-tailed hypothesis test, if the result is not significant, it can be said, without calculating it, that the p-value is greater than or equal to the alpha value (the Type I, or experimentwise, error: the probability of rejecting Ho when Ho is true).
Yours,
Prof. A. Trujillo-Ortiz
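The CI-based decision rule described above can be sketched in a few lines. The slope standard error formula SE = |b|*sqrt((1 - r^2)/(n - 2)) used below is the one commonly given for the Ricker t-based interval; it is stated here as an assumption about the method, in Python/NumPy/SciPy rather than MATLAB, and is not a transcription of gmregress:

```python
import numpy as np
from scipy import stats

def rma_slope_ci(x, y, alpha=0.05):
    """Ricker-style t-based confidence interval for the RMA slope (sketch)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    b = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    se = abs(b) * np.sqrt((1 - r ** 2) / (n - 2))   # assumed SE formula
    t = stats.t.ppf(1 - alpha / 2, n - 2)
    return b, (b - t * se, b + t * se)

def slope_significant(x, y, b0=0.0, alpha=0.05):
    """Two-tailed test via the CI: significant if b0 falls outside it."""
    _, (lo, hi) = rma_slope_ci(x, y, alpha)
    return not (lo <= b0 <= hi)
```

For a strongly linear data set, a hypothesized slope of 0 falls outside the interval (significant), while the estimated slope itself of course falls inside it (non-significant).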
How do we calculate the p value of the estimate for slope? How do we find if the slope calculated is significant or not?
Hi Ainundil,
This is a parameter estimation for a particular regression model. As such, it is important to know the confidence intervals (CI's) of the regression parameters coming from the sample estimators. So, the alpha value is just the significance level used and needed for our CI's (P = 1 - alpha).
Best,
Antonio Trujillo-Ortiz
PD. You must input at least three input arguments: x, y and the alpha value; if you only input x & y, the function automatically takes alpha as 0.05 by default.
Works great
but you do not explain what alpha is. I assume that it is for the confidence interval. However, it does not give the same confidence intervals as in Ricker (1973, table 6). That, or I do not quite know how to enter the alpha parameter.
Thanks for the function
The slope sign bug was corrected thanks to the valuable suggestions given by Holger Goerlitz and Joel E. Cohen. Yes, a negative slope is always negative!
Antonio Trujillo-Ortiz
Thank you very much, very well done and works great.
A quick comparison with the rma.m by Edward T. Peltzer (http://web.ics.purdue.edu/~braile/eas309/rma.m) gives identical results.
Except for one error, I believe: gmregress always returns a positive slope, even for data with a negative trend. After correcting the slope as follows (compare to rma.m):
si = r/abs(r); % sign of correl. coeff.
b = si*b;
negative slopes are negative.
Thanks for the function again,
Holger
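The sign correction Holger describes can be checked with a quick sketch on a clearly decreasing data set. This is a Python/NumPy illustration of the same two-line fix, not the MATLAB code itself: the slope std(y)/std(x) is always positive (the pre-fix behaviour), and multiplying by the sign of the correlation coefficient restores the negative trend:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([10.0, 8.0, 6.0, 4.0, 2.0])   # clearly decreasing trend

b = np.std(y, ddof=1) / np.std(x, ddof=1)  # always positive: pre-fix behaviour
r = np.corrcoef(x, y)[0, 1]
si = r / abs(r)                            # sign of the correlation, as in rma.m
b = si * b                                 # sign-corrected slope, here -2
```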
Well documented. Appears to work as advertised.
I would invite somebody to compare and contrast this script with the scripts by Edward T. Peltzer hosted at: http://www.mbari.org/staff/etp3/regress.htm
(which encouragingly produce identical slopes)