File Exchange

## gmregress

version 1.7 (5.44 KB)

Geometric Mean Regression (Reduced Major Axis Regression).


Model II regression should be used when the two variables in the regression equation are random and subject to error, i.e. not controlled by the researcher. Model I regression using ordinary least squares underestimates the slope of the linear relationship between the variables when they both contain error. According to Sokal and Rohlf (1995), the subject of Model II regression is one on which research and controversy are continuing and definitive recommendations are difficult to make.
GMREGRESS is a Model II procedure. It standardizes the variables before the slope is computed: each of the two variables is transformed to have a mean of zero and a standard deviation of one. The resulting slope is the geometric mean of the linear regression coefficient of Y on X and the reciprocal of the regression coefficient of X on Y. Ricker (1973) coined this term and gives an extensive review of Model II regression.
[B,BINTR,BINTJM] = GMREGRESS(X,Y,ALPHA) returns the vector B of regression coefficients in the linear Model II, along with the matrices BINTR and BINTJM of confidence intervals for B, computed by the Ricker (1973) procedure and the Jolicoeur and Mosimann (1968)/McArdle (1988) procedure, respectively.

GMREGRESS treats NaNs in X or Y as missing values, and removes them.

Syntax: function [b,bintr,bintjm] = gmregress(x,y,alpha)
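For readers who want to see the core computation outside MATLAB, here is a minimal Python sketch of geometric mean (reduced major axis) regression. It is a translation of the method described above, not the author's code: the slope is sign(r) * sd(y)/sd(x), the geometric mean of the Y-on-X OLS slope and the reciprocal of the X-on-Y OLS slope, and NaN pairs are dropped as GMREGRESS does.

```python
import numpy as np

def gm_slope_intercept(x, y):
    """Geometric mean (RMA) regression sketch: returns (intercept, slope).

    slope = sign(r) * sd(y)/sd(x); intercept passes through the means.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    keep = ~(np.isnan(x) | np.isnan(y))   # treat NaNs as missing, remove pairwise
    x, y = x[keep], y[keep]
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return intercept, slope

# Perfectly linear data recovers the exact line y = 2x
b0, b1 = gm_slope_intercept([1, 2, 3, 4], [2, 4, 6, 8])
```

Note that the sign(r) factor is what keeps a negative trend from producing a positive slope, the issue discussed in the comments below.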

### Antonio Trujillo-Ortiz

Hi Aish,

Thanks for your interest in our m-functions. In the MATLAB environment, go to the drive where you saved this function and enter 'type gmregress'. There you can find the CI mathematical algorithm, which explains how the confidence intervals are calculated.

In this m-function, two procedures were implemented to estimate the intercept and slope, as well as their confidence intervals (CIs), for a Model II regression: the Ricker procedure and the Jolicoeur and Mosimann procedure. The first uses the t-distribution and the second the F-distribution.

Recall that to perform a hypothesis test you can use either (1) a comparison of the observed statistic with its critical value, or (2) the CIs, the latter for a two-tailed test. When using CIs, if the hypothesized intercept or slope value falls within the CI, the result can be considered non-significant; if it falls outside the CI, it is significant.

In a two-tailed hypothesis test, if the result is not significant, it can be said without further calculation that the p-value is greater than or equal to the alpha-value (the Type I, or experimentwise, error rate: the probability of rejecting Ho when Ho is true).
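The CI-based decision rule described above can be sketched as follows. This is an illustration only, not the exact formulas in gmregress: the Ricker-style standard error s_b = |b| * sqrt((1 - r^2)/(n - 2)) and the externally supplied t critical value are assumptions here.

```python
import math

def slope_significant(b, r, n, t_crit):
    """CI-based significance check for an RMA slope (hypothesized slope = 0).

    Assumed standard error (Ricker-style): s_b = |b| * sqrt((1 - r^2)/(n - 2)).
    The slope is significant when 0 falls outside b +/- t_crit * s_b.
    """
    s_b = abs(b) * math.sqrt((1.0 - r ** 2) / (n - 2))
    lo, hi = b - t_crit * s_b, b + t_crit * s_b
    return not (lo <= 0.0 <= hi), (lo, hi)
```

In practice the critical value for a two-tailed test at level alpha with n - 2 degrees of freedom would come from a t-table or, e.g., scipy.stats.t.ppf(1 - alpha/2, n - 2).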

Yours,
Prof. A. Trujillo-Ortiz

### Aishwarya

How do we calculate the p value of the estimate for slope? How do we find if the slope calculated is significant or not?

### Antonio Trujillo-Ortiz

Hi Ainundil,

This is parameter estimation for a particular regression model. As such, it is important to know the confidence intervals (CIs) of the regression parameters, which come from sample estimators. The alpha-value is simply the significance level used for the CIs (confidence level P = 1 - alpha).

Best,

Antonio Trujillo-Ortiz

P.S. You can input three arguments: x, y, and the alpha-value. If you only input x and y, the file automatically defaults alpha to 0.05.

### Ainundil

Works great

but you do not explain what alpha is. I assume it is for the confidence interval. However, it does not give the same confidence intervals as in Ricker (1973, Table 6).

[The program does not explain what alpha is, and it gives different intervals from Ricker's paper (1973, Table 6). Either that, or I am not sure how to enter the alpha parameter.]

Thanks for the function

### Antonio Trujillo-Ortiz

The slope sign bug was corrected thanks to the valuable suggestions given by Holger Goerlitz and Joel E. Cohen. Yes, negative slopes are now always negative!

Antonio Trujillo-Ortiz

### Holger Goerlitz

Thank you very much, very well done and works great.

A quick comparison with the rma.m by Edward T. Peltzer (http://web.ics.purdue.edu/~braile/eas309/rma.m) gives identical results.

Except for one error, I believe: gmregress always returns a positive slope, even for data with a negative trend. After correcting the slope by (compare to rma.m):

```matlab
si = r/abs(r);  % sign of correlation coefficient
b = si*b;
```

negative slopes are negative.
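The fix above can be reproduced in Python with made-up data (a sketch only, mirroring the two rma.m lines): the unsigned sd(y)/sd(x) slope is always positive, and multiplying by r/abs(r), i.e. the sign of the correlation coefficient, restores the negative trend.

```python
import numpy as np

# Hypothetical negative-trend data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([10.0, 8.2, 5.9, 4.1, 2.0])

r = np.corrcoef(x, y)[0, 1]
b_unsigned = y.std(ddof=1) / x.std(ddof=1)  # always positive
si = r / abs(r)                             # same as np.sign(r) for r != 0
b = si * b_unsigned                         # negative slope stays negative
```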

Thanks for the function again,
Holger

### Pete

Well documented. Appears to work as advertised.

I would invite somebody to compare and contrast this script with the scripts by Edward T. Peltzer hosted at: http://www.mbari.org/staff/etp3/regress.htm

(which encouragingly produce identical slopes)