# How can I compute regression coefficients for two or more output variables?

Tim Bennett on 5 Sep 2012
Edited: the cyclist on 23 Nov 2022 at 1:35
Is it possible to have two output variables in a multiple linear regression? For example, computing the coefficients between mean-free joint angles (x1 = hip, x2 = knee, x3 = ankle) as predictor variables (X) and changes in the mean-free foot centre of mass as outcome variables in the x-axis dimension (y1) and y-axis dimension (y2)?
Would I then enter the data as y = [y1 y2] and use B = X\y to give me the coefficient estimates in a [3 x 2] matrix (i.e. 3 rows for the hip, knee and ankle joints and 2 columns for the x and y axes)?
Any help would be appreciated.
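For reference, the backslash operator already accepts a matrix of responses and solves each column independently. A minimal sketch with made-up data (all values and sizes below are hypothetical, just to show the shapes):

```matlab
% Hypothetical data: 6 observations of 3 mean-free joint angles (hip, knee, ankle)
X = [ 0.5 -1.2  0.3; -0.4  0.8 -0.1;  1.1  0.2 -0.6; ...
     -0.7 -0.3  0.9;  0.2  1.0 -0.4; -0.7 -0.5 -0.1];
% Two mean-free outcome columns: foot CoM change in x (y1) and y (y2)
Y = [ 0.9  0.4; -0.5  0.2;  0.7 -0.8; -0.2  0.6;  0.8 -0.3; -1.7 -0.1];
B = X \ Y;        % least-squares fit of both response columns in one call
size(B)           % [3 2]: one row per joint, one column per axis
```

Each column of B is the least-squares solution for the corresponding column of Y, so this is equivalent to fitting y1 and y2 in two separate regressions.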
the cyclist on 8 Sep 2012
Your Y array will stay in the same format, but the mvregress() function requires a specific cell-array format for the X input. With full disclosure that I am not experienced at using this particular function, I can only suggest you pore over the example given in
> doc mvregress
to understand the syntax. Also, I found this example on the support site:
I hope these will help.

the cyclist on 5 Sep 2012
Edited: the cyclist on 23 Nov 2022 at 1:35
Do you have the Statistics Toolbox? The mvregress() function does the type of regression you want.
Here is a detailed, commented example of the use of mvregress(). In particular, I have given examples of three different "design matrices", corresponding to different choices of which regression parameters are independent.
Choice #1 is where y1 and y2 are separate regressions, meaning that you estimate the regression parameters for y1 and y2 independently. (This choice also corresponds to the solution that Star Strider and Greg Heath offered.) Notice that I replicate that matrix solution as "beta_alternative" in the code, to show that it gives the same results.
Choices #2 and #3 are different choices, in which the intercept and/or slope parameters are shared across the regression.
tbl = [65 71 63 67; ...
72 77 70 70; ...
77 73 72 70; ...
68 78 75 72; ...
81 76 89 88; ...
73 87 76 77];
X = tbl(:,[1 2]);
Y = tbl(:,[3 4]);
[N,M] = size(Y);
% Because our target data is multidimensional, we need to first put our
% predictor data into an [N x 1] cell array, one cell per observation.
% (In this example, N=6.) Each cell contains the desired design matrix,
% with the intercepts and independent variables for that observation.
% In each cell, there is one row per dimension (M) in Y. (In this example, M=2.)
pred_cell = cell(N,1);
for i = 1:N
% Choose ONE of the three design matrices below:
% (1) For each of the N points, set up a design matrix specifying
% different intercept and different slope terms. (This is equivalent to
% doing y1 and y2 independently.) This will result in 6 betas.
pred_cell{i,1} = [ 1 0 X(i,1) 0 X(i,2) 0 ; ...
0 1 0 X(i,1) 0 X(i,2)];
% % (2) For each of the N points, set up a design matrix specifying
% % a different intercept but common slope terms. This will result
% % in 4 betas.
% pred_cell{i,1} = [eye(2), repmat(X(i,:),2,1)];
% % (3) For each of the N points, set up a design matrix specifying
% % common intercept and common slope terms. This will result in 2 betas.
% pred_cell{i,1} = [repmat([1 X(i,:)],M,1)];
end
% The result from passing the explanatory (X) and response (Y) variables into MVREGRESS using
% the cell format is a vector 'beta' of weights.
beta = mvregress(pred_cell, Y) %#ok<NASGU,NOPRT>
beta = 6×1
  -37.5012
  -21.4323
    1.1346
    0.9409
    0.3795
    0.3514
% The regression in which each row is independent (choice #1 from the design matrices above)
% can also be done with this simple matrix algebra:
beta_alternative = [ones(N,1) X]\Y %#ok<NOPRT,NASGU>
beta_alternative = 3×2
  -37.5012  -21.4323
    1.1346    0.9409
    0.3795    0.3514
This example has 2 explanatory and 2 response variables. A good "homework" exercise would be to generalize it to 3 explanatory variables.
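One possible shape of that generalization (a sketch only, with placeholder data, not a worked solution): design-matrix choice #1 with 3 predictors gains two extra slope columns per observation, giving 8 betas in total.

```matlab
% Sketch of design-matrix choice #1 with 3 explanatory variables.
% X is assumed N-by-3 here; the fit yields 8 betas (2 intercepts + 6 slopes).
X = randn(6,3);                     % placeholder predictor data
N = size(X,1);
pred_cell = cell(N,1);
for i = 1:N
    pred_cell{i,1} = [1 0 X(i,1) 0 X(i,2) 0 X(i,3) 0; ...
                      0 1 0 X(i,1) 0 X(i,2) 0 X(i,3)];
end
```

The pattern is the same as before: one row per response dimension, with each predictor's slope terms interleaved so that y1 and y2 keep independent coefficients.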
Tim Bennett on 6 Sep 2012

Greg Heath on 5 Sep 2012
If
[ N I ] = size(X)
[ N O ] = size(Y)
the linear model is
Y = [ones(N,1) X ] * B
where
B = [ones(N,1) X ] \ Y
and
[ I+1 O ] = size(B)
Hope this helps.
Greg
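A runnable sketch of this model, using synthetic placeholder data (note the coefficient solve is written with the backslash operator, since B right-multiplies the design matrix in the model):

```matlab
% Sketch: recover B from Y = [ones(N,1) X] * B with noiseless synthetic data.
rng(0);                          % reproducible placeholder data
N = 10; I = 3; O = 2;
X = randn(N, I);
Btrue = randn(I+1, O);
Y = [ones(N,1) X] * Btrue;
B = [ones(N,1) X] \ Y;           % least-squares estimate, size [I+1 O]
max(abs(B(:) - Btrue(:)))        % should be near zero for noiseless data
```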
Tim Bennett on 6 Sep 2012
Thanks for your reply. I'm having difficulty defining B and Y using your code. For example, when defining Y using:
Y = [ones(N,1) X ] * B
MATLAB reports back: ??? Undefined function or variable 'N'.
Any help would be appreciated.
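That error usually means the lines were run out of order: N is only created by the size() call, so it must execute before any expression that uses it. A minimal end-to-end sequence (with placeholder data, and the solve written with backslash) might be:

```matlab
X = randn(8, 3);                 % placeholder predictor data, N-by-I
Y = randn(8, 2);                 % placeholder response data, N-by-O
[N, I] = size(X);                % defines N (and I) before they are used
B = [ones(N,1) X] \ Y;           % estimate coefficients, size [I+1 O]
Yhat = [ones(N,1) X] * B;        % fitted values from the linear model
```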