I'm looking to obtain regression coefficients from three predictor variables (e.g. alpha1rad, alpha2rad, alpha3rad), where each variable is [101 x 20] (i.e. 101 data frames and 20 trials).
mn = mean(data,2);             % mean across the trials (columns)
dev = data - repmat(mn,1,N);   % deviations from the trial mean (N = number of trials, here 20)
For one point in time my data was a [1 x 20] (i.e. one data frame for 20 trials), where each predictor variable (x) was:
x1 = dev(1,:); x2 = dev(2,:); x3 = dev(3,:);
before defining X as:
X = [ones(length(x1),1) x1' x2' x3'];
Therefore, how can I define X using the larger data sets (i.e. [alpha1rad(i,:); alpha2rad(i,:); alpha3rad(i,:)]), and would this require a for loop such as:
data = [alpha1rad(i,:); alpha2rad(i,:); alpha3rad(i,:)];
Perhaps you can build on this. Here I set up some fake data with a known relationship to a single outcome variable. Then I loop over all rows, compute the coefficients, and assemble them into a coefficient matrix. Finally I look at the first few columns to make sure they recover the known relationship.
x1 = rand(101,20);
x2 = rand(101,20);
x3 = rand(101,20);
trial = (1:101)';
y = repmat(trial,1,20) + x1 + 2*x2 + 3*x3 + randn(101,20)/10;
b = zeros(4,101);
for j = 1:101
    X = [ones(20,1), x1(j,:)', x2(j,:)', x3(j,:)'];
    Y = y(j,:)';
    b(:,j) = X\Y;   % least-squares fit for row j
end
b(:,1:5)

ans =

    1.0319    1.8816    3.0233    4.0347    5.0205
    0.9892    1.0496    1.0443    1.0919    0.9576
    2.0266    2.0864    1.9609    1.9148    1.8656
    2.9049    3.1052    2.9136    2.9238    3.1082
You could embellish this to add more outcome variables (more columns of the Y matrix) and to subtract means at any point.
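As a sketch of how this maps onto your original variables: the loop below assumes alpha1rad, alpha2rad, alpha3rad are each [101 x 20] and that some outcome y of the same size exists (y is my placeholder name; substitute whatever your response variable is). It also folds in the mean subtraction from your dev step.

```matlab
% Sketch only -- assumes alpha1rad, alpha2rad, alpha3rad, and an
% outcome y are all [101 x 20]; y is a hypothetical variable name.
nFrames = size(alpha1rad,1);    % 101 data frames
nTrials = size(alpha1rad,2);    % 20 trials
b = zeros(4, nFrames);          % intercept + 3 slopes per frame
for i = 1:nFrames
    % stack the i-th frame of each predictor, as in your question
    data = [alpha1rad(i,:); alpha2rad(i,:); alpha3rad(i,:)];
    dev  = data - repmat(mean(data,2), 1, nTrials);  % subtract trial means
    X = [ones(nTrials,1), dev(1,:)', dev(2,:)', dev(3,:)'];
    b(:,i) = X \ y(i,:)';       % least-squares coefficients for frame i
end
```

Column i of b then holds the intercept and the three slopes for data frame i.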