# Algorithm to extract linearly dependent columns in a matrix


Is there any general or standard approach to extract the columns that are linearly dependent from a given matrix?

Thanks, and any help is appreciated!


### Accepted Answer

John D'Errico
on 3 Aug 2020

It is difficult to answer this. Consider a simple matrix:

```matlab
format short g
A = rand(4,5)

A =
      0.73796      0.36725     0.042904    0.0056783      0.45642
      0.13478      0.54974      0.60439      0.39645      0.36555
       0.4903      0.25921      0.63407      0.77345      0.95769
      0.43373      0.27658       0.6462      0.25778      0.23421
```

Since the matrix is random, it will be of full rank, thus rank 4 in this case.

EVERY column is linearly dependent on the others. That is, we can write each column as a linear combination of the other 4 columns. I can argue that problems exist with other matrices too.
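For example, a quick numerical check (a sketch of mine, not part of the original answer) confirms this for column 1:

```matlab
% Since rank(A) == 4, the 4 columns A(:,2:5) span all of R^4,
% so column 1 must be a linear combination of them.
c = A(:,2:5) \ A(:,1);       % coefficients of the combination
norm(A(:,2:5)*c - A(:,1))    % residual is essentially zero
```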

```matlab
A = magic(6)

A =
    35     1     6    26    19    24
     3    32     7    21    23    25
    31     9     2    22    27    20
     8    28    33    17    10    15
    30     5    34    12    14    16
     4    36    29    13    18    11

rank(A)

ans =
     5
```

So A here has rank 5. But again, we can write any column of A as a linear combination of the other 5. So which column is linearly dependent? They all are!

Perhaps you might get something out of the null space vector(s).

```matlab
Anull = null(A);
Anull = Anull/Anull(1)

Anull =
            1
            1
         -0.5
           -1
           -1
          0.5
```

This gives us the linear combination of importance as:

A(:,1) + A(:,2) - 0.5*A(:,3) - A(:,4) - A(:,5) + 0.5*A(:,6) = 0

We can now solve for ANY of those columns, in terms of the others.
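For instance, a small sketch (my addition, reusing the relation above) recovers column 3 from the other five:

```matlab
% From A*v = 0 we get A(:,k)*v(k) = -A(:,others)*v(others),
% so any column k with v(k) ~= 0 is recoverable from the rest.
v = Anull;                     % scaled null vector from above
k = 3;
others = setdiff(1:6, k);
ck = -A(:,others)*v(others) / v(k);
norm(ck - A(:,k))              % essentially zero
```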

How it helps you, I don't really know, because I have no idea what you really want to do.

If I had to guess, what you really need is to learn a bit more linear algebra, and perhaps what a pivoted QR decomposition can provide, because once you have that pivoted QR, you have enough to do almost anything you want to do.

```matlab
[Q,R,E] = qr(A,0)

Q =
    -0.017647      0.64381     -0.20699      0.22818      0.49018          0.5
     -0.56472     -0.11589     -0.45002      0.59421     -0.33476  -3.7638e-16
     -0.15883      0.52674       -0.446     -0.47355     -0.15542         -0.5
     -0.49413   -0.0017098      0.37629      0.14508      0.58583         -0.5
    -0.088237      0.52963      0.62872      0.19923     -0.52605  -4.2484e-16
     -0.63531     -0.11878      0.13728     -0.55665    -0.059779          0.5

R =
      -56.666      -16.377      -42.107       -33.53       -33.53      -35.224
            0       53.915       18.611       30.231       30.676       29.048
            0            0       32.491      -7.9245      -8.9182      -11.289
            0            0            0       10.101       5.6138     -0.56312
            0            0            0            0       5.1649      -5.1649
            0            0            0            0            0  -2.0136e-15

E =
     2     1     3     6     4     5
```

We can use the QR to tell us much about the problem, as well as efficiently and stably provide all you need.

First, notice that R(6,6) is essentially zero compared to the other diagonal elements of R.

As well, the QR tells us that it decided column 5 (that is, the last element of E) is the one it wants to drop out.
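Putting that together, a short sketch (mine; the 1e-10 relative tolerance is an assumption, not part of the answer) splits the columns using the pivoted QR above:

```matlab
% Numerical rank = number of diagonal entries of R above a tolerance.
tol = 1e-10 * abs(R(1,1));       % R(1,1) has the largest magnitude
r = nnz(abs(diag(R)) > tol)      % 5 here
indep = E(1:r)                   % columns treated as independent: 2 1 3 6 4
dep   = E(r+1:end)               % column dropped: 5
```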

##### 17 Comments

Nguyen Le
on 8 Jun 2023

Matt J
on 8 Jun 2023

Edited: Matt J
on 8 Jun 2023

@Nguyen Le I suspect that method would be a lot slower because it could require O(N) svd operations to compute the rank. Additionally, it would be difficult to decide what numerical tolerance parameter to choose for the submatrix rank computations. Consider for example,

```matlab
A = [1e-20 1;
     1e-20 1];
B = [1e-20 2e-20;
     1e-20 2e-20];
```

Numerically, the first column of A should be considered to have rank 0, whereas the same column in matrix B should be considered to have rank 1. But there's no way to know that without somehow choosing rank()'s tolerance parameter adaptively based on the entire matrix. With the selections below, we get the opposite results from what we would want:

```matlab
rank(A(:,1))          % returns 1, but 0 is wanted
rank(B(:,1),1e-10)    % returns 0, but 1 is wanted
```
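One hedged way around this (my sketch) is to derive each tolerance from the whole matrix, mimicking rank()'s documented default tol = max(size(A))*eps(norm(A)):

```matlab
% Judge each column against a tolerance set by the WHOLE matrix's scale.
rank(A(:,1), max(size(A))*eps(norm(A)))   % 0: negligible at A's scale
rank(B(:,1), max(size(B))*eps(norm(B)))   % 1: significant at B's scale
```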

### More Answers (2)

Bruno Luong
on 3 Aug 2020

Edited: Bruno Luong
on 3 Aug 2020

Test matrix (10 x 6) with rank 4

```matlab
M = rand(10,4)*rand(4,6)
```

Automatic selection of independent columns of M

```matlab
[Q,R,p] = qr(M,'vector');
dr = abs(diag(R));
if dr(1)
    tol = 1e-10;
    r = find(dr >= tol*dr(1), 1, 'last');
    ci = p(1:r)   % here are the indices of the independent columns
else
    r = 0;
    ci = [];
end

% Submatrix with r columns (and full column rank)
Mind = M(:,ci)

% These three rank estimates should be equal; if they are not, the cause is
% that MATLAB's choice of tolerance for rank() differs from the one above,
% and that rank() uses the more robust SVD algorithm for its estimate.
rank(Mind)
rank(M)
r
```
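A natural follow-up (a sketch of mine, reusing R, p, r, ci, and M from the block above) expresses each dependent column in terms of the selected independent ones via the triangular factor:

```matlab
% Since M(:,p) = Q*R and rows r+1:end of R are negligible,
% M(:,di) is (numerically) M(:,ci)*X with X = R(1:r,1:r) \ R(1:r,r+1:end).
di = p(r+1:end);                   % indices of the dependent columns
X  = R(1:r,1:r) \ R(1:r,r+1:end);  % combination coefficients
norm(M(:,di) - M(:,ci)*X)          % essentially zero
```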

##### 14 Comments

Henry Wolkowicz
on 30 Oct 2021

Bruno Luong
on 31 Oct 2021

When you apply qr with permutation to a sparse matrix S,

```matlab
[Q,R,p] = qr(S,'vector')
```

MATLAB returns a permutation chosen so that R has a "good" sparsity pattern; it does not make the diagonal abs(diag(R)) decreasing (that requirement usually makes R fully filled with nonzero elements).
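To illustrate the difference (my sketch; converting to full assumes the matrix fits in memory as a dense array):

```matlab
% Sparse qr permutes columns for sparsity, not to reveal rank; the full
% factorization restores the rank-revealing column pivoting.
S = sprand(10, 6, 0.3);
[~,Rs,ps] = qr(S, 'vector');        % fill-reducing permutation
[~,Rf,pf] = qr(full(S), 'vector');  % column pivoting
issorted(abs(diag(Rs)), 'descend')  % often false for the sparse version
issorted(abs(diag(Rf)), 'descend')  % true: |diag(R)| is non-increasing
```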
