
Thread Subject:
Inverse of square block diagonal matrix

Subject: Inverse of square block diagonal matrix

From: Lam

Date: 1 May, 2013 08:59:10

Message: 1 of 8

I am trying to speed up the calculation of the inverse of a square block-diagonal matrix.

Currently I am inverting the blocks one by one. Here is the code:

function [ Ainv ] = blkinv( A, m )
% Invert each m-by-m diagonal block of A separately.
n = size(A,1);               % note: rows() is Octave, not MATLAB
C = cell(n/m,1);
k = 1;
for i = 1:m:n
    C{k} = inv(full(A(i:i+m-1,i:i+m-1)));
    k = k + 1;
end
Ainv = blkdiag(sparse(0,0),C{:});   % sparse seed keeps the result sparse
end

I wonder how I can compute the inverse in parallel.
Is there a multithreaded function that can help?
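One way to parallelize the loop above, assuming the Parallel Computing Toolbox is available, is to invert the blocks inside a parfor loop. This is a sketch of that idea, not a tuned implementation:

```matlab
function Ainv = blkinv_par(A, m)
% Sketch: invert each m-by-m diagonal block on a separate worker.
% Assumes the Parallel Computing Toolbox (parfor) is available.
n = size(A, 1);
nblk = n / m;
C = cell(nblk, 1);
parfor k = 1:nblk
    i = (k-1)*m + 1;
    C{k} = inv(full(A(i:i+m-1, i:i+m-1)));
end
Ainv = blkdiag(sparse(0,0), C{:});   % sparse seed keeps the result sparse
end
```

Whether this pays off depends on the block size; for small blocks the communication overhead of the pool can outweigh the parallel speedup.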

Subject: Inverse of square block diagonal matrix

From: Lam

Date: 1 May, 2013 09:59:08

Message: 2 of 8

I just made another attempt:

function [ R ] = blkinv2( A, m )
% Vectorized variant: extract all blocks at once, invert each with cellfun.
% Assumes every entry inside each diagonal block is nonzero.
nblk = size(A,1)/m;          % note: rows() is Octave, not MATLAB
B = mat2cell(full(reshape(nonzeros(A), m, [])), m, m*ones(1,nblk))';
D = cellfun(@(X) sparse(inv(X)), B, 'UniformOutput', false);
R = blkdiag(D{:});
end

This is much faster, but still not parallel...

"Lam" wrote in message <klqlgu$n97$1@newscl01ah.mathworks.com>...
> *snip*

Subject: Inverse of square block diagonal matrix

From: Bruno Luong

Date: 1 May, 2013 11:07:09

Message: 3 of 8

http://www.mathworks.com/matlabcentral/fileexchange/25977-mtimesx-fast-matrix-multiply-with-multi-dimensional-support

http://www.mathworks.com/matlabcentral/fileexchange/37515-mmx-multithreaded-matrix-operations-on-n-d-matrices

So far I have only used mtimesx, and I can strongly recommend it.
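Both of those tools operate on the pages of an N-D array, so the diagonal blocks first have to be stacked along the third dimension. A minimal sketch of that packing step (assuming all blocks are m-by-m and fully populated with nonzeros):

```matlab
% Stack the m-by-m diagonal blocks of sparse A into an m-by-m-by-nblk
% array, the page layout that page-wise tools such as mtimesx/mmx expect.
m = 3;                                  % example block size
nblk = 4;                               % example block count
A = sparse(blkdiag(rand(m), rand(m), rand(m), rand(m)));
P = reshape(full(reshape(nonzeros(A), m, [])), m, m, nblk);
% P(:,:,k) is now the k-th diagonal block; each page can be processed
% independently by a multithreaded page-wise routine.
```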

Bruno

Subject: Inverse of square block diagonal matrix

From: Lam

Date: 1 May, 2013 12:21:07

Message: 4 of 8

Thank you for the information. It seems it will take me some time to study these.

By the way, do you know of any other free toolboxes for parallel computation?
It would be wonderful if there were a parallel version of "cellfun".
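With the Parallel Computing Toolbox, a parfor loop over the cell array acts as a rough stand-in for a parallel cellfun. A sketch:

```matlab
% Rough parallel substitute for cellfun: apply a function to each cell
% on the workers of a parallel pool (Parallel Computing Toolbox).
B = {rand(3), rand(3), rand(3)};        % example cell array of blocks
D = cell(size(B));
parfor k = 1:numel(B)
    D{k} = sparse(inv(B{k}));
end
```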

"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <klqt0t$ckj$1@newscl01ah.mathworks.com>...
> *snip*

Subject: Inverse of square block diagonal matrix

From: Steven_Lord

Date: 1 May, 2013 13:41:24

Message: 5 of 8



"Lam " <lam.dota@gmail.com> wrote in message
news:klqlgu$n97$1@newscl01ah.mathworks.com...
> I am trying to speed up the calculation of inverse of square block
> diagonal matrix.

Why do you need to compute the inverse? If you're doing it to solve a system
of equations, stop and use backslash instead.
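To illustrate the difference, the two approaches look like this; backslash solves by factorization and is generally faster and more accurate than forming the inverse:

```matlab
A = rand(1000);  b = rand(1000, 1);
x1 = inv(A) * b;     % forms the explicit inverse: slower, less accurate
x2 = A \ b;          % solves by factorization without forming inv(A)
norm(A*x2 - b)       % residual of the backslash solution
```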

*snip*

--
Steve Lord
slord@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Subject: Inverse of square block diagonal matrix

From: Lam

Date: 1 May, 2013 13:54:09

Message: 6 of 8

The running time of a full matrix multiplication is of order O(n^3).
If the matrix is sparse, it is much less;
I would guess around O(n^2) in my case.

I am not sure how fast MATLAB's backslash is for my case.
For full-matrix Gaussian elimination, the running time is of order O(n^3).

If I calculate the inverse first, the running time can be reduced in the long run.

"Steven_Lord" <slord@mathworks.com> wrote in message <klr623$af9$1@newscl01ah.mathworks.com>...
> *snip*

Subject: Inverse of square block diagonal matrix

From: Steven_Lord

Date: 1 May, 2013 14:11:34

Message: 7 of 8



"Lam " <lam.dota@gmail.com> wrote in message
news:klr6q1$d19$1@newscl01ah.mathworks.com...
> Running time of full matrix multiplication would be order O(n^3)
> If it is sparse, it would be much less.
> I guess it would be around O(n^2) for my case.
>
> I am not sure about how fast matlab do the backslash for my case.
> For full matrix gaussian elimination, the running time would be order
> O(n^3)
>
> If I calculate the inverse first, then the running time can be reduced in
> the long run.

http://www.mathworks.com/help/matlab/ref/inv.html

"In practice, it is seldom necessary to form the explicit inverse of a
matrix. A frequent misuse of inv arises when solving the system of linear
equations Ax = b. One way to solve this is with x = inv(A)*b. A better way,
from both an execution time and numerical accuracy standpoint, is to use the
matrix division operator x = A\b. This produces the solution using Gaussian
elimination, without forming the inverse. See mldivide (\) for further
information."

If you're solving many systems (which I'm guessing is the case from your
statement about "the long run") then there are other alternatives that I
would explore before going to INV.

1) Concatenate all your right-hand side vectors together into a matrix and
solve using \ with your coefficient matrix and the matrix of right-hand
sides all at once.
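That first option is a one-liner: backslash accepts a matrix of right-hand sides and factors the coefficient matrix only once. A sketch:

```matlab
% Solve A*x = b for many right-hand sides in one call.
A = rand(500);
B = rand(500, 200);        % 200 right-hand sides as columns
X = A \ B;                 % A is factored once, all columns solved
% X(:, j) is the solution for right-hand side B(:, j)
```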
2) Use one of the iterative solvers listed in the last section of this page.
This has the benefit that you may not even need to explicitly construct the
coefficient matrix, if you can write a function to compute A*x without it.

http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
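For a symmetric positive definite system, for instance, PCG accepts a function handle in place of the matrix. A sketch (the matrix here is built only to define the example operator; any routine that computes A*x would do):

```matlab
% Iterative solve without passing an explicit coefficient matrix.
n = 200;
A = gallery('tridiag', n);        % SPD tridiagonal test matrix
b = ones(n, 1);
afun = @(x) A * x;                % function handle computing A*x
x = pcg(afun, b, 1e-8, 500);      % tolerance 1e-8, at most 500 iterations
```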

3) Factor the matrix first (CHOL, LU, QR, ILU, etc.) and solve the two
triangular systems either with \ or LINSOLVE (telling LINSOLVE the matrices
are triangular so it goes right to the triangular solver.)

http://www.mathworks.com/help/matlab/matrix-decomposition.html
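The third option looks like this for an LU factorization; the opts structs tell LINSOLVE the factors are triangular, so it dispatches straight to the triangular solver:

```matlab
% Factor once, then solve with two triangular solves.
A = rand(500);  b = rand(500, 1);
[L, U, p] = lu(A, 'vector');      % row-pivoted LU factorization
opts_l = struct('LT', true);      % tell LINSOLVE: lower triangular
opts_u = struct('UT', true);      % tell LINSOLVE: upper triangular
y = linsolve(L, b(p), opts_l);
x = linsolve(U, y, opts_u);
```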

In general, you SHOULD NOT INVERT your coefficient matrix unless you know
that you specifically need the inverse and you know that your matrix is
well-conditioned.

--
Steve Lord
slord@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Subject: Inverse of square block diagonal matrix

From: Lam

Date: 1 May, 2013 15:45:10

Message: 8 of 8

Thank you for your comprehensive information.

However, it seems that
1) the iterative methods don't work for my situation, and
2) after testing, factoring the matrix doesn't help:
performing U\(L\y) takes almost the same time as A\y (strange!).

Actually, I am writing a time-stepping PDE solver,
so my solution looks like (A^-N)(y),
and N is very large.

Fortunately the condition number isn't very large.
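In that setting, the usual pattern is to factor A once before the loop and reuse the factors at every step rather than inverting. A sketch with an illustrative matrix (in practice A would come from the PDE discretization):

```matlab
% Time-stepping with a fixed coefficient matrix: factor once, reuse N times.
n = 400;  N = 1000;
A = gallery('tridiag', n);        % illustrative sparse matrix
y = ones(n, 1);
[L, U, P] = lu(A);                % one-time sparse LU factorization
for step = 1:N
    y = U \ (L \ (P * y));        % two triangular solves per step
end
```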

Lam


"Steven_Lord" <slord@mathworks.com> wrote in message <klr7qm$gon$1@newscl01ah.mathworks.com>...
> *snip*
