From: "Steven_Lord" <slord@mathworks.com>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Inverse of square block diagonal matrix
Date: Wed, 1 May 2013 10:11:34 -0400
Organization: MathWorks
Message-ID: <klr7qm$gon$1@newscl01ah.mathworks.com>
References: <klqlgu$n97$1@newscl01ah.mathworks.com> <klr623$af9$1@newscl01ah.mathworks.com> <klr6q1$d19$1@newscl01ah.mathworks.com>
In-Reply-To: <klr6q1$d19$1@newscl01ah.mathworks.com>

"Lam " <lam.dota@gmail.com> wrote in message 
news:klr6q1$d19$1@newscl01ah.mathworks.com...
> The running time of full matrix multiplication would be on the order of
> O(n^3). If it is sparse, it would be much less. I guess it would be
> around O(n^2) for my case.
>
> I am not sure how fast MATLAB does the backslash for my case. For
> full-matrix Gaussian elimination, the running time would be on the order
> of O(n^3).
>
> If I calculate the inverse first, then the running time can be reduced
> in the long run.

http://www.mathworks.com/help/matlab/ref/inv.html

"In practice, it is seldom necessary to form the explicit inverse of a 
matrix. A frequent misuse of inv arises when solving the system of linear 
equations Ax = b. One way to solve this is with x = inv(A)*b. A better way, 
from both an execution time and numerical accuracy standpoint, is to use the 
matrix division operator x = A\b. This produces the solution using Gaussian 
elimination, without forming the inverse. See mldivide (\) for further 
information."
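
As a quick, made-up illustration of that paragraph (the matrix, size, and 
right-hand side below are invented purely for demonstration):

n = 500;
A = randn(n) + n*eye(n);   % hypothetical, reasonably well-conditioned matrix
b = randn(n, 1);

x1 = inv(A)*b;   % forms the explicit inverse first: extra work, extra rounding
x2 = A\b;        % Gaussian elimination, no inverse formed

norm(A*x1 - b)   % compare the residuals; A\b is typically at least as accurate
norm(A*x2 - b)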

If you're solving many systems (which I'm guessing is the case from your 
statement about "the long run") then there are other alternatives that I 
would explore before going to INV.

1) Concatenate all your right-hand side vectors together into a matrix and 
solve using \ with your coefficient matrix and the matrix of right-hand 
sides all at once (sketched after the link below).
2) Use one of the iterative solvers listed in the last section of this page. 
This has the benefit that you may not even need to explicitly construct the 
coefficient matrix, if you can write a function to compute A*x without it 
(also sketched after the link below).

http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
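
Rough sketches of (1) and (2). The names B, b1..b3, and applyA, plus the 
tolerance and iteration count, are all made up for illustration:

% (1) solve against every right-hand side in one call:
B = [b1, b2, b3];   % hypothetical right-hand side vectors, one per column
X = A\B;            % column j of X solves A*x = B(:,j)

% (2) an iterative solver driven by a function handle, so A itself never
% has to be formed; applyA is a hypothetical function that returns A*x:
afun = @(x) applyA(x);
x = gmres(afun, b, [], 1e-8, 100);   % no restart, tol 1e-8, at most 100 iterations

(GMRES is just one choice; for a symmetric positive definite matrix, PCG 
would be the more natural pick.)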

3) Factor the matrix first (CHOL, LU, QR, ILU, etc.) and solve the two 
triangular systems either with \ or LINSOLVE (telling LINSOLVE the matrices 
are triangular so it goes right to the triangular solver). A sketch follows 
the link below.

http://www.mathworks.com/help/matlab/matrix-decomposition.html
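
A sketch of (3) for a general square matrix, reusing one LU factorization 
for many right-hand sides (numRhs and the matrix B of right-hand sides are 
hypothetical):

[L, U, P] = lu(A);                    % pay the factorization cost once
X = zeros(size(A,1), numRhs);
for k = 1:numRhs
    X(:,k) = U \ (L \ (P*B(:,k)));    % two triangular solves per right-hand side
end

With full (non-sparse) factors you can instead call LINSOLVE with an options 
structure (opts.LT = true for L, opts.UT = true for U) so it goes straight 
to its triangular solvers rather than analyzing the matrices first.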

In general, you SHOULD NOT INVERT your coefficient matrix unless you know 
that you specifically need the inverse and you know that your matrix is 
well-conditioned.
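
If you're not sure about the conditioning, a cheap sanity check along these 
lines (the threshold is arbitrary, just for illustration) is worth doing 
before reaching for INV:

if condest(A) > 1e10
    warning('A appears badly conditioned; do not trust inv(A).');
end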

-- 
Steve Lord
slord@mathworks.com
To contact Technical Support use the Contact Us link on 
http://www.mathworks.com