Asked by Shuster
on 30 Mar 2013

I have a matrix that is 100M x 100M, where M stands for million. I need to read it into MATLAB to compute its eigenvalues. I have an i7 Mac with 8 GB of RAM. Is that sufficient, or should I get a solid-state drive with ~128 GB or so?

Also, all I really need is the product of the eigenvalues of my 100M x 100M matrix. If there is a shortcut to compute the product directly, without having to compute each and every eigenvalue, that would be preferable.

If computing eigenvalues for a matrix of this size isn't feasible in MATLAB, please suggest a different package that can do it, at least on a supercomputing (HPC) facility.

Can SVD be used instead of an eigenvalue computation to get the product of eigenvalues? Is there an SVD package that could speed up my computation?

Thanks in advance!


Answer by James Tursa
on 30 Mar 2013

Edited by James Tursa
on 30 Mar 2013

A full 100M x 100M matrix of doubles is over 71 petabytes! No, I don't think you have enough memory (or time) to do this. Is your matrix sparse? If not, you need to reformulate your problem.
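The arithmetic is easy to verify; here is a quick back-of-the-envelope check in Python, assuming 8-byte double-precision entries (which is what MATLAB uses for a full matrix):

```python
# Storage needed for a full 100M x 100M matrix of 8-byte doubles
n = 100_000_000              # 100 million rows and columns
total_bytes = n * n * 8      # one double per entry
pib = total_bytes / 2**50    # convert bytes to pebibytes
print(f"{pib:.1f} PiB")      # about 71 PiB
```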

The product of the eigenvalues of a matrix is equal to the determinant of the matrix. Assuming you can even calculate it, what do you plan to do with this (probably very inaccurate) determinant anyway?
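To illustrate the identity at a size where it is actually computable, here is a small NumPy sketch (the 5x5 random matrix is just a stand-in). Note also that `numpy.linalg.slogdet` returns the log of the determinant, which sidesteps the overflow or underflow that a literal product of 100 million eigenvalues would almost certainly hit:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))              # small stand-in matrix

eig_product = np.prod(np.linalg.eigvals(A))  # product of eigenvalues
det = np.linalg.det(A)                       # determinant

# For a real matrix the eigenvalue product is real up to round-off
print(np.isclose(eig_product.real, det))     # True

# For large matrices, work with the log-determinant to avoid overflow
sign, logdet = np.linalg.slogdet(A)
print(np.isclose(sign * np.exp(logdet), det))  # True
```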


Walter Roberson
on 30 Mar 2013

I wonder if the following would be of any assistance?

Shuster
on 1 Apr 2013


## 2 Comments

## Cedric Wannaz


If your matrix is not sparse, you will need 625,000 solid-state drives of 128 GB each just to store it. That might be a little challenging, without even talking about finding the eigenvalues. I think that you should first rethink the general approach. Why do you need such a large matrix? What would its structure be (if it were possible to store it)?

## Shuster


The information needed to populate the 100M x 100M matrix is currently stored on disk in a ~27 GB binary file, exploiting all the redundancy. I can't say much about the structure of the matrix, but as you can see from the storage space used on disk, it has a huge number of zeros. Some entries are repeated versions of a particular entry; such redundant entries were also not stored. But to compute the eigenvalues or the determinant, I thought I might have to construct the full matrix first to give to MATLAB.
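If most entries really are zero, one possible route (sketched here with SciPy's sparse tools, since the question also asks about non-MATLAB packages; the triplets below are hypothetical stand-ins for whatever the 27 GB file encodes) is to build the matrix in sparse triplet form and get the log-determinant from a sparse LU factorization, without ever forming the dense matrix:

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.linalg import splu

# Hypothetical (row, col, value) triplets decoded from the binary file
rows = np.array([0, 1, 2, 3, 0])
cols = np.array([0, 1, 2, 3, 3])
vals = np.array([4.0, 3.0, 2.0, 1.0, 0.5])

A = coo_matrix((vals, (rows, cols)), shape=(4, 4)).tocsc()

# Sparse LU factorization: L has a unit diagonal and the row/column
# permutations have determinant +/-1, so log|det(A)| is just the sum
# of log|diag(U)|
lu = splu(A)
log_abs_det = np.log(np.abs(lu.U.diagonal())).sum()
print(log_abs_det)   # log(24) here, since this example A is triangular
```

Whether a sparse factorization of a 100M x 100M matrix is feasible depends heavily on the sparsity pattern (fill-in during factorization can be enormous), so this is only worth pursuing after looking at the matrix structure.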