
Subject: row reducing very large, dense matrices

From: Blake Tye

Date: 14 Oct, 2010 01:53:04

Message: 1 of 5

Hello all,
I am an undergraduate research assistant working with a professor of mine, who is a nuclear engineer. The problem he is currently working on requires row reducing very large matrices, approximately 17,000 x 17,000. Unfortunately, these matrices are very dense. So, my question to you guys is: is there a way to speed up the process of row reducing such large, dense matrices? I know the resources at his lab could handle it, but he would like the applet I ultimately develop to be usable by other scientists and researchers, so we have to assume limited computing power.

Also, the data to be processed is stored on a database, so I will be trying to query the database for the information rather than inputting it myself.

Is there a way in MATLAB to row reduce in panels? That is, breaking the matrix up into 1000 x 1000 panels, row reducing one panel, and continuing the process until the whole dataset is row reduced? Would that even be an improvement in speed or memory requirements over working on the entire matrix?

Thank you in advance for your help!

Blake

Subject: row reducing very large, dense matrices

From: Oleg Komarov

Date: 14 Oct, 2010 02:03:04

Message: 2 of 5

"Blake Tye" <btye@email.arizona.edu> wrote in message <i95nq0$1bg$1@fred.mathworks.com>...

You should be more precise about what you want to accomplish.

Which database are you using? I have experience with SQL Server.

Oleg

Subject: row reducing very large, dense matrices

From: Blake Tye

Date: 14 Oct, 2010 02:27:03

Message: 3 of 5

Oleg,
My current goal is just to row reduce these large sets of data. To be more precise, my professor works at a national lab observing the behavior of neutrons when they are shot at various materials. These experiments generate huge amounts of data, which are hosted on a government database he has access to; that is where I will be extracting the data to fill the matrix, which will then be row reduced for his purposes. I am not a nuclear engineer, but rather a mathematics and chemical engineering major with programming skills whom he has employed to automate this process, since entering the information into a 17,000 x 17,000 matrix by hand for every experiment is unreasonable.

Any help you can offer is much appreciated!

Blake

"Oleg Komarov" <oleg.komarovRemove.this@hotmail.it> wrote in message <i95oco$8lc$1@fred.mathworks.com>...
> "Blake Tye" <btye@email.arizona.edu> wrote in message <i95nq0$1bg$1@fred.mathworks.com>...
> > Hello all,
> > I am an undergraduate research assistant and am working with a professor of mine, who is a nuclear engineer. The problem he is currently working on requires row reducing very large matrices, approximately 17,000 x 17,000. Unfortunately, these matrices are very dense. So, my question to you guys is: is there a way to speed up the process of row-reducing such large, dense matrices? I know that his resources at his lab could handle it, but he would like to make it so that the applet I ultimately develop can be used by other scientists/researchers, and thus we have to assume limited computing power.
> >
> > Also, the data to be processed is stored on a database, so I will be trying to query the database for the information rather than inputting it myself.
> >
> > Is there a way to program in MATLAB row reduce in panels? i.e. breaking up the matrix into 1000 x 1000 matrices, row reducing that panel, and then continuing the process until the whole dataset was row-reduced? Would that even be a speed improvement/computer memory requirement over working on the entire matrix?
> >
> > Thank you in advance for your help!
> >
> > Blake
>
> You should be more precise about what you wanna acomplish.
>
> Which database are you using. I have experience with sql server.
>
> Oleg

Subject: row reducing very large, dense matrices

From: Oleg Komarov

Date: 14 Oct, 2010 11:08:03

Message: 4 of 5

"Blake Tye" <btye@email.arizona.edu> wrote in message <i95ppn$8ep$1@fred.mathworks.com>...

Have you had a look at the chol, lu, and qr factorizations?
I can't be of more help here.
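[Editor's note: as a sketch of what Oleg is suggesting — if the row reduction is ultimately in service of solving a linear system A*x = b, a factorization (or MATLAB's backslash, which selects one automatically) is far faster and more numerically stable than rref. Everything below is illustrative; the data is random and n is kept small so the example runs quickly.]

```matlab
% Illustrative only: solve A*x = b via LU instead of rref([A b]).
n = 1000;                     % small stand-in for the 17,000 case
A = rand(n);  b = rand(n,1);  % placeholder data
[L,U,P] = lu(A);              % P*A = L*U, dense LU with partial pivoting
x = U \ (L \ (P*b));          % two cheap triangular solves
% Equivalently: x = A \ b;    % backslash chooses the factorization itself
```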
Oleg

Subject: row reducing very large, dense matrices

From: Sean

Date: 14 Oct, 2010 14:32:03

Message: 5 of 5


> My current goal is just to row reduce these large sets of data. [...]

The actual use, though interesting, isn't what Oleg meant (I presume). How do you plan to "row reduce" it? Do you just want to remove every other row? Do you want to average blocks? That's the part we need to understand.

How is this data stored?
If it's stored in a binary file and you know the class (e.g., double, float32, uint8), use fopen -> fseek -> fread to read in portions of it at a time, do your computations, and save/append the results to a new file. This is easy to accomplish with a simple loop: you could process a couple of rows at a time and never exceed your RAM.
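[Editor's note: a minimal sketch of the loop described above. The file name 'data.bin', the row-by-row double layout, the exact-multiple file size, and the chunk size are all assumptions for illustration, not details from the thread.]

```matlab
% Sketch: stream a large matrix through RAM a few rows at a time,
% assuming 'data.bin' stores the matrix row by row as doubles.
n   = 17000;                 % columns per stored row
fid  = fopen('data.bin', 'r');
fout = fopen('result.bin', 'a');
rowsPerChunk = 100;          % tune to available RAM
while ~feof(fid)
    % fread fills column-major, so each column read is one stored row;
    % the transpose puts rows back as rows.
    chunk = fread(fid, [n, rowsPerChunk], 'double')';
    if isempty(chunk), break; end
    % ... do your computations on chunk here ...
    fwrite(fout, chunk', 'double');   % append processed rows
end
fclose(fid);  fclose(fout);
```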

If you have a 64-bit processor with significantly more than 2.312e+09 bytes of RAM, you could handle the whole matrix at once. I personally do computations on matrices this size all the time and it's not an issue.
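[Editor's note: for reference, that byte figure is just the storage of one dense double-precision matrix of this order:]

```matlab
% Memory for one dense 17,000 x 17,000 matrix of 8-byte doubles:
n = 17000;
bytes = n^2 * 8      % 2.312e+09 bytes, roughly 2.2 GB
```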
