
Thread Subject:
Matlab and large signals - I think I want my money back!

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 19 Feb, 2010 14:32:04

Message: 1 of 25

   Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
  If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
  Thank you Mathworks for disappointing me!
 

Subject: Matlab and large signals - I think I want my money back!

From: Wayne King

Date: 19 Feb, 2010 14:46:05

Message: 2 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> Thank you Mathworks for disappointing me!

Hi Jean, can you be more specific? I don't have any trouble taking the Fourier transform of a signal with more than 2.5 million samples, or computing its autocorrelation, on a 32-bit Windows system with 2 GB of RAM. Can you give a specific example (you can create a vector of N(0,1) random variables) of where you are having trouble with a 1D signal of 2.5 million samples?
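For example, a self-contained test along those lines, with random N(0,1) data standing in for the real signal, would be:

% 2.5 million samples of N(0,1) noise as a stand-in for the real signal
x = randn(2500000, 1);

X = fft(x);       % Fourier transform of the full record
c = xcorr(x);     % autocorrelation (the result has 2*N-1 samples)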

Wayne

Subject: Matlab and large signals - I think I want my money back!

From: Steven Lord

Date: 19 Feb, 2010 15:05:41

Message: 3 of 25


"Wayne King" <wmkingty@gmail.com> wrote in message
news:hlm87d$i5f$1@fred.mathworks.com...
> "Jean " <domnul_jan@yahoo.com> wrote in message
> <hlm7d4$oru$1@fred.mathworks.com>...
>> Although Matlab is an excellent tool in signal processing, it has
>> totally disappointed me these days when I tried even the SIMPLEST
>> function on large signals (let's say 2.5 million samples or higher).
>> Not only must you say goodbye to ffts, xcorrs, etc., but even simple
>> PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in
>> spite of my 4 GB of RAM).
>> If you want to "play" research, then Matlab is the perfect tool for
>> you. If you want to do some advanced signal processing on real
>> signals, then orient yourself to another tool.
>> Thank you Mathworks for disappointing me!
>
> Hi Jean, can you be more specific? I don't have any trouble taking the
> Fourier transform of a signal with more samples than 2.5 million, or
> computing the autocorrelation on a Windows 32-bit system with 2 gigs of
> RAM. Can you give a specific example (you can create a vector of N(0,1)
> random variables) of where you are having trouble with a 1D signal with
> 2.5 million samples?

In addition to what Wayne said, Jean, if you're working with large data you
should be using a 64-bit version of MATLAB on a machine with a 64-bit OS,
for the reasons described in section 1 of this document from our support
website:

http://www.mathworks.com/support/tech-notes/1100/1106.html

Basically, if you're on a 32-bit system, many OSes don't allow MATLAB access
to all of your 4 GB of RAM, and MATLAB has to share what the OS does allow
access to with any other applications you're running. That can easily chew
up or fragment your memory.
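For reference, the MEMORY command (Windows only, available since R2008a, so it should work in your 7.7 release) reports how much address space MATLAB actually has to work with:

% Display MATLAB's memory situation on Windows
memory

% Or capture the numbers programmatically:
[userview, systemview] = memory;
userview.MaxPossibleArrayBytes    % largest contiguous block available right now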

--
Steve Lord
slord@mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 19 Feb, 2010 15:10:22

Message: 4 of 25

  Hello Wayne,

  I have some *.csv or *.mat files which contain data. Importing these files takes time, but what bothers me is that when I load even two large signals and plot them, Matlab gives the "out of memory" error if I try to do something fancy (like an xcorr, for example). If I work with 10-million-sample signals, the entire analysis is compromised from the beginning.
  What version of Windows do you have? I have Windows Vista (it is a new computer). What version of Matlab do you have? I have version 7.7.0.471.

  Thank you for your interest.

"Wayne King" <wmkingty@gmail.com> wrote in message <hlm87d$i5f$1@fred.mathworks.com>...
> "Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > Thank you Mathworks for disappointing me!
>
> Hi Jean, can you be more specific? I don't have any trouble taking the Fourier transform of a signal with more samples than 2.5 million, or computing the autocorrelation on a Windows 32-bit system with 2 gigs of RAM. Can you give a specific example (you can create a vector of N(0,1) random variables) of where you are having trouble with a 1D signal with 2.5 million samples?
>
> Wayne

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 19 Feb, 2010 15:58:05

Message: 5 of 25

  I agree with what you said; I read the documentation prior to posting this message.
  The thing that bothers me the most is not being able to plot two 10-million-sample signals in the same figure (it gives me the "out of memory" error). I do not know whether the problem lies with my computer or is something the Matlab designers have not foreseen.

  Thank you very much for your interest in this problem.


"Steven Lord" <slord@mathworks.com> wrote in message <hlm9c0$2m8$1@fred.mathworks.com>...
>
> "Wayne King" <wmkingty@gmail.com> wrote in message
> news:hlm87d$i5f$1@fred.mathworks.com...
> > "Jean " <domnul_jan@yahoo.com> wrote in message
> > <hlm7d4$oru$1@fred.mathworks.com>...
> >> Although Matlab is an excellent tool in signal processing, it has
> >> totally disappointed me these days when I tried even the SIMPLEST
> >> function on large signals (let's say 2.5 million samples or higher).
> >> Not only must you say goodbye to ffts, xcorrs, etc., but even simple
> >> PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in
> >> spite of my 4 GB of RAM).
> >> If you want to "play" research, then Matlab is the perfect tool for
> >> you. If you want to do some advanced signal processing on real
> >> signals, then orient yourself to another tool.
> >> Thank you Mathworks for disappointing me!
> >
> > Hi Jean, can you be more specific? I don't have any trouble taking the
> > Fourier transform of a signal with more samples than 2.5 million, or
> > computing the autocorrelation on a Windows 32-bit system with 2 gigs of
> > RAM. Can you give a specific example (you can create a vector of N(0,1)
> > random variables) of where you are having trouble with a 1D signal with
> > 2.5 million samples?
>
> In addition to what Wayne said, Jean, if you're working with large data you
> should be using a 64-bit version of MATLAB on a machine with a 64-bit OS,
> for the reasons described in section 1 of this document from our support
> website:
>
> http://www.mathworks.com/support/tech-notes/1100/1106.html
>
> Basically, if you're on a 32-bit system, many OSes don't allow MATLAB access
> to all of your 4 GB of RAM, and MATLAB has to share what the OS does allow
> access to with any other applications you're running. That can easily chew
> up or fragment your memory.
>
> --
> Steve Lord
> slord@mathworks.com
> comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ
>

Subject: Matlab and large signals - I think I want my money back!

From: Rune Allnor

Date: 19 Feb, 2010 16:10:02

Message: 6 of 25

On 19 Feb, 16:58, "Jean " <domnul_...@yahoo.com> wrote:
>   I agree with what you said; I read the documentation prior to posting this message.
>   The thing that bothers me the most is not being able to plot two 10-million-sample signals in the same figure (it gives me the "out of memory" error). I do not know whether the problem lies with my computer or is something the Matlab designers have not foreseen.

Nope. The problem is the user - you.

Most computer screens these days have, say, some 1500 x 2000 pixels,
less than 5 million pixels in all. So even if you plotted *all* your
data points in a way that let each point be represented by a single
individual pixel, you would end up with a blanked-out screen.

If you plot 10 million points as a time series distributed across ~1000
pixels in the horizontal direction, each vertical column of pixels
will represent some 10,000 data points.
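One way to act on that observation is to reduce the data to a min/max pair per pixel column before plotting. A rough sketch of that kind of decimation (not anyone's production code here, just an illustration with random data):

% Reduce 10 million samples to ~1000 min/max pairs before plotting
x   = randn(1e7, 1);            % stand-in for the real signal
blk = 1e4;                      % samples per pixel column (1e7 / ~1000 columns)
xb  = reshape(x, blk, []);      % one matrix column per pixel column
lo  = min(xb);                  % lower envelope: minimum of each block
hi  = max(xb);                  % upper envelope: maximum of each block
t   = (0:numel(lo)-1) * blk;    % sample index at the start of each block
plot(t, lo, 'b', t, hi, 'b');   % plot the envelope instead of all 10 million points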

So if you want to complain to somebody for sloppy work, start with
the person you see in the mirror every morning.

Rune

Subject: Matlab and large signals - I think I want my money back!

From: big data

Date: 19 Feb, 2010 16:16:03

Message: 7 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> Thank you Mathworks for disappointing me!
>
I run a 64-bit box with a 64-bit OS, 64-bit Matlab, and 16 GB of RAM. I routinely work with large datasets, right up to a 35 GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.

Coming in here muttering about wanting your money back without having done the most basic research into the subject is lame. You owe TMW an apology.

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 19 Feb, 2010 16:30:24

Message: 8 of 25

  If your 64-bit box can handle 35 GB datasets, please tell me what size of dataset I can handle with my 32-bit box and 4 GB of RAM. Then I will try it to convince myself whether I am wrong or not.

 Thank you for the support.


"big data" <bigdata@bigdata.com> wrote in message <hlmdg3$3k8$1@fred.mathworks.com>...
> "Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > Thank you Mathworks for disappointing me!
> >
> I run a 64-bit box with 64-bit OS and 64-bit Matlab and 16GB RAM. I routinely work with large datasets right up to 35GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.
>
> Coming in here muttering about wanting your money back without having done the most basic research into the subject is lame. You owe TMW an apology.

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 19 Feb, 2010 16:37:02

Message: 9 of 25

   Thank you for your reply. Unfortunately, I think you misunderstood what I said. Plotting one 10-million-sample signal on my computer is possible, but if I want to plot another one in the same figure I cannot. I do not think I am doing sloppy work; my frustration comes from the fact that Matlab is the only tool I employ in my work because of its ability to adapt to any kind of data. In my work I am faced with different types of data in many formats and sizes.

  Have a nice day.

Rune Allnor <allnor@tele.ntnu.no> wrote in message <dce3de6b-f8c8-43e3-9d21-ee7a2ba2cdfc@f29g2000yqa.googlegroups.com>...
> On 19 Feb, 16:58, "Jean " <domnul_...@yahoo.com> wrote:
> >   I agree with what you said; I read the documentation prior to posting this message.
> >   The thing that bothers me the most is not being able to plot two 10-million-sample signals in the same figure (it gives me the "out of memory" error). I do not know whether the problem lies with my computer or is something the Matlab designers have not foreseen.
>
> Nope. The problem is the user - you.
>
> Most computer screens these days have, say, some 1500 x 2000 pixels,
> less than 5 million pixels in all. So even if you plotted *all* your
> data points in a way that let each point be represented by a single
> individual pixel, you would end up with a blanked-out screen.
>
> If you plot 10 million points as a time series distributed across ~1000
> pixels in the horizontal direction, each vertical column of pixels
> will represent some 10,000 data points.
>
> So if you want to complain to somebody for sloppy work, start with
> the person you see in the mirror every morning.
>
> Rune

Subject: Matlab and large signals - I think I want my money back!

From: Mark Shore

Date: 19 Feb, 2010 16:59:07

Message: 10 of 25

> Thank you for your reply. Unfortunately, I think you misunderstood what I said. Plotting one 10-million-sample signal on my computer is possible, but if I want to plot another one in the same figure I cannot.

Jean, Rune's point is that your example is not a sensible test of anything. If I plot a time series containing 2.5M points, I do so only as a very quick check AND with the intention of zooming in on representative or anomalous areas I spot. And yes, the calculations required to resample and plot this to screen are rather slow, but what else can be expected?

I use XP 32 bit and ~3.4 GB accessible RAM for MATLAB workspaces with up to ~700 MB allocated to matrices; above that I reboot to Windows 7 x64 which can access the full 8 GB of installed RAM (plus an equivalent amount of much slower paged virtual memory).

Subject: Matlab and large signals - I think I want my money back!

From: ImageAnalyst

Date: 19 Feb, 2010 17:12:50

Message: 11 of 25

Jean:
I don't know what your code is, but I made two 2.5-million-element
arrays and cross-correlated and plotted them in only 3.18 seconds with
no error messages whatsoever. And I just have a notebook computer
running 32-bit Windows XP. Here's the code:

clc;
clear all;
close all;
workspace;
% Generate 2 2.5 million element arrays.
tic;
data1 = rand(2500000,1);
data2 = rand(2500000,1);
% Cross correlate them.
result = xcorr(data1, data2);
toc; % Replies "Elapsed time is 3.178348 seconds."

% Plot the various arrays.
% Like Rune says, there's too many to see distinct points.
subplot(2,2,1);
plot(data1, 'r');
title('data1');
set(gcf, 'Position', get(0,'Screensize')); % Maximize figure.

subplot(2,2,2);
plot(data2, 'r');
title('data2');

subplot(2,2,3);
plot(result, 'r');
title('result');

% Show memory stats:
memory




And here are the results:

Elapsed time is 3.178348 seconds.
Maximum possible array: 753 MB (7.900e+008 bytes) *
Memory available for all arrays: 1206 MB (1.264e+009 bytes) **
Memory used by MATLAB: 581 MB (6.096e+008 bytes)
Physical Memory (RAM): 3036 MB (3.184e+009 bytes)

* Limited by contiguous virtual address space available.
** Limited by virtual address space available.

Subject: Matlab and large signals - I think I want my money back!

From: ImageAnalyst

Date: 19 Feb, 2010 17:25:27

Message: 12 of 25

On Feb 19, 11:30 am, "Jean " <domnul_...@yahoo.com> wrote:
>   If your 64-bit box can handle 35 GB datasets, please tell me what size of dataset I can handle with my 32-bit box and 4 GB of RAM. Then I will try it to convince myself whether I am wrong or not.
---------------------------------------------------------------
Jean:
Since I showed above that I can run functions on semi-large arrays
(just 2.5 million elements), I'm wondering if you have a bunch of
arrays hanging around in memory that you don't need anymore. Is it
possible that you have some intermediate arrays that were used earlier
in your function but are no longer needed? If so, they are just taking
up memory, and you can get rid of them using the clear command:
clear('oldUnneededArray', 'otherUnneededArray');
Perhaps that will help regain your memory.
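To see which variables are actually eating the memory in the first place, something like this works (the variable name in the clear call is just a placeholder, as above):

% List every workspace variable and sort by size to find the biggest ones
s = whos;
[junk, idx] = sort([s.bytes], 'descend');
disp({s(idx).name});
% Then clear the ones you no longer need, e.g.:
clear('oldUnneededArray');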

Subject: Matlab and large signals - I think I want my money back!

From: John D'Errico

Date: 19 Feb, 2010 17:36:04

Message: 13 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hlmene$mvr$1@fred.mathworks.com>...
> Thank you for your reply. Unfortunately, I think you misunderstood what I said. Plotting one 10-million-sample signal on my computer is possible, but if I want to plot another one in the same figure I cannot. I do not think I am doing sloppy work; my frustration comes from the fact that Matlab is the only tool I employ in my work because of its ability to adapt to any kind of data. In my work I am faced with different types of data in many formats and sizes.
>

As has been pointed out you are complaining about
the tool, when it is the user who is at fault here.

If I hit my thumb with a hammer, should I sue the
manufacturer? Should I throw the hammer away
because I cannot use it properly? Yes, it may make
me feel better to curse the stupid hammer, but
really, who is at fault here?

Learn to use the tool. Don't be a lazy programmer.

John

Subject: Matlab and large signals - I think I want my money back!

From: big data

Date: 19 Feb, 2010 17:49:04

Message: 14 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hlmeb0$rge$1@fred.mathworks.com>...
> If your 64-bit box can handle 35 GB datasets, please tell me what size of dataset I can handle with my 32-bit box and 4 GB of RAM. Then I will try it to convince myself whether I am wrong or not.
>
One thing you can do is implement the /3GB switch in your boot.ini file. 32-bit Windows XP can address 4 GB. Normally 2 GB is allocated to the kernel and 2 GB to applications. Adding the /3GB switch to your boot.ini allows 1 GB to be allocated to the kernel and 3 GB to applications. This definitely helped when I was running 32-bit "large address aware" applications like Matlab.

http://msdn.microsoft.com/en-us/library/ms791558.aspx

Subject: Matlab and large signals - I think I want my money back!

From: Steve Amphlett

Date: 19 Feb, 2010 18:02:05

Message: 15 of 25

"John D'Errico" <woodchips@rochester.rr.com> wrote in message <hlmi64$98h$1@fred.mathworks.com>...
> "Jean " <domnul_jan@yahoo.com> wrote in message <hlmene$mvr$1@fred.mathworks.com>...
> > Thank you for your reply. Unfortunately, I think you misunderstood what I said. Plotting one 10-million-sample signal on my computer is possible, but if I want to plot another one in the same figure I cannot. I do not think I am doing sloppy work; my frustration comes from the fact that Matlab is the only tool I employ in my work because of its ability to adapt to any kind of data. In my work I am faced with different types of data in many formats and sizes.
> >
>
> As has been pointed out you are complaining about
> the tool, when it is the user who is at fault here.
>
> If I hit my thumb with a hammer, should I sue the
> manufacturer? Should I throw the hammer away
> because I cannot use it properly? Yes, it may make
> me feel better to curse the stupid hammer, but
> really, who is at fault here?
>
> Learn to use the tool. Don't be a lazy programmer.
>
> John

Some more fuel on the fire...

Have you ever thought about block processing? Is there really a need to do FFT, XCORR, PLOT, etc. on the entire dataset? Almost certainly not. When prototyping and testing algorithms, it's natural to use datasets that are workable. But when you get real data, you need to realise that there will always be a point where you can't work on the whole dataset in a single operation. In my experience, dataset sizes have grown as fast as the memory available to process them. It stands to reason: the systems used to measure and store the data grow at the same rate.

When I was in the field of analysing measured data, we always used to have an input of "memory" or equivalent. If you had more, you could use bigger blocks. Otherwise, it'd just take a bit longer.
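A minimal sketch of the block idea for a spectrum, with the block size playing the role of that "memory" input (random data here just to keep it self-contained):

% Averaged power spectrum computed block by block instead of one giant FFT
x    = randn(1e7, 1);             % stand-in for the real signal
nfft = 2^16;                      % block size - the "memory" knob
nblk = floor(numel(x) / nfft);    % number of full blocks
P    = zeros(nfft, 1);
for k = 1:nblk
    seg = x((k-1)*nfft+1 : k*nfft);
    P   = P + abs(fft(seg)).^2;   % accumulate power block by block
end
P = P / nblk;                     % average over the blocks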

Subject: Matlab and large signals - I think I want my money back!

From: ImageAnalyst

Date: 19 Feb, 2010 18:20:26

Message: 16 of 25

On Feb 19, 12:49 pm, "big data" <bigd...@bigdata.com> wrote:
> One thing you can do is to implement the 3GB switch in your boot.ini file.  Win32 XP can see 4GB. Normally 2GB is allocated to the kernel and 2GB is allocated to applications. Adding the 3GB line in your boot.ini will allow 1GB to be allocated to the kernel and 3GB allocated to applications. This definitely helped when I was running 32 bit processes using "large address aware" applications like Matlab.
>
> http://msdn.microsoft.com/en-us/library/ms791558.aspx

-------------------------------------------------------------------------------------------------------------------------------------
You can do that, and it might work, but I just thought maybe I would
share my experience when doing that. It's not like you're getting an
extra gig of RAM for free. Basically you're robbing Peter to pay
Paul. So you're taking a gig away from the operating system, and
giving it to your app. However I started having bizarre problems,
intermittent problems - like gibberish characters on the screen,
screens not repainting properly, transparent windows and title bars,
etc. We tried getting a new video adapter but the problems remained.
I ended up getting a new computer. Only after that did I remove the /
3GB switch and suddenly the problems instantly vanished. I've been
using that computer a lot lately (as a "hand me down") and it's still
fine. I'm convinced that it was the /3GB switch that was causing me
problems. It SHOULDN"T work that way, but how many times have we all
said "This just shouldn't be happening... but it is"? Just a warning
in advance in case you might notice any weird behaviour.

Subject: Matlab and large signals - I think I want my money back!

From: big data

Date: 19 Feb, 2010 18:32:04

Message: 17 of 25

ImageAnalyst <imageanalyst@mailinator.com> wrote in message <5f2df7b7-c2f0-4894-9d06-197dba2e6623@u9g2000yqb.googlegroups.com>...
> On Feb 19, 12:49 pm, "big data" <bigd...@bigdata.com> wrote:
> > One thing you can do is to implement the 3GB switch in your boot.ini file.  Win32 XP can see 4GB. Normally 2GB is allocated to the kernel and 2GB is allocated to applications. Adding the 3GB line in your boot.ini will allow 1GB to be allocated to the kernel and 3GB allocated to applications. This definitely helped when I was running 32 bit processes using "large address aware" applications like Matlab.
> >
> > http://msdn.microsoft.com/en-us/library/ms791558.aspx
>
> -------------------------------------------------------------------------------------------------------------------------------------
> You can do that, and it might work, but I just thought maybe I would
> share my experience when doing that. It's not like you're getting an
> extra gig of RAM for free. Basically you're robbing Peter to pay
> Paul. So you're taking a gig away from the operating system, and
> giving it to your app. However I started having bizarre problems,
> intermittent problems - like gibberish characters on the screen,
> screens not repainting properly, transparent windows and title bars,
> etc. We tried getting a new video adapter but the problems remained.
> I ended up getting a new computer. Only after that did I remove the /
> 3GB switch and suddenly the problems instantly vanished. I've been
> using that computer a lot lately (as a "hand me down") and it's still
> fine. I'm convinced that it was the /3GB switch that was causing me
> problems. It SHOULDN"T work that way, but how many times have we all
> said "This just shouldn't be happening... but it is"? Just a warning
> in advance in case you might notice any weird behaviour.

Agree 100%. Another area where you can run into trouble with /3GB is on networked machines. These days bloated network software can require the OS to use more than 1 GB, with all kinds of weird results.

What I do with 32-bit machines is add a second line to boot.ini with the /3GB switch. When the machine boots, a pre-Windows menu comes up that allows you to select which entry to boot, so you only use the /3GB switch when you really need it.
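For reference, such a dual-entry boot.ini might look something like this (the ARC path is machine-specific; copy it from your existing line rather than typing this verbatim):

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional" /fastdetect
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional /3GB" /fastdetect /3GB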

Subject: Matlab and large signals - I think I want my money back!

From: Mark Shore

Date: 19 Feb, 2010 19:10:37

Message: 18 of 25


> You can do that, and it might work, but I just thought maybe I would
> share my experience when doing that. It's not like you're getting an
> extra gig of RAM for free. Basically you're robbing Peter to pay
> Paul. So you're taking a gig away from the operating system, and
> giving it to your app. However I started having bizarre problems,
> intermittent problems - like gibberish characters on the screen,
> screens not repainting properly, transparent windows and title bars,
> etc. We tried getting a new video adapter but the problems remained.
> I ended up getting a new computer. Only after that did I remove the /
> 3GB switch and suddenly the problems instantly vanished. I've been
> using that computer a lot lately (as a "hand me down") and it's still
> fine. I'm convinced that it was the /3GB switch that was causing me
> problems. It SHOULDN"T work that way, but how many times have we all
> said "This just shouldn't be happening... but it is"? Just a warning
> in advance in case you might notice any weird behaviour.

I never noticed any weird behaviour when I tried the /3GB switch, except for the minor issue that the two computers I used it on subsequently failed to boot. Almost too trivial to mention, really...

Subject: Matlab and large signals - I think I want my money back!

From: Oleg Komarov

Date: 19 Feb, 2010 21:48:04

Message: 19 of 25

Although my working field doesn't have anything to do with signal processing, I would give him his money back.
I think everybody would gain from that.

Oleg

Subject: Matlab and large signals - I think I want my money back!

From: Jean

Date: 22 Feb, 2010 10:07:02

Message: 20 of 25

     Hello,

    I am still waiting for your answer.

"big data" <bigdata@bigdata.com> wrote in message <hlmdg3$3k8$1@fred.mathworks.com>...
> "Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > Thank you Mathworks for disappointing me!
> >
> I run a 64-bit box with 64-bit OS and 64-bit Matlab and 16GB RAM. I routinely work with large datasets right up to 35GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.
>
> Coming in here muttering about wanting your money back without having done the most basic research into the subject is lame. You owe TMW an apology.

Subject: Matlab and large signals - I think I want my money back!

From: Oleg Komarov

Date: 22 Feb, 2010 10:50:06

Message: 21 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hltl06$8kj$1@fred.mathworks.com>...
> Hello,
>
> I am still waiting for your answer.
>
> "big data" <bigdata@bigdata.com> wrote in message <hlmdg3$3k8$1@fred.mathworks.com>...
> > "Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> > > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > > Thank you Mathworks for disappointing me!
> > >
> > I run a 64-bit box with 64-bit OS and 64-bit Matlab and 16GB RAM. I routinely work with large datasets right up to 35GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.
> >
> > Coming in here muttering about wanting your money back without having done the most basic research into the subject is lame. You owe TMW an apology.

Don't wait for somebody's answer; read the documentation on how to avoid memory problems.
Many CSSMers have already shown you that the problem is probably due to poor coding.
If you need help pinpointing where the out-of-memory error occurs, post what you've done so far.

Oleg

Subject: Matlab and large signals - I think I want my money back!

From: David R.

Date: 23 Feb, 2010 09:47:02

Message: 22 of 25

"Jean " <domnul_jan@yahoo.com> wrote in message <hltl06$8kj$1@fred.mathworks.com>...
> Hello,
>
> I am still waiting for your answer.
>
> "big data" <bigdata@bigdata.com> wrote in message <hlmdg3$3k8$1@fred.mathworks.com>...
> > "Jean " <domnul_jan@yahoo.com> wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> > > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > > Thank you Mathworks for disappointing me!
> > >
> > I run a 64-bit box with 64-bit OS and 64-bit Matlab and 16GB RAM. I routinely work with large datasets right up to 35GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.
> >
> > Coming in here muttering about wanting your money back without having done the most basic research into the subject is lame. You owe TMW an apology.


Hi.
Though I am not an expert on ML's internals, here's my guess:
- If you plot an array, ML has to copy (duplicate) the data into a memory space associated with the figure, so that you can still zoom in even if you delete the array from the workspace.
- Therefore, trying to plot large datasets will consume large amounts of memory.
- Since you cannot see more than, say, 2000 points at once (not enough pixels there), you could plot, say, yourdata(1:1000:end) and make your zooming responsive, i.e. update the properties of your line object with the data you actually need at a given zoom level (a rough sketch follows below).
(See zoom object, line object, axes properties, etc.)
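A rough sketch of that idea, with the full-resolution refill done by hand for one window (automating it via the zoom object's callback would be the obvious next step):

% Decimated overview: every 1000th sample is plenty for ~2000 screen pixels
x     = randn(1e7, 1);                       % stand-in for the real signal
step  = 1000;
hLine = plot(1:step:numel(x), x(1:step:end));

% To inspect, say, samples 2,000,000 to 2,100,000 at full resolution,
% refill the same line object with just that window of data:
lo = 2000000;  hi = 2100000;
set(hLine, 'XData', lo:hi, 'YData', x(lo:hi));
set(gca, 'XLim', [lo hi]);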

Does that make sense?

Best regards,
David

Subject: Matlab and large signals - I think I want my money back!

From: Eric

Date: 23 Feb, 2010 19:12:19

Message: 23 of 25

Anybody bragging that their research is "real" while everybody else here is just playing ought to:

1. understand why plotting 10 million data points on a monitor might be impractical. Let's say you want to print/display a plot of 10 million data points at 300 data points per inch. This will take 33,333 inches = 0.53 miles = 0.85 km of linear space. Scale the ordinate appropriately and you could see it from space!

2. have a better computer. You can get a Dell T5500 with two Xeon quad-core processors, 24 GB of RAM, and an NVidia Quadro FX 3800 video card for $5143. Combine this with a skilled analyst and Matlab x64 and processing large data sets can be done effectively. If you're doing "real" research I'm guessing your customer would rather pay for this rather than for your time playing with the /3GB switch.

I'm guessing for many of the people in this forum 10 million data points is small. I routinely work with data sets consisting of 30,000 images that are each 128x128 (491 million data points) or 60 images that are each 2048x2048 (251 million data points). These don't seem that big to me and I'm guessing many people here work with data sets orders of magnitude larger.

-Eric

Subject: Matlab and large signals - I think I want my money back!

From: Mark Hollingsworth

Date: 26 Nov, 2012 18:42:17

Message: 24 of 25

"Jean" wrote in message <hlm7d4$oru$1@fred.mathworks.com>...
> Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> Thank you Mathworks for disappointing me!
>

Jean, I think that many people missed your point that you can plot 10M points of a single variable but cannot plot two 2.5M lines. That sounds like a problem specific to the plot routine. Perhaps there are ways to plot two lines that avoid the bug (like using time series?). (P.S. I dislike answers to questions that start with "Why would you want to do that? Here is a totally different approach." Let's assume that people have a reason for what they are doing and try to help.)

Subject: Matlab and large signals - I think I want my money back!

From: dpb

Date: 26 Nov, 2012 20:03:37

Message: 25 of 25

On 11/26/2012 12:42 PM, Mark Hollingsworth wrote:
...

> ... (P.S. I dislike
> answers to questions that start with "Why would you want to do that?
> Here is a totally different approach". Let's assume that people have a
> reason for what they are doing and try to help.)

Perhaps.

But oftentimes the "why" turns out to be the requestor didn't know of a
better way or simply were implementing the naive, straightahead solution
instead of using any cleverness to help. So, I would say the answers
that suggest alternate approaches or question the possible
implementation idea are ok, too...

--
