Issues using importdata function

Katharine Morrill on 20 May 2014
Commented: darova on 28 Jun 2020
I'm currently writing a script that reads in a CSV file of data generated by a digital o-scope, runs an FFT on the data, and then plots the result. The files I'm reading in are usually around 100 MB. I also need the flexibility to let the user specify whether or not the file has a header. Historically, I've had good results with the importdata function: it seems to handle large data files well, and it lets me specify a number of header lines to skip.
Unfortunately, I am currently plagued by an odd issue where some of my coworkers get memory errors like the following when running the script:
    Error using importdata (line 213)
    Unable to load file. Use TEXTSCAN or FREAD for more complex formats.

    Error in FFT_current_harmonics_V17 (line 181)
    inData = importdata([pathname filename], ',', num_headers);

    Caused by:
        Error using fileread (line 36)
        Out of memory. Type HELP MEMORY for your options.
They're running on computers similar to mine, actually with more memory than my own, so I'm not certain why they run out of memory and I don't. Other coworkers (including myself) have no problem running the script.
Has anyone run into an issue like this before, or know what might be happening? Everyone likes using this script when it works, but it's a real pain for me to have to maintain less flexible workarounds (the load function) for a few people rather than having everyone use the same script version.
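For reference, this is roughly the pattern I'd like to keep: one script for everyone, with a fallback only if importdata errors out. A sketch (readCsvFallback is a hypothetical helper I haven't written, just a stand-in for whatever workaround ends up being needed):

    % Try importdata first; fall back to a lower-memory reader on failure
    fullpath = fullfile(pathname, filename);
    try
        inData = importdata(fullpath, ',', num_headers);
    catch err
        warning('importdata failed (%s); using fallback reader.', err.message);
        inData = readCsvFallback(fullpath, num_headers);  % hypothetical helper
    end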
  1 Comment
dpb on 21 May 2014
"They're running on computers similar to mine, actually with more memory than my own, so I'm not certain why they run out of memory and I don't"
What about OS, MATLAB version, 32/64-bit, etc., on the various machines that succeed/fail?
Also, what else the user's got open could have a lot to do with free memory.
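Running something like the following on each machine and comparing the output would show the differences (note the memory function is Windows-only):

    % Compare MATLAB release, architecture, and available memory
    disp(version)           % MATLAB release string
    disp(computer('arch'))  % e.g. 'win32' vs. 'win64'
    if ispc
        [user, sys] = memory;
        fprintf('Largest possible array: %.0f MB\n', ...
            user.MaxPossibleArrayBytes/1e6)
        fprintf('Available physical memory: %.0f MB\n', ...
            sys.PhysicalMemory.Available/1e6)
    end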


Accepted Answer

Bo on 15 Apr 2015
Just noticed the same issue after upgrading MATLAB from R2013b to R2015a. It seems the implementation of importdata changed slightly: the new version introduced a function, fixLineEndings(), which apparently needs a lot of memory. Further investigation suggests that importdata calls fixLineEndings() first to convert the file's line ending character(s) to '\n', and then assumes that line ending throughout the rest of the code.
I modified importdata.m to avoid calling fixLineEndings() and removed the '\n' line-ending assumption from the rest of the code. Then I used pcode() to generate the corresponding importdata.p. Memory usage is now back to its previous level.
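If you'd rather not patch importdata.m, another option is to bypass importdata and read the file with textscan, which copes with '\n', '\r\n', and '\r' endings without first making a whole-file copy in memory. A rough sketch, assuming a purely numeric comma-separated file (ncols and num_headers stand in for values the calling script already knows):

    % Read a numeric CSV with textscan instead of importdata
    fid = fopen(fullfile(pathname, filename), 'r');
    fmt = repmat('%f', 1, ncols);    % one %f conversion per column
    C = textscan(fid, fmt, 'Delimiter', ',', ...
        'HeaderLines', num_headers, 'CollectOutput', true);
    fclose(fid);
    data = C{1};                     % numeric matrix, like inData.data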
  4 Comments
Ethan Beyak
Ethan Beyak on 27 Jun 2020
darova, I do think Bo's finding about non-'\n' line endings is huge. In my testing, they were certainly the cause of importdata's performance issue in my case as well. His solution of modifying importdata.m to avoid calling fixLineEndings() is not the one I opted for, because I had the liberty to easily change the line endings of my input file. I realize that may not always be possible, though, and someone may be stuck with non-'\n' line endings in their data file. With that in mind, I agree this answer can be marked as accepted. Thanks for doing so.
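For anyone who can't change the file at its source, the endings can also be normalized to '\n' in chunks, so a 100 MB file never has to sit in memory all at once. A rough sketch (infile and outfile are placeholder names):

    % Stream-convert CRLF / lone-CR line endings to LF
    CR = char(13);  LF = char(10);
    fin  = fopen(infile,  'r');
    fout = fopen(outfile, 'w');
    carry = '';
    while ~feof(fin)
        chunk = [carry, fread(fin, 1e6, '*char').'];
        carry = '';
        if ~feof(fin) && ~isempty(chunk) && chunk(end) == CR
            carry = chunk(end);   % hold a trailing CR so a split CRLF isn't doubled
            chunk(end) = [];
        end
        chunk = strrep(chunk, [CR LF], LF);   % CRLF -> LF
        chunk = strrep(chunk, CR, LF);        % lone CR -> LF
        fwrite(fout, chunk);
    end
    fclose(fin);  fclose(fout);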
darova
darova on 28 Jun 2020
I'm just doing my job

