Importdata fails to import large files

Hi everybody.
I have a bunch of large files that look like the following:
# x 1 2 3 4 ...
# y 5 6 7 8 ...
# z 10 11 12 13 ...
# Time
1 87 85 82 81 ...
2 67 86 19 34 ...
... ... ... ... ...
Since I don't know either the number of rows or the number of columns in advance, I usually import them using
data=importdata(filePath,' ',4);
but for some of them (the biggest ones, 1.32 GB and 1.7 GB) this command doesn't work and data comes back empty.
I managed to work around this with an ad hoc fix: deleting the first 4 rows by hand and then using the command
data=importdata(filePath,' ');
Since I have a large number of these files, I would like a solution that works for all of them. What can I do?
Thank you
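For anyone hitting the same problem, here is a minimal sketch of the pattern described above, with an explicit check for the empty result; the filePath variable is the one from the question and the warning message is just illustrative:
data = importdata(filePath, ' ', 4);
if isempty(data)
    % importdata can return [] for very large files; fall back to a
    % lower-level reader such as the one in the accepted answer below
    warning('importdata returned empty for %s', filePath);
end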

Accepted Answer

Walter Roberson on 7 Oct 2013
fid = fopen('YourFile.txt');
for K = 1 : 4                                    %skip the 4 header lines
    fgetl(fid);
end
here = ftell(fid);                               %remember where we are
fields = regexp( fgetl(fid), '\s+', 'split');    %read first data line, split it into columns
numcols = length(fields);                        %count them
fseek(fid, here, 'bof');                         %reposition to the start of that line
fmt = repmat('%f', 1, numcols);                  %maybe %d if entries are integral
datacell = textscan( fid, fmt, 'CollectOutput', 1); %read the whole file
fclose(fid);                                     %we are done with it
data = datacell{1};
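Since the question mentions a large number of files with the same layout, the same approach could be wrapped in a loop; a minimal sketch, assuming every file is space-delimited with the same 4-line header (the folder path, the file pattern, and the allData variable are hypothetical):
files = dir(fullfile('C:\myData', '*.txt'));      %hypothetical folder and extension
allData = cell(numel(files), 1);
for F = 1 : numel(files)
    fid = fopen(fullfile('C:\myData', files(F).name));
    for K = 1 : 4                                 %skip the 4 header lines
        fgetl(fid);
    end
    here = ftell(fid);                            %remember start of the data block
    fields = regexp(fgetl(fid), '\s+', 'split');  %count columns from the first data line
    numcols = length(fields);
    fseek(fid, here, 'bof');                      %rewind to the first data line
    fmt = repmat('%f', 1, numcols);
    datacell = textscan(fid, fmt, 'CollectOutput', 1);
    fclose(fid);
    allData{F} = datacell{1};                     %numeric matrix for this file
end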
  1 Comment
Luca Amerio on 7 Oct 2013
That's PERFECT!!!
Just for future readers: be sure to pass the file identifier to fgetl, i.e.
fields = regexp( fgetl(fid), '\s+', 'split');
rather than fgetl() with no argument.
Apart from that, it's some of the best advice I've ever received: I reduced the memory usage and the time required to import the data by almost 60-70%.
That's awesome!
Thank you sooooo much!
Luca


