filtfilt on large data sets
I am using MATLAB R2012a on a 32-bit Windows PC with data sets that exceed 10 GB, so I break them into pieces when I load, process, and save them using the matfile() command. For example, I have a version 7.3 MAT-file containing a variable b [68 x 2000000] double, and I process it chunk by chunk like this:
ram_b = b_mat.b(:,1:1000);
c = ram_b + 1000;
c_mat.c(:,1:1000) = c;
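The chunked read-process-write pattern above can be sketched as a loop over column blocks. This is a minimal sketch only; the file names 'data.mat' and 'out.mat' and the variable names are assumptions, and the "+ 1000" stands in for the real operation:

```matlab
% Chunked processing with matfile(): read, transform, and write one block
% of columns at a time so the full 10 GB set never has to fit in RAM.
b_mat = matfile('data.mat');                   % hypothetical input, variable b
c_mat = matfile('out.mat', 'Writable', true);  % hypothetical output, variable c
[nRows, nCols] = size(b_mat, 'b');
chunk = 1000;                                  % columns per chunk
for k = 1:chunk:nCols
    idx   = k:min(k+chunk-1, nCols);
    ram_b = b_mat.b(:, idx);     % read one chunk into RAM
    c     = ram_b + 1000;        % example element-wise operation
    c_mat.c(:, idx) = c;         % write the result back to disk
end
```

This works cleanly for element-wise operations, where each output column depends only on the same input column.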
The issue is that when the operation I want to perform is smoothing with the filtfilt() function, I cannot maintain continuity of the data across the chunk boundaries: the filter only sees the data loaded in the current chunk, so artificial edge effects appear at every boundary. Any thoughts?
Accepted Answer
Jan on 30 Jul 2012 (edited 30 Jul 2012)
Open the filtfilt.m file and look at the code. It can be modified to accept data in chunks. But if your memory is exhausted, you have to save the chunks after filtering in the forward direction and reload them later for the backward direction.
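One way to sketch this idea, using the standard filter() function with a carried state vector zi instead of actually modifying filtfilt.m (an assumption; it also omits filtfilt's special initial-condition handling, so transients remain at the very start and end of the full record):

```matlab
% Zero-phase filtering in chunks: run filter() forward over the data while
% carrying the filter state zi across chunk boundaries, then repeat the same
% pass over the time-reversed result. Carrying zi is what preserves
% continuity at the chunk edges; without memory pressure the intermediate
% result can stay in RAM, otherwise write/reload it via matfile().
[b, a] = butter(4, 0.1);          % example low-pass filter (assumption)
x = randn(1, 2e6);                % one signal row; in practice read from a MAT-file
chunk = 1e5;
n = numel(x);
y = zeros(1, n);
zi = zeros(max(numel(a), numel(b)) - 1, 1);
for k = 1:chunk:n                 % forward pass
    idx = k:min(k+chunk-1, n);
    [y(idx), zi] = filter(b, a, x(idx), zi);
end
y = fliplr(y);                    % reverse for the backward pass
zi = zeros(max(numel(a), numel(b)) - 1, 1);
for k = 1:chunk:n                 % backward pass (same loop on reversed data)
    idx = k:min(k+chunk-1, n);
    [y(idx), zi] = filter(b, a, y(idx), zi);
end
y = fliplr(y);                    % restore original time order
```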
To obtain more speed, try FilterM on the File Exchange, which includes a C-MEX function for faster filtering and a corresponding filter in the forward and backward directions.
However, a [68 x 2000000] double matrix requires 1.09 GB, not 10 GB. FILTFILT needs free memory of roughly double that size, so a 32-bit system will fail on the full matrix. But a single [1 x 2000000] row of the data can be handled efficiently, and the splitting into chunks can be avoided entirely. Perhaps this does not work with your data representation using "ram_b" etc., which I do not understand.
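The row-at-a-time approach can be sketched as follows (file and variable names are assumptions, as is the example filter). Each [1 x 2000000] row is only 16 MB, so filtfilt() handles it comfortably and there are no chunk seams along the time axis:

```matlab
% Filter each of the 68 rows separately with filtfilt(): the time axis is
% never split, so there are no boundary artifacts, and peak memory stays
% at a few copies of one 16 MB row.
[b, a] = butter(4, 0.1);                        % example filter (assumption)
m   = matfile('data.mat');                      % hypothetical input, variable b
out = matfile('smoothed.mat', 'Writable', true);
[nRows, nCols] = size(m, 'b');
out.c(nRows, nCols) = 0;                        % preallocate the output on disk
for r = 1:nRows
    row = m.b(r, :);                            % read one full row (~16 MB)
    out.c(r, :) = filtfilt(b, a, row);          % zero-phase filter, whole row
end
```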