This function summarizes a 1-D vector of data (i.e. a single column) using a user-specified statistic (e.g. mean, sum, etc.). It uses no loops, so it is very fast, at the cost of additional memory. I've included a memory check: if you do not have enough memory, the function falls back to computing the moving summary with loops, so what would take a few seconds may instead take minutes.
The function summarizes data over a moving window that increments one value at a time. For example, if the data are hourly and the user wants a 7-day average (i.e. 168 hours), the average is computed for each 1-hour increment, sliding the 7-day window one hour at a time.
UPDATED: 10/14/2010 - streamlined function, uses less memory.
UPDATED: 10/14/2010 - no more guessing on memory usage. Exact solution is used.
dataout = movingstat(datain,20,@sum);
Well, I swear that when I typed in conv I got an undefined function error, but when I try it again it's there. Magic!! Thanks, John.
Interestingly, when I type conv into the help search it doesn't come up; I get a whole bunch of "convert" hits but nothing about convolution that leads to conv. Now that I've found conv in the help once, it shows up.
The function conv is part of basic MATLAB. EVERYBODY has it. You can find it by typing
at the command line.
Thanks, Cris. I've never even heard of conv before. Looking online at MathWorks, it's not readily apparent which toolbox that function is in... maybe Signal Processing? I can find the function documentation, but I can't tell which toolbox it belongs to. Thus, while I would like to use your solution, it's not available to me. I only have the Statistics Toolbox, and conv is not in that toolbox. At least my function still has some utility for people without conv! Thanks for the input. I'm sure these dialogues are very helpful for all interested.
does the same as
but is 2 orders of magnitude faster (tested on a 1x1e6 data array).
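For readers following along, here is a minimal sketch of the conv-based moving mean being compared in this comment; the data, window length, and variable names are my own assumptions, not the commenter's exact code:

```matlab
% Hedged sketch: moving mean via convolution with a box kernel.
% The 'valid' option keeps only full windows, so the output is
% n-1 samples shorter than the input.
datain = rand(1e6, 1);                       % example data (assumption)
n = 20;                                      % window length (assumption)
movavg = conv(datain, ones(n, 1) / n, 'valid');
```

The same call with ones(n, 1) alone (no division) gives a moving sum instead of a moving mean.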
Sure, that doesn't generalize to things other than mean or sum (but note that std and var can be built using the mean filter). Your function is still useful for the median, for example.
Anything is possible. :) Assuming the image is BW, it seems conceptually doable. Do you want to summarize in a particular direction, or over a rectangle/square? Either way, to summarize your 2-D data I would think you would process it in 3-D: the third dimension would give you the window in 2-D space, just as in this function the second dimension gives the window in 1-D space. However, you'll have an edge effect that reduces your image by N pixels in both directions. I'm not sure if that would work for you. If you need the full image after processing, you'll have to come up with a scheme that allows for a shrinking window, or reflects the data back as the window goes over the edge. I hope this makes sense. I don't really do much image processing except in ArcGIS, which has those types of tools/functions. That makes me think the MATLAB Image Processing Toolbox may already have something like this?
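For the 2-D case discussed here, a natural analogue of the conv approach is conv2, which has the same edge-shrinking behavior described above. This is only a sketch under my own assumptions (random stand-in image, square window):

```matlab
% Hedged sketch: 2-D moving sum over an n-by-n square window via conv2.
% 'valid' drops the border, shrinking the result by n-1 pixels
% in each dimension -- the edge effect mentioned in the comment.
img = rand(256);                 % stand-in for a grayscale image (assumption)
n = 5;                           % window size (assumption)
movsum2 = conv2(img, ones(n), 'valid');
movavg2 = movsum2 / n^2;         % moving average, if that is the statistic wanted
```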
I haven't looked at the function yet, but that's exactly what I am looking for, except for an image (i.e. 2-D).
Do you think I can generalize it? Or could you help me do that, please?
After Rob's comment, it got me thinking about Kevin's comment. I greatly simplified the function and its memory usage. Below is the entire code necessary to perform the action; I inserted SUM as an example. I've updated the file; it should show up soon.
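The code itself did not survive in this comment, so here is a minimal sketch of the loop-free index-matrix approach being described, with SUM as the example; the data and window length are my own assumptions, and this is not necessarily the author's exact code:

```matlab
% Hedged sketch: loop-free moving SUM via an index matrix.
% Each column of idx holds the indices of one window position.
datain = (1:100)';                               % example data (assumption)
w = 20;                                          % window length (assumption)
idx = bsxfun(@plus, (1:w)', 0:numel(datain)-w);  % w-by-(n-w+1) index matrix
dataout = sum(datain(idx), 1)';                  % one sum per window position
```

Replacing sum with mean, median, or another column-wise statistic generalizes this to the other summaries the function supports.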
You're using a lot of RAM because of the repmat, obviously. Why not reshape instead? No loops and no memory hogging. Toy example:
1 2 3
1 2 3
1 2 3
1 2 3
Thanks, Kevin. Cumsum would work, except the trick is how one would perform an action like that on a sliding window, where one may want to sum a 30-day total over a sliding one-hour window. If there is a more direct method using canned functions, I'm open to advice. Thanks for the feedback!!
You might want to handle some cases like @sum or @mean separately since these can both be easily computed without loops using cumsum and one subtraction. Almost all estimators which involve sums of functions of data (e.g. mean, var, std) can be computed without either loops or substantial additional memory allocation.
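The cumsum-plus-one-subtraction idea can be sketched as follows; the data and window length are my own assumptions:

```matlab
% Hedged sketch: moving sum via cumsum and one subtraction.
% sum(x(k-w+1:k)) = c(k) - c(k-w), taking c(0) = 0.
datain = rand(1e4, 1);           % example data (assumption)
w = 720;                         % e.g. 30 days of hourly data (assumption)
c = cumsum(datain(:));
movsum = c(w:end) - [0; c(1:end-w)];
movmean = movsum / w;            % moving mean follows directly
```

As the comment notes, var and std can be built the same way from running sums of x and x.^2, so none of these require loops or a large index matrix.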
Based on comments received, I've modified the routine to use CONV where applicable.
One more update. I changed the memory test to be an exact solution using an error trap instead of guessing needed space. Again thanks to Kevin and Rob for making me think about it some more.
Improved function to use less memory.