Vectorizing cumsum of table-vectors in a for loop

Dear everyone, I have the following code, which works perfectly fine and does what I want: the cumsum of the 'tot' vectors of the n tables stored in a cell-array vector. Now I would like to understand whether I can somehow vectorize this cumsum to speed up the whole process.
The following is the code which works just fine, but I would like to speed up:
% Number of tables in the cell array
n = size(cellarray, 2);
% Cell array preallocation
tot{1, n} = [];
for k = 1:n
    tot{k} = cellarray{1, k}{:, 'tot'};   % extract the 'tot' column of each table
end
tot_s = sum([tot{:}], 2);                 % row-wise sum across the n columns
I have tried this variant, but it seems to increase the total time rather than decrease it:
% Number of tables in the cell array
n = size(cellarray, 2);
for k = 1:n
    tot_s = cumsum(cellarray{1, k}{:, 'tot'}, 2);
end
I was wondering whether there is a way to avoid the loop entirely and "vectorize" this cumsum. I have tried the following code:
% Number of tables in the cell array
n = size(cellarray, 2);
tot_s = cumsum(cellarray{1, 1:n}{:, 'tot'}, 2);
but I got the following error (n = 50 in this particular case):
Expected one output from a curly brace or dot indexing expression, but there were 50 results.
Any suggestion would be very helpful. As for the input data, you can use whatever data you want, it does not matter. The structure is very simple: a vector (cell array) of tables, where each table has a variable named 'tot' and all the values are of type double.
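For example, a cell array with this structure can be generated as follows (the sizes n and h here are arbitrary placeholders, not the real data):
% Hypothetical example data with the structure described above: a 1-by-n
% cell array of tables, each with a double column named 'tot'
n = 50;                     % number of tables (arbitrary)
h = 1000;                   % number of rows per table (arbitrary)
cellarray = cell(1, n);
for k = 1:n
    cellarray{k} = table(rand(h, 1), 'VariableNames', {'tot'});
end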

Accepted Answer

Guillaume on 27 Aug 2019
No, it's not possible to vectorise the loop as long as you're using tables. Even if you were using matrices in the cell array (which would require the table data to be homogeneous), I'm not sure you'd gain much over the loop.
Your first example uses sum instead of cumsum; if it's just the sum you're after, this is probably slightly faster:
tot = cellarray{1}.tot;
for idx = 2:numel(cellarray)
    tot = tot + cellarray{idx}.tot;   % accumulate the 'tot' columns in place
end
If you're after cumsum, it may be more efficient to allocate the final matrix directly rather than going through a temporary cell array:
tot = zeros(height(cellarray{1}), numel(cellarray));
for col = 1:numel(cellarray)
    tot(:, col) = cellarray{col}.tot;   % one column per table
end
tot = cumsum(tot, 2);
Your second example doesn't make sense: you're cumsum'ing a single column at each step of the loop, which doesn't do much, and you're overwriting tot_s at every iteration.
Also, I believe that dot indexing as I have used above is slightly faster than {} indexing for tables.
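As a rough, illustrative check (timings vary by release and data size), you could compare the two indexing styles with timeit on a throwaway table:
% Rough, illustrative timing comparison of dot vs. brace indexing
t = table(rand(1e5, 1), 'VariableNames', {'tot'});   % throwaway table
timeDot   = timeit(@() t.tot);         % dot indexing
timeBrace = timeit(@() t{:, 'tot'});   % brace indexing
fprintf('dot: %.3g s, brace: %.3g s\n', timeDot, timeBrace);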
Finally, for preallocation:
tot = cell(size(cellarray));
would be clearer than the syntax you use.
  1 Comment
Andrea Pinto on 3 Sep 2019
I accepted your answer because most of your suggestions, adapted to my solution, worked pretty well, even if none of them individually was able to speed up my code. Nevertheless, you were very kind and gave me interesting hints that were quite useful.
Best
AP


More Answers (1)

Steven Lord on 27 Aug 2019
You say you're using a cell array, so I assume the arrays inside different cells are different sizes, shapes, or types. That would mean they cannot be concatenated together into an array. If they can be concatenated into an array, do so and call cumsum on that array, specifying the dimension over which you want to sum.
So what sizes, shapes, and/or types are the data arrays inside the cells in your cell array?
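Assuming the 'tot' columns are indeed all the same height and type, a minimal sketch of that concatenate-then-cumsum approach (variable names are just for illustration) would be:
% Sketch: extract each table's 'tot' column, concatenate, then cumsum
cols = cellfun(@(t) t.tot, cellarray, 'UniformOutput', false);
M = [cols{:}];              % h-by-n matrix, one column per table
tot_cum = cumsum(M, 2);     % cumulative sum across the tables (dimension 2)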
  1 Comment
Andrea Pinto on 27 Aug 2019
Dear Steven, the tables in the cell array all have the same dimensions; therefore the vectors extracted from the tables in the cell array all have the same dimensions too. The reason why tables of the same dimensions are stored in different cells would be too complicated to explain here.
% n is the number of cells in the cell array
% x = table{:,'tot'} : the extracted vectors all have the same size (h,1) and type (double)
What I am looking for is a way to vectorize, if possible, the "row cumsum" of the x vectors, or at least to make it more efficient than what I have written above. I would really appreciate it.
Best
AP

