## For loop: count the number of times a logical matrix is true and store the tally

### Catherine F

on 13 Sep 2017
Latest activity: commented on by Rik Wisselink on 15 Sep 2017

I have a for loop over a certain number of days, say 20. I created a logical matrix marking points with a salinity value less than or equal to 33. Each salinity point corresponds to a specific lat/lon, so the position where the value occurs is important. I want to loop through each day and tally, for each point, the number of days on which its value is less than or equal to 33.

```
for j = 1:20
    L = (salinity <= 33);   % logical matrix for day j
    % ... tally step goes here ...
end
```

Example: let's say I have three 2x3 matrices

```
0 1 1
0 1 0
```

and

```
0 1 0
0 1 1
```

and

```
1 1 1
0 1 0
```

I want one matrix of the same size at the end of the loop that looks like this:

```
1 3 2
0 3 1
```

I can't get accumarray to work because I can't keep the tally of values in the same position in the matrix. If someone could show me how to do that, I think it would work. I'm also concerned about memory. Although I used 20 days as an example, I actually have decades of data to run, so I don't want each logical matrix saved in my workspace.

### Stephen Cobeldick

on 13 Sep 2017

```
>> mat(:,:,1) = [0,1,1; 0,1,0];
>> mat(:,:,2) = [0,1,0; 0,1,1];
>> mat(:,:,3) = [1,1,1; 0,1,0];
>> sum(mat,3)
ans =
     1     3     2
     0     3     1
```

???

on 14 Sep 2017

@Stephen, I had the same first idea, but from the question I gathered that memory is too much of an issue to hold all data (even when you convert double to logical).

on 13 Sep 2017

If memory allows, you could stack all the days into one array and use sum, specifying the dimension; assuming that is not possible, you could write the loop as:

```
total_count = zeros(2,3);
for j = 1:20
    % code that loads the salinity map for index j
    L = (salinity <= 33);
    total_count(L) = total_count(L) + 1;
end
```

on 14 Sep 2017

Catherine, I don't think you need a loop at all. If you have some 3D salinity matrix sal, whose dimensions correspond to lat,lon,time or lon,lat,time, you can just use sum, similar to Stephen's suggestion, but here we'll do it all in one go.

Below I create sal as some sample data so you can run the example: it gives sal a mean value of 35 psu, plus some random noise. Then we count up all the sal values less than or equal to 33:

```
% 5 years of 1-degree data centered on 35 psu:
sal = 35 + randn(360,180,365*5);

% Sum of days with low salinity:
Lessthan33 = sum(sal<=33,3);

% Plot it up nice:
imagesc(-179.5:179.5,-89.5:89.5,Lessthan33)
axis xy
xlabel 'longitude'
ylabel 'latitude'
cb = colorbar;
ylabel(cb,'number of days when salinity is less than 33 psu')
```

### Catherine F

on 14 Sep 2017

Chad - thank you! This is a great & simple way to accomplish what I wanted. I'll probably keep the for loop just so that I don't have to manually extract all the netcdf files before I sum them.
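For reference, keeping a loop for file-by-file import while accumulating the tally could look something like the minimal sketch below. The file naming scheme (salinity_day_001.nc, ...) and the NetCDF variable name 'salinity' are assumptions, not details from this thread; adjust both to match the actual files.

```matlab
% Sketch: accumulate a low-salinity day count one file at a time.
% File names and the variable name 'salinity' are assumptions.
nDays = 20;                 % e.g. decades' worth of days in practice
total_count = [];           % allocated after the first read
for j = 1:nDays
    fname = sprintf('salinity_day_%03d.nc', j);   % hypothetical naming
    salinity = ncread(fname, 'salinity');         % read one day's map
    if isempty(total_count)
        total_count = zeros(size(salinity));      % match the grid size
    end
    total_count = total_count + (salinity <= 33); % tally low-salinity days
end
```

Only one day's map is held in memory at a time, so this scales to decades of data without keeping every logical matrix in the workspace.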

on 14 Sep 2017

Hi Catherine, glad you found something that works.

This is a matter of personal preference, but it can be good to keep importing data separate from analyzing data. There are a few reasons for this.

First, if you're trying to accomplish several different tasks in a single loop (like reading in data and computing a sum of values that meet certain criteria), then when something goes wrong you have to figure out which of the operations contains the bug. Debugging is typically easier when there are fewer moving parts. And if you ever want to share your code with a colleague, or revisit it a year from now, it will be much clearer and easier to understand if there's a section of code labeled "Import Data" followed by a separate section labeled "Calculate sum of salinity values below critical threshold."

Second, if you analyze the data as it's being imported, you lose the ability to go back and try changing the threshold to, say, 33.1 psu without starting from scratch and reading all the data again.

So as a general rule of thumb, I try to separate importing data from analyzing data.

on 15 Sep 2017

I totally agree with you, but sometimes loading everything in one go is not possible. I'm currently working on a project with over 300 scans of 512x512x100 voxels that need at least uint16. If this project is similar, it might not be possible to load everything at once (my data would take up a 15 GB chunk of contiguous RAM).

Bottom line: if at all possible, do as little processing as you can during loading and do the processing separately. A further tip: use a separate function for loading data, so you can change it if need be (and find a possible bug more easily).
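As an illustration of that last tip, here is a minimal sketch of a separate loader function. The file naming scheme and the variable name are assumptions for the example, not details from this thread:

```matlab
function sal = load_salinity_day(j)
% LOAD_SALINITY_DAY  Hypothetical loader for one day's salinity map.
% Isolating the file I/O here means the analysis code never has to
% change when the file format or naming scheme does, and any import
% bug is confined to this one function.
fname = sprintf('salinity_day_%03d.nc', j);  % assumed naming scheme
sal   = ncread(fname, 'salinity');           % assumed variable name
end
```

The analysis loop then just calls sal = load_salinity_day(j) and never touches file names itself.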