Creating Structure Names Dynamically
Hello,
For those willing enough to help, please read my ENTIRE post to understand my "unique" problem before posting an answer!
I would like to dynamically create structure NAMES (not fields). Please continue reading! I've already read up and done research on this topic, and it seems that doing this is not good programming practice. However, the problem is that I'm dealing with large numbers of files and large amounts of data that need to be split up and processed. Here's my question:
We have the following integer index variables:
N = 1:infinity (theoretically of course, realistically maybe 1:30)
X = 1:10
Y = 1:14
I need to individually create the following structures, either manually or dynamically in a loop (hopefully!!!):
BiasStats_Dataset1.BandX.FPMY.Value = …
BiasStats_Dataset2.BandX.FPMY.Value = …
BiasStats_Dataset3.BandX.FPMY.Value = …
BiasStats_DatasetN.BandX.FPMY.Value = …
.
.
CStats_Dataset1.BandX.FPMY.Value = …
CStats_Dataset2.BandX.FPMY.Value = …
CStats_Dataset3.BandX.FPMY.Value = …
CStats_DatasetN.BandX.FPMY.Value = …
.
.
… and so on.
So as you can see, I will be creating many individual structures to decrease file sizes. The way I’m currently doing it is:
BiasStats.Dataset1.BandX.FPMY.Value = …
BiasStats.Dataset2.BandX.FPMY.Value = …
BiasStats.DatasetN.BandX.FPMY.Value = …
.
.
CStats.Dataset1.BandX.FPMY.Value = …
CStats.Dataset2.BandX.FPMY.Value = …
CStats.DatasetN.BandX.FPMY.Value = …
.
.
… and so on.
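In loop form, the way I currently build these nested structures looks roughly like the sketch below (computeBias here is just a placeholder for my real processing, and the loop bounds match the ranges above):

% Sketch of the current approach, using dynamic field names in loops
numDatasets = 3;
for n = 1:numDatasets
    for x = 1:10
        for y = 1:14
            dsField   = sprintf('Dataset%d', n);
            bandField = sprintf('Band%d', x);
            fpmField  = sprintf('FPM%d', y);
            BiasStats.(dsField).(bandField).(fpmField).Value = computeBias(n, x, y);
        end
    end
end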
Doing it my way now, I will end up with a total of 6 individual structures (BiasStats and CStats are two of them). This way works very well from an organizational standpoint, but each structure (BiasStats, for example) takes up over 2 GB of memory by the time the processing is finished, and I'm only working with 3 datasets at the moment (N = 3). This is where my problem lies. My code runs fine on our server because it has 24 GB of memory, but I suspect that if I increase the number of datasets (N) to maybe 20 or 30, even our server might run out of memory, because the structures keep growing with each new dataset.
I will continue researching and experimenting with the 'eval' function, as that seems to be one possible way to go about this. I think it will be necessary in the future to create many individual structures, one per dataset, and save/load them from the hard drive rather than keeping them all in memory. Either that, or use an ENTIRELY different approach to this problem. Either way, data organization and dynamic access to the data (with loops) are absolutely key. I'm working with far too much data to create and access structures by hand.
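To make the save/load idea concrete, something along these lines is what I have in mind (just a sketch; processDataset stands in for my real processing, and the exact file names are placeholders):

numDatasets = 3;                                 % would grow to 20 or 30 later
for n = 1:numDatasets
    BiasStats = processDataset(n);               % placeholder: build one dataset's structure
    save(sprintf('BiasStats_Dataset%d.mat', n), 'BiasStats');   % one MAT-file per dataset
    clear BiasStats                              % free the memory before the next dataset
end

% Later, load only the dataset that is actually needed:
tmp   = load(sprintf('BiasStats_Dataset%d.mat', 2));
stats = tmp.BiasStats;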
Thanks to anyone willing to take the time to help me on this, much appreciated!!!
Answers (3)
Walter Roberson
on 27 Jun 2011
BiasStats.Dataset(N).Band(X).FPM(Y).Value
But beyond that, to give you suggestions on how the data might be used effectively without filling memory, you will need to give us an idea of how you are going to use the data. Random access, or along a particular dimension? Will the access be read-only later, or will it be read/write?
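As a quick sketch (the loop bounds and computeBias below are just placeholders), that struct-array form can be filled and read with ordinary numeric indexing, no eval needed:

for n = 1:3
    for x = 1:10
        for y = 1:14
            BiasStats.Dataset(n).Band(x).FPM(y).Value = computeBias(n, x, y);
        end
    end
end

v = BiasStats.Dataset(2).Band(5).FPM(7).Value;   % plain numeric indexing in loops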
Brad
on 27 Jun 2011
David Young
on 27 Jun 2011
You'd be a great deal better off using cell arrays. You're right that this is clumsy, and anything you can do like this you can do more efficiently with an array.
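For instance, here is a rough sketch using a cell array sized to the ranges given in the question (computeBias is a placeholder for the actual computation):

numDatasets = 3;
BiasStats = cell(numDatasets, 10, 14);   % indexed as {dataset, band, FPM}
for n = 1:numDatasets
    for x = 1:10
        for y = 1:14
            BiasStats{n, x, y} = computeBias(n, x, y);
        end
    end
end

v = BiasStats{2, 5, 7};                  % dataset 2, band 5, FPM 7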