I have a large amount of data scattered in memory that I want to write out as a single large dataset in HDF5. The data is organized as follows:
64000 structs of type Lambert;
(The 64000 is actually flexible, but I hard-coded it for this example, along with the 2D sizes in the Lambert struct.)
I figure I need to create a "chunked" dataset in the file space so that all the space is allocated in the file up front, and then with each write the "north" and "south" arrays from one struct are written (via a hyperslab selection) to the correct location in the file.
Is there an example that does something like this? Or am I thinking about this correctly?
Thanks for any help or pointers.
Mike Jackson BlueQuartz Software