I am working on an application that writes a rank-1 dataset of doubles to a file. The dataset can contain an arbitrary number of values, and that number can vary each time it is written.
I want to keep the chunk cache size at most 1 MB (1024 × 1024 bytes). Since a double is 8 bytes, that means a chunk can hold up to 131,072 values ((1024 × 1024) / 8), unless there is some overhead I'm not taking into account.
Given that, if the dataset actually contains more than 131,072 values but fewer than 262,144 values (i.e., not quite a full second chunk), how are the values that exceed the first chunk handled? Do they go into a second, partial chunk? Is that partial chunk compressed?
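For concreteness, here is a minimal sketch of how I'm creating the dataset using the HDF5 C API (the file name, dataset name, deflate level, and cache parameters are just placeholders for my real setup):

```c
#include "hdf5.h"

#define CHUNK_NVALUES 131072   /* (1024 * 1024) / sizeof(double) */

int main(void)
{
    /* e.g. more than one full chunk, but less than two */
    hsize_t nvalues    = 200000;
    hsize_t dims[1]    = { nvalues };
    hsize_t maxdims[1] = { H5S_UNLIMITED };  /* count varies per write */
    hsize_t chunk[1]   = { CHUNK_NVALUES };

    hid_t file  = H5Fcreate("data.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(1, dims, maxdims);

    /* Chunked layout with gzip (deflate) compression */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, chunk);
    H5Pset_deflate(dcpl, 6);

    /* Cap the chunk cache for this dataset at 1 MiB */
    hid_t dapl = H5Pcreate(H5P_DATASET_ACCESS);
    H5Pset_chunk_cache(dapl, 521, 1024 * 1024, 0.75);

    hid_t dset = H5Dcreate2(file, "values", H5T_NATIVE_DOUBLE, space,
                            H5P_DEFAULT, dcpl, dapl);

    /* ... fill a buffer of nvalues doubles and H5Dwrite it ... */

    H5Dclose(dset);
    H5Pclose(dapl);
    H5Pclose(dcpl);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}
```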
Thanks in advance for the input!