Dear all,
I’m using HDF5 1.10.6 from Fortran. I would like to write time-series data into an extendable, chunked, and compressed dataset. I set this up via the dataset creation property list propID:
call H5Pcreate_f(H5P_DATASET_CREATE_F, propID, info)
call H5Pset_chunk_f(propID, spaceRank, chunkDims, info)
call H5Pset_deflate_f(propID, 9, info)
...
call H5Dcreate_f(..., dcpl_id=propID)
call H5Dwrite_f(...)
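For completeness, here is a condensed, serial sketch of my creation step. Identifiers such as fileID, the file and dataset names, and the chunk size are illustrative placeholders rather than my production values, and error checking is omitted:

program create_series
  use hdf5
  implicit none
  integer(hid_t)   :: fileID, spaceID, dsetID, propID
  integer(hsize_t) :: dims(1), maxDims(1), chunkDims(1)
  integer          :: info

  call h5open_f(info)
  dims      = (/ 0_hsize_t /)        ! start empty; grows on every append
  maxDims   = (/ H5S_UNLIMITED_F /)  ! unlimited along the time axis
  chunkDims = (/ 1024_hsize_t /)     ! placeholder chunk size

  call H5Fcreate_f("series.h5", H5F_ACC_TRUNC_F, fileID, info)

  ! An unlimited dataspace is required for a dataset that grows over time
  call H5Screate_simple_f(1, dims, spaceID, info, maxDims)

  ! Chunked layout is mandatory for both extendability and compression
  call H5Pcreate_f(H5P_DATASET_CREATE_F, propID, info)
  call H5Pset_chunk_f(propID, 1, chunkDims, info)
  call H5Pset_deflate_f(propID, 9, info)

  ! The deflate filter lives in the dataset creation property list,
  ! so it is attached to the dataset itself at creation time
  call H5Dcreate_f(fileID, "series", H5T_NATIVE_DOUBLE, spaceID, &
                   dsetID, info, dcpl_id=propID)

  call H5Pclose_f(propID, info)
  call H5Sclose_f(spaceID, info)
  call H5Dclose_f(dsetID, info)
  call H5Fclose_f(fileID, info)
  call h5close_f(info)
end program create_series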
I apply these calls only once, when I create the dataset. Once the dataset is in place, I append data to it with the following calls:
call H5Pcreate_f(H5P_DATASET_XFER_F, propID, info)
call H5Pset_dxpl_mpio_f(propID, H5FD_MPIO_COLLECTIVE_F, info)
call H5Dwrite_f(..., xfer_prp = propID)
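To make the append step concrete, here is roughly what it looks like in my code (the subroutine name append_series and its arguments are illustrative placeholders; in the real code each MPI rank selects its own hyperslab, and error checking is omitted):

subroutine append_series(dsetID, buf, nOld, nNew)
  use hdf5
  use iso_fortran_env, only: real64
  implicit none
  integer(hid_t),   intent(in) :: dsetID
  real(real64),     intent(in) :: buf(:)
  integer(hsize_t), intent(in) :: nOld, nNew
  integer(hid_t)   :: fileSpace, memSpace, propID
  integer(hsize_t) :: newDims(1), start(1), count(1)
  integer          :: info

  ! 1. Grow the dataset along its unlimited axis
  newDims = (/ nOld + nNew /)
  call H5Dset_extent_f(dsetID, newDims, info)

  ! 2. Select the newly added region in the file
  call H5Dget_space_f(dsetID, fileSpace, info)
  start = (/ nOld /)
  count = (/ nNew /)
  call H5Sselect_hyperslab_f(fileSpace, H5S_SELECT_SET_F, start, count, info)

  ! 3. Memory dataspace matching the buffer
  call H5Screate_simple_f(1, count, memSpace, info)

  ! 4. Collective transfer property list, as shown above
  call H5Pcreate_f(H5P_DATASET_XFER_F, propID, info)
  call H5Pset_dxpl_mpio_f(propID, H5FD_MPIO_COLLECTIVE_F, info)

  ! Note: no filter calls here -- the deflate filter attached at
  ! creation should (if my assumption holds) compress these chunks too
  call H5Dwrite_f(dsetID, H5T_NATIVE_DOUBLE, buf, count, info, &
                  mem_space_id=memSpace, file_space_id=fileSpace, &
                  xfer_prp=propID)

  call H5Pclose_f(propID, info)
  call H5Sclose_f(memSpace, info)
  call H5Sclose_f(fileSpace, info)
end subroutine append_series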
Here, while appending the data, I don’t apply the compression filter, under the assumption that it is already in place after the initial dataset creation calls. Would you please confirm that my assumption is correct? If not, would you please point me to how the compression filter should be applied on subsequent writes (when appending the data)? My initial experiments show little difference in HDF5 file sizes, so I suspect my assumption may be wrong. Thank you and have a good day ahead!
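In case it helps with the diagnosis: my understanding is that the filter pipeline can be inspected through the dataset’s creation property list, e.g. (fragment, names as above):

! Check how many filters are attached to the open dataset
integer(hid_t) :: dcplID
integer        :: nFilters, info
call H5Dget_create_plist_f(dsetID, dcplID, info)
call H5Pget_nfilters_f(dcplID, nFilters, info)
print *, 'filters attached:', nFilters   ! I would expect 1 (deflate)
call H5Pclose_f(dcplID, info)

Alternatively, from the command line, h5dump -p -H series.h5 should list the DEFLATE filter under the dataset’s storage properties.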
--
Best wishes,
Maxim