This is a strange issue as I have been able to do this correctly in the past.
I can create a dataset with GZIP compression enabled, i.e.:
const std::vector<hsize_t> chunkSize = { 1, (hsize_t)nY, (hsize_t)nX };
// ...
dcpl_id = H5Pcreate(H5P_DATASET_CREATE);
status = H5Pset_deflate(dcpl_id, compressLevel); // This is 0-9, default 4
status = H5Pset_chunk(dcpl_id, 3, chunkSize.data());
// ...
dapl_id = H5Pcreate(H5P_DATASET_ACCESS);
hid_t dset_id = H5Dcreate(file_id, datasetPath.c_str(), H5T_NATIVE_FLOAT, dspace_id, lcpl_id, dcpl_id, dapl_id);
// Close handles. Will need to return later to actually write the data
I then close these handles. Later, I come back, open the dataset, and write data to it:
const std::vector<hsize_t> chunk_offset = { (hsize_t)frame_idx, (hsize_t)0, (hsize_t)0 };
// Read the data from this frame
std::vector<float> data;
data.reserve(nX * nY);
for (int j = 0; j < nY * gridSize; j += gridSize) {
    float* rowPtr = (float*)GetBufferRowPtr(in_buffer, j);
    for (int i = 0; i < nX * gridSize; i += gridSize) {
        data.push_back(rowPtr[i]);
    }
}
herr_t status;
hid_t file_id, dset_id;
// Get the file
file_id = KFH5_GetHDF5File(archivePath, false); // Opens the specified file, optionally creating it if it doesn't exist
dset_id = H5Dopen(file_id, datasetPath.c_str(), H5P_DEFAULT);
status = H5Dwrite_chunk(dset_id, H5P_DEFAULT, 0, chunk_offset.data(), nX * nY * sizeof(float), data.data());
This works fine if I comment out the line enabling GZIP compression. I can open the resulting file in HDFView and view everything as expected.
However, with the deflate line uncommented, HDFView shows that the dataset is GZIP compressed, but I hit two issues. First, the dataset shows a compression ratio of 1 regardless of the compression level selected (HDFView does correctly report the level I chose). Second, HDFView throws a “Filter not available” exception when trying to view the data. These are some screenshots showing the issue.
This is under Windows 10, and the application creating the file is a DLL built with Visual Studio and run through a third party application.
Am I missing a requirement for GZIP compression? I was under the impression the HDF5 library ships with all the pieces needed for GZIP compression, so it should work out of the box. I’m guessing I’ve done something silly here, but I’m at a loss as to what.