H5Z_FLAG_OPTIONAL flag is ignored by the Deflate filter

Hello,

We have some float data (exactly 27 32-bit floats) that we write to a dataset, using the Shuffle and Deflate filters. With this specific set of data, Deflate fails to compress, i.e., the compressed data is slightly larger than the original (110 bytes instead of 106).

We use the H5Z_FLAG_OPTIONAL flag for the Deflate filter, expecting that the failing filter would be skipped and the original data written to the output file, as stated on the H5Pset_filter documentation page: “if the filter result would be larger than the input, then the compression filter returns failure and the uncompressed data is stored in the file”.
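As a sanity check, the flag can be read back from the dataset creation property list with H5Pget_filter2. A minimal sketch (the helper name is ours; filter index 1 assumes Shuffle was added first, as in the code below):

#include <stdio.h>
#include "hdf5.h"

/* Sketch: report whether the Deflate filter on an open dataset is
   marked optional. Filter index 1 assumes Shuffle was added first. */
static void check_deflate_flag(hid_t dataset)
{
	hid_t dcpl = H5Dget_create_plist(dataset);
	unsigned int flags = 0;
	size_t cd_nelmts = 1;
	unsigned int cd_values[1] = { 0 };
	H5Z_filter_t id = H5Pget_filter2(dcpl, 1, &flags, &cd_nelmts,
		cd_values, 0, NULL, NULL);
	if (id == H5Z_FILTER_DEFLATE)
		printf("Deflate level %u, optional: %s\n", cd_values[0],
			(flags & H5Z_FLAG_OPTIONAL) ? "yes" : "no");
	H5Pclose(dcpl);
}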

But the output file, as seen in HDFView, appears to contain the compressed data, and the filter seems to have been applied.
We also get crashes on read when additional filters are in the pipeline, as if the data were corrupted.
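One way to tell whether the chunk was stored filtered or raw is to compare the allocated storage with the in-memory size of the buffer. A minimal sketch (helper name is ours):

#include <stdio.h>
#include "hdf5.h"

/* Sketch: if the optional Deflate had been skipped, the stored chunk
   should be exactly the raw size (NX*NY*sizeof(float)); a larger value
   means the filter output was kept despite being bigger. */
static void print_storage_size(hid_t dataset, size_t raw_bytes)
{
	hsize_t stored = H5Dget_storage_size(dataset);
	printf("stored: %llu bytes, raw: %zu bytes\n",
		(unsigned long long)stored, raw_bytes);
}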

I think we are in a special case where the compressed data is only a few bytes larger than the original, which is why the Deflate filter does not report failure.
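To test that hypothesis outside HDF5, the shuffle-then-deflate step can be reproduced with zlib directly. This is only a sketch: it assumes the Shuffle filter is a plain byte transpose and that the Deflate filter uses zlib's compress2 at the given level (assumptions on our part, not verified against the HDF5 sources):

#include <stdio.h>
#include <zlib.h>

#define NVALS 27	/* number of 32-bit floats, as in the program below */

/* Sketch: byte-shuffle NVALS 4-byte elements (all first bytes, then all
   second bytes, ...), deflate at level 5, and print the output size so
   it can be compared with the input size. Call it with the raw bytes
   of buf, e.g. measure_deflate((const unsigned char *)buf). */
static void measure_deflate(const unsigned char *src)
{
	unsigned char shuffled[NVALS * 4];
	unsigned char out[256];		/* 256 exceeds compressBound(NVALS * 4) */
	uLongf out_len = sizeof(out);
	for (int i = 0; i < NVALS; i++)
		for (int b = 0; b < 4; b++)
			shuffled[(size_t)b * NVALS + i] = src[(size_t)i * 4 + b];
	if (compress2(out, &out_len, shuffled, NVALS * 4, 5) == Z_OK)
		printf("deflate output: %lu bytes (input: %d)\n",
			(unsigned long)out_len, NVALS * 4);
}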

Could someone tell us whether this is normal, or whether we are seeing unexpected behavior here?

The following C code reproduces the issue:

#include "hdf5.h" 
#define NX 9
#define NY 3

int main(void)
{
	hid_t file, data_space, dataset32, properties;
	float buf[NX*NY];
	hsize_t dims[2], chunk_size[2];
	unsigned int deflate_params[1];

	/*
	*  Data values which make Deflate fail to compress (the compressed data is larger than the original)
	*  These are 27 32-bit float values, stored as raw bit patterns,
	*  so that the exact float values are preserved
	*/
	uint32_t data[NX*NY] = {
		0XB9EA13A0, // (0,0)
		0X00000000, // (0,1)
		0X3AC0108E, // (0,2)
		0XBA12EFB0, // (1,0)
		0X00000000, // (1,1)
		0X3AC0108E, // (1,2)
		0XB9FF8D5A, // (2,0)
		0X00000000, // (2,1)
		0X3AC0108E, // (2,2)
		0XBA1B7A8A, // (3,0)
		0X00000000, // (3,1)
		0X3AC0108E, // (3,2)
		0XB8B175E1, // (4,0)
		0XBA19A6D8, // (4,1)
		0X3A1C414F, // (4,2)
		0XB8DB90E1, // (5,0)
		0XBA19A6D8, // (5,1)
		0X3A4EEDE0, // (5,2)
		0XB9AEFC13, // (6,0)
		0XBA19A6D8, // (6,1)
		0X3AAF8F22, // (6,2)
		0XB93AF6A2, // (7,0)
		0XBA19A6D8, // (7,1)
		0X3A8A7D3F, // (7,2)
		0X372F2840, // (8,0)
		0XBA19A6D8, // (8,1)
		0X3A12DA59  // (8,2)
	};
	/* Reinterpret the raw bit patterns as floats */
	for (int i = 0; i < NX*NY; i++)
	{
		memcpy(&buf[i], &data[i], sizeof(float));
	}

	/* Describe the size of the array. */
	dims[0] = NX;
	dims[1] = NY;
	data_space = H5Screate_simple(2, dims, NULL);

	/*
	* Create a new file using read/write access, default file creation & access properties.
	*/
	file = H5Fcreate("C:\\Tmp\\test.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

	/*
	* Set the chunk creation property list
	*/
	chunk_size[0] = NX;
	chunk_size[1] = NY;
	properties = H5Pcreate(H5P_DATASET_CREATE);
	H5Pset_chunk(properties, 2, chunk_size);

	/*
	*  Use the Shuffle filter, needed to trigger the compression failure
	*/
	H5Pset_shuffle(properties);

	/*
	*  Use the Deflate filter that fails, with compression level set to 5
	*/
	deflate_params[0] = 5;
	H5Pset_filter(properties, H5Z_FILTER_DEFLATE, H5Z_FLAG_OPTIONAL, 1, deflate_params);

	/*
	* Create a new dataset within the file.
	*/
	dataset32 = H5Dcreate(file, "datasetF32", H5T_NATIVE_FLOAT, data_space, H5P_DEFAULT, properties, H5P_DEFAULT);

	/*
	* Write the array to the file.
	*/
	H5Dwrite(dataset32, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL,
		H5P_DEFAULT, buf);

	H5Dclose(dataset32);
	H5Sclose(data_space);
	H5Pclose(properties);
	H5Fclose(file);

	return 0;
}
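For reference, a read-back check along these lines (purely illustrative, not part of the reproducer above; it also needs <stdio.h>) could be appended at the end of main, after the existing H5Fclose, to see whether the values survive a round trip:

	/* Sketch: reopen the file and compare the dataset against buf */
	float check[NX*NY];
	hid_t f = H5Fopen("C:\\Tmp\\test.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
	hid_t d = H5Dopen(f, "datasetF32", H5P_DEFAULT);
	if (H5Dread(d, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, check) < 0)
		printf("read failed\n");
	else if (memcmp(check, buf, sizeof(check)) != 0)
		printf("data mismatch after round trip\n");
	else
		printf("round trip OK\n");
	H5Dclose(d);
	H5Fclose(f);

Alternatively, h5dump -p -H on the output file shows the recorded filter pipeline and the chunk storage size.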

Regards,

Simon Marchetto