Hello,
We are currently working on a library built on top of HDF5. Through this library one can use the parallel version of HDF5, compression filters such as Szip, and a data transform function. When these three are combined, the parallel HDF5 library reports the following error during a write:
```
Error: #005: H5Dio.c line 1170 in H5D__ioinfo_adjust(): Can't perform independent write with filters in pipeline.
    The following caused a break from collective I/O:
        Local causes: data transforms needed to be applied
        Global causes: data transforms needed to be applied
  major: Low-level I/O
  minor: Can't perform independent IO
Error: #004: H5Dio.c line 757 in H5D__write(): unable to adjust I/O info for parallel I/O
  major: Dataset
  minor: Unable to initialize object
Error: #003: H5VLnative_dataset.c line 207 in H5VL__native_dataset_write(): can't write data
  major: Dataset
  minor: Write failed
Error: #002: H5VLcallback.c line 2080 in H5VL__dataset_write(): dataset write failed
  major: Virtual Object Layer
  minor: Write failed
Error: #001: H5VLcallback.c line 2113 in H5VL_dataset_write(): dataset write failed
  major: Virtual Object Layer
  minor: Write failed
Error: #000: H5Dio.c line 291 in H5Dwrite(): can't write data
  major: Dataset
  minor: Write failed
```
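
For reference, here is a minimal sketch of the kind of setup that triggers the error. The file name, dataset shape, Szip parameters, and transform expression are illustrative placeholders, not our actual library code:

```c
#include <hdf5.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Open the file with the MPI-IO driver (parallel HDF5). */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fcreate("example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* Chunked dataset with the Szip filter in the pipeline. */
    hsize_t dims[1] = {1024}, chunk[1] = {256};
    hid_t space = H5Screate_simple(1, dims, NULL);
    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, chunk);
    H5Pset_szip(dcpl, H5_SZIP_NN_OPTION_MASK, 8);
    hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_DOUBLE, space,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    /* Request collective transfer, plus a data transform on the write. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
    H5Pset_data_transform(dxpl, "2*x"); /* illustrative expression */

    double buf[1024] = {0};
    /* This is the call that fails with "Can't perform independent
     * write with filters in pipeline". */
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, dxpl, buf);

    H5Pclose(dxpl); H5Dclose(dset); H5Pclose(dcpl);
    H5Sclose(space); H5Fclose(file); H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
```
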
From this error, I understand that this combination is unfortunately not supported. Is there a specific reason for this limitation? If I switch from the parallel to the serial version of HDF5, everything works.
Furthermore, I also tried pHDF5 with only a compression filter (no data transform). This combination works only if the write is collective. That is fine for me, but it might be worth mentioning in the documentation; at least the H5P_SET_SZIP documentation does not mention it.
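
For completeness, this is the transfer setup that does work with a filter alone, reusing the handles from the sketch above but without the H5Pset_data_transform call:

```c
/* Filter-only parallel write: works, but only with collective transfer.
 * dset and buf are the dataset handle and buffer from the sketch above. */
hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE); /* with the default independent transfer, the write fails */
H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, dxpl, buf);
H5Pclose(dxpl);
```
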
Best regards,
Jan-Willem