Parallel Filters tests fail while running make check

Hi,

I have to set up parallel HDF5 for use with h5py. I found that the default libraries available through Ubuntu's apt-get do not support parallel HDF5, so I compiled and installed mpich, checked the location of the installed mpicc, and then configured with CC=/usr/local/bin/mpicc ./…/configure --enable-shared --enable-parallel, followed by make && make check.
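The full sequence, run from a build subdirectory inside the source tree (matching the relative paths in the log below), was roughly:

   mkdir build && cd build
   CC=/usr/local/bin/mpicc ../configure --enable-shared --enable-parallel
   make && make check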

The OS I am using is Ubuntu 18.04 with mpich 3.3.

Version that I'm trying to install: HDF5 v1.10.4

It passes t_mpi, testphdf5, and the earlier tests, but fails the parallel filters check, as shown below.

============================
Testing  t_filters_parallel 
============================
 t_filters_parallel  Test Log
============================
==========================Parallel Filters tests
==========================

*** Hint ***
You can use environment variable HDF5_PARAPREFIX to run parallel test files in a
different directory or to add file type prefix. E.g.,
   HDF5_PARAPREFIX=pfs:/PFS/user/me
   export HDF5_PARAPREFIX
*** End of Hint ***
Testing write to one-chunk filtered dataset
HDF5-DIAG: Error detected in HDF5 (1.10.4) MPI-process 0:
  #000: .././../src/H5Dio.c line 336 in H5Dwrite(): can't write data
    major: Dataset
    minor: Write failed
  #001: .././../src/H5Dio.c line 828 in H5D__write(): can't write data
    major: Dataset
    minor: Write failed
  #002: .././../src/H5Dmpio.c line 893 in H5D__chunk_collective_write(): write error
    major: Dataspace
    minor: Write failed
  #003: .././../src/H5Dmpio.c line 811 in H5D__chunk_collective_io(): couldn't finish filtered linked chunk MPI-IO
    major: Low-level I/O
    minor: Can't get value
  #004: .././../src/H5Dmpio.c line 1317 in H5D__link_chunk_filtered_collective_io(): couldn't process chunk entry
    major: Dataset
    minor: Write failed
  #005: .././../src/H5Dmpio.c line 3099 in H5D__filtered_collective_chunk_entry_io(): couldn't unfilter chunk for modifying
    major: Dataset
    minor: Filter operation failed
  #006: .././../src/H5Z.c line 1303 in H5Z_pipeline(): required filter (name unavailable) is not registered
    major: Data filters
    minor: Read failed
  #007: .././../src/H5PLint.c line 270 in H5PL_load(): search in path table failed
    major: Plugin for dynamically loaded library
    minor: Can't get value
  #008: .././../src/H5PLpath.c line 604 in H5PL__find_plugin_in_path_table(): search in path /usr/local/hdf5/lib/plugin encountered an error
    major: Plugin for dynamically loaded library
    minor: Can't get value
  #009: .././../src/H5PLpath.c line 656 in H5PL__find_plugin_in_path(): can't open directory: /usr/local/hdf5/lib/plugin
    major: Plugin for dynamically loaded library
Proc 0: *** Parallel ERROR ***
    VRFY (Dataset write succeeded) failed at line  279 in .././../testpar/t_filters_parallel.c
aborting MPI processes
    minor: Can't open directory or file
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
Command exited with non-zero status 1
0.00user 0.00system 0:00.16elapsed 1%CPU (0avgtext+0avgdata 3672maxresident)k
0inputs+0outputs (0major+420minor)pagefaults 0swaps
Makefile:1436: recipe for target 't_filters_parallel.chkexe_' failed
make[4]: *** [t_filters_parallel.chkexe_] Error 1
make[4]: Leaving directory '/home/ubuntu/Downloads/hdf5-1.10.4/build/testpar'
Makefile:1545: recipe for target 'build-check-p' failed
make[3]: *** [build-check-p] Error 1
make[3]: Leaving directory '/home/ubuntu/Downloads/hdf5-1.10.4/build/testpar'
Makefile:1416: recipe for target 'test' failed
make[2]: *** [test] Error 2
make[2]: Leaving directory '/home/ubuntu/Downloads/hdf5-1.10.4/build/testpar'
Makefile:1217: recipe for target 'check-am' failed
make[1]: *** [check-am] Error 2
make[1]: Leaving directory '/home/ubuntu/Downloads/hdf5-1.10.4/build/testpar'
Makefile:654: recipe for target 'check-recursive' failed
make: *** [check-recursive] Error 1

I read online that parallel filters were not implemented in HDF5 v1.10.0. Is that still the case in the current version?

Can I run make install and ignore this error?

Hi @arvindsoma,

The parallel filters feature was added in HDF5 1.10.2, and improvements have been made to it since its release.

The problem you're encountering is that these tests fail if ZLIB is not available. The assumption was that ZLIB is usually present on a system and is included as a pre-defined HDF5 filter. The tests have since been amended to check whether ZLIB is actually available, but that change hasn't made it into a released version of HDF5 yet.
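If you want to verify whether ZLIB was picked up by your build, one quick check (a sketch, assuming the stock Ubuntu zlib development package and the settings summary that configure generates) is:

   # install zlib headers and libraries (standard Ubuntu package)
   sudo apt-get install zlib1g-dev
   # re-run configure, then look for the deflate filter in the generated summary
   CC=/usr/local/bin/mpicc ../configure --enable-shared --enable-parallel
   grep -i deflate src/libhdf5.settings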

Having said that, it should be fine to ignore the error, with the understanding that you won't (currently) be able to use the feature until you have filters available.
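As an aside, the trace above also shows the library searching /usr/local/hdf5/lib/plugin for dynamically loaded filter plugins. If you later install third-party filters somewhere else, you can point the library at them with the HDF5_PLUGIN_PATH environment variable (the directory below is just an example):

   export HDF5_PLUGIN_PATH=/opt/hdf5/plugins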

Hi @jhenderson,

Thanks for the response! You are right: I had installed some zlib packages earlier, but not all of the ones needed. I ended up installing libhdf5-mpich-dev, which pulled in all the necessary zlib dependencies; I then removed the libhdf5-mpich-dev package and compiled HDF5 from source again. That solved my problem!
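For anyone who hits the same issue, the sequence that worked for me was roughly this (on Ubuntu 18.04; apt-get remove leaves the package's dependencies installed):

   # pull in zlib and the other build dependencies via the packaged parallel HDF5
   sudo apt-get install libhdf5-mpich-dev
   # remove the packaged HDF5 itself; its dependencies remain
   sudo apt-get remove libhdf5-mpich-dev
   # then rebuild HDF5 1.10.4 from source as before
   CC=/usr/local/bin/mpicc ../configure --enable-shared --enable-parallel
   make && make check && sudo make install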