Dear HDF5 community,
I'm trying to compile HDF5 on a cluster. The configure script and make
run fine, but 'make check' reports 12 failures in
testcheck_version.sh; the full log is here:
http://chunk.io/f/f109dadc22ed4728a24d990028da4847
I followed the parallel install instructions here
(https://www.hdfgroup.org/ftp/HDF5/current/src/unpacked/release_docs/INSTALL_parallel)
and ran everything from an interactive Slurm session on a compute node.
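For context, the interactive session is obtained with something like
the following (exact partition/account/time flags omitted, so take
this as a rough sketch rather than my verbatim commands):

    # allocate one node with six tasks, then open a shell on it
    salloc -N 1 -n 6
    srun --pty bash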
The configure line was:

    RUNSERIAL="srun -Q -n 1" RUNPARALLEL="srun -Q -n 6" ./configure \
        --with-zlib=${ZLIB_DIR} --with-szlib=${SZIP_DIR} --with-pthread \
        --enable-unsupported --enable-shared --enable-production=yes \
        --enable-parallel=yes --enable-largefile=yes \
        --with-default-api-version=v18 --prefix=${HOME}/dev/hdf5-1.8.16
The Intel MPI compiler wrappers are used, set via export CC=mpiicc
etc. before running configure.
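For completeness, the full set of exports looks roughly like this (I
definitely set CC; the C++ and Fortran wrappers reflect my usual
setup and may not all be strictly required here):

    # Intel MPI compiler wrappers; CXX/FC are from memory and may differ
    export CC=mpiicc
    export CXX=mpiicpc
    export FC=mpiifort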
HDF5 is already installed on the system as a module, and there is an
hdf5.h in /usr/include; could that be causing the failures?
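In case it helps with diagnosing, this is roughly how I check which
hdf5.h the compiler resolves first (a quick sketch; it assumes mpiicc
forwards standard preprocessor flags like -E to the underlying
compiler):

    # print which hdf5.h is actually found first on the include path
    echo '#include <hdf5.h>' | mpiicc -E -x c - | grep 'hdf5\.h'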
Is it just testcheck_version picking up the system installation (in
which case the warnings can be ignored), or is something wrong with my
build? How can I fix it?
Thanks a lot in advance!
David