Issue building HDF5 1.10.6 on Theta

I’m using Spack to build HDF5 on the Theta supercomputer with the gcc 7.3.0 compiler. At the moment, Spack points to version 1.10.6 as the latest, but this version doesn’t build. I successfully built version 1.10.1, so the problem appeared somewhere between those two versions.

Looking at the build log, I see the following:

LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo  |                  \
        sed -e 's/-L/:/g' -e 's/ //g'`"                               \
 ./H5make_libsettings > H5lib_settings.c  ||                               \
    (test $HDF5_Make_Ignore && echo "*** Error ignored") ||          \
    (rm -f H5lib_settings.c ; exit 1)
  CC       H5lib_settings.lo
LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo  |                  \
        sed -e 's/-L/:/g' -e 's/ //g'`"                               \
 ./H5detect > H5Tinit.c  ||                               \
    (test $HDF5_Make_Ignore && echo "*** Error ignored") ||          \
    (rm -f H5Tinit.c ; exit 1)
/bin/sh: line 4: 26196 Illegal instruction     (core dumped) LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo  |                          sed -e 's/-L/:/g' -e 's/ //g'`" ./H5detect > H5Tinit.c
Makefile:1926: recipe for target 'H5Tinit.c' failed

It seems that HDF5 builds a program, H5detect, and then runs it. However, this cannot be done on Theta, because the compiler actually cross-compiles for the compute nodes: programs compiled that way cannot be (or are not guaranteed to) run on the login node. I suspect the same problem would appear on other supercomputers.
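Incidentally, the odd-looking LD_LIBRARY_PATH line in that log is harmless: it just rewrites any linker -L flags into path-list entries before running the freshly built probe (with no extra flags it reduces to a bare `echo | sed ...`). A minimal sketch, with made-up -L paths to show what the sed pipeline produces:

```shell
# Rewrite linker -L flags into LD_LIBRARY_PATH-style entries, exactly as the
# Makefile rule does. The paths here are hypothetical, just for illustration.
libpaths=$(echo "-L/opt/cray/lib -L/usr/local/lib" | sed -e 's/-L/:/g' -e 's/ //g')
echo "$libpaths"   # -> :/opt/cray/lib:/usr/local/lib
```

The real failure is not this shell plumbing but the `./H5detect` invocation that follows it.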

I found this FAQ page, which says cross-compiling is not supported. Your best bet may be to ask the Theta help desk how they (or Cray) build HDF5 there. You could also consider submitting a job to run configure and make on a compute node.
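The compute-node route could look something like the sketch below: wrap configure and make in a script and submit it with qsub, so H5detect runs on the architecture it was compiled for. All names, paths, and qsub flags here are assumptions; check the ALCF/Theta documentation for the real ones.

```shell
# Hypothetical build script to be run on a compute node, where a
# KNL-compiled H5detect can actually execute. Paths are placeholders.
cat > build_hdf5.sh <<'EOF'
#!/bin/bash
cd $HOME/hdf5-1.10.6/build
../configure --enable-parallel --enable-build-mode=production CC=cc
make -j 16 && make -j 16 install
EOF
chmod +x build_hdf5.sh
# Submit with something like: qsub -n 1 -t 60 -A <project> ./build_hdf5.sh
```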

I needed to unload Darshan.

I use autotools to build HDF5 on Theta:

module unload craype-mic-knl
module load craype-haswell
../configure --disable-hl --disable-tests --disable-tools --disable-static --enable-shared --without-pthread --enable-parallel --enable-build-mode=production --disable-fortran --disable-direct-vfd --without-zlib --without-szlib
module unload craype-haswell
module load craype-mic-knl
make -j 16
make -j 16 install

Hi Scot

I tried your commands but still got the following errors.
It looks like ‘make’ tried to run the command ‘H5make_libsettings’ under ./src:

make[2]: Entering directory '/gpfs/mira-home/wkliao/HDF5/build/src'
LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo -dynamic |                  \
        sed -e 's/-L/:/g' -e 's/ //g'`"                               \
 ./H5make_libsettings > H5lib_settings.c  ||                               \
    (test $HDF5_Make_Ignore && echo "*** Error ignored") ||          \
    (rm -f H5lib_settings.c ; exit 1)

Please verify that both the operating system and the processor support Intel® AVX512F, ADX, AVX512ER, AVX512PF and AVX512CD instructions.

Makefile:1926: recipe for target 'H5lib_settings.c' failed
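That AVX-512 message is consistent with the cross-compilation problem above: H5make_libsettings was compiled for KNL but executed on the Haswell login node, which lacks those instructions, hence the illegal-instruction trap. A quick sanity check (assuming a Linux /proc/cpuinfo, as on Theta's nodes):

```shell
# KNL compute nodes advertise avx512f in their CPU flags; Haswell login
# nodes do not -- so a KNL binary dies with SIGILL when run at login.
if grep -qw avx512f /proc/cpuinfo; then
  echo "AVX-512F available: KNL binaries can run here"
else
  echo "AVX-512F missing: KNL binaries will hit Illegal instruction"
fi
```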

Hmmm, I’m also getting the same error now.

I also tried the cmake HPC option (script below) that we added (only available in releases 1.10.6 and 1.12.0), and it now gets the same error.

In the ctest line you will find:

ctest -S HDF5config.cmake,BUILD_GENERATOR=Unix,SITE_OS_NAME=Theta,KNL=true,MPI=true,LOCAL_SKIP_TEST=true,HPC=qsub,LOCAL_BATCH_SCRIPT_ARGS="$ACCNT_ID" -C Debug -VV -O hdf5.log

The option LOCAL_SKIP_TEST=true skips testing the HDF5 build. Testing would entail running the tests via a batch job, which is where “HPC=” comes in. If you don’t set LOCAL_SKIP_TEST=true, a batch submission script is generated using the “HPC=” parameters. In the example above, the job is submitted with ‘qsub’, and it uses the allocation specified by LOCAL_BATCH_SCRIPT_ARGS=.
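For readability, the comma-separated -S argument can be assembled one parameter per line; the values below are copied from the ctest line quoted above, and only the final echo is new:

```shell
# Build the ctest -S option string piecewise (bash syntax), with one
# comment per parameter. Values match the ctest command in this thread.
opts="HDF5config.cmake"
opts+=",BUILD_GENERATOR=Unix"        # generate Unix Makefiles
opts+=",SITE_OS_NAME=Theta"          # site label used in logs
opts+=",KNL=true,MPI=true"           # cross-compile for KNL, parallel HDF5
opts+=",LOCAL_SKIP_TEST=true"        # skip testing; no batch job generated
opts+=",HPC=qsub"                    # batch system used when tests do run
opts+=",LOCAL_BATCH_SCRIPT_ARGS=$ACCNT_ID"   # allocation/project ID
echo "$opts"
# then: ctest -S "$opts" -C Debug -VV -O hdf5.log
```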

I’ll have to look into why it no longer works and get back to you.


#!/bin/bash -l

module load cmake/3.14.5

git clone

ln -s $HDF5_VER/config/cmake/scripts/CTestScript.cmake .
ln -s $HDF5_VER/config/cmake/scripts/HDF5config.cmake .
ln -s $HDF5_VER/config/cmake/scripts/HDF5options.cmake .

ctest -S HDF5config.cmake,BUILD_GENERATOR=Unix,SITE_OS_NAME=Theta,KNL=true,MPI=true,LOCAL_SKIP_TEST=true,HPC=qsub,LOCAL_BATCH_SCRIPT_ARGS="$ACCNT_ID" -C Debug -VV -O hdf5.log