Parallel HDF5 build problem: t_pmulti_dset test fails

System Information

OS: Ubuntu 22.04
Processor: x86_64 (i5-10400F)

$ mpicc --version
gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0

$ mpirun --version
mpirun (Open MPI) 4.1.2

Dropbox link to the console output containing the test log:
https://www.dropbox.com/scl/fi/ns2lrk5lttjunnwtw0tn9/TestLog-t_pmulti_dset.docx?rlkey=4m89yc8enpzzedeq8r65fxrm0&st=mngijhmp&dl=0

Summary

I'm trying to build parallel HDF5 with CMake so that I can use GitHub - hpc-io/vfd-gds to write .h5 files directly from GPU device data. Parallel HDF5 is required to run that project's tests, and I'd rather not install and run untested software.
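For context, this is the shape of the collective file create I'm ultimately after. It's only a sketch (the file name and compile line are placeholders), but it shows why the build needs the MPI-IO driver enabled:

~~~
/*
 * Sketch of a minimal parallel HDF5 smoke test (placeholder names).
 * Build with something like: mpicc smoke.c -lhdf5
 */
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* File-access property list that routes I/O through MPI-IO. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

    /* Collective file create: every rank must make this call. */
    hid_t file = H5Fcreate("smoke.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    H5Fclose(file);
    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
~~~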

Please let me know if there is anything that I missed that is necessary to sufficiently describe the problem.

That version of OpenMPI might be causing issues. Please try using a version that is 4.1.5 or newer.


Thank you for the pointer. Unfortunately, I have run into a new problem with $ make check, this time with testphdf5.

Test Log Output

https://www.dropbox.com/scl/fi/c0elx1h8etj2ke0iu16wc/TestLog-testphdf5.docx?rlkey=qum2dluaxvi1g5bayc5vayifm&st=2cfiv05b&dl=0

System Information

$ mpicc --version
gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0

$ mpirun --version
mpirun (Open MPI) 5.0.4

Summary

~~~
Test filenames are:
    ParaTest.h5
Testing  -- fapl_mpio duplicate (mpiodup) 
Proc 3: *** Parallel ERROR ***
    VRFY (new and old nkeys equal) failed at line  104 in ../../hdf5-1.14.3/testpar/t_ph5basic.c

~~~

The same *** Parallel ERROR *** appears for Proc 0 - 5.

OpenMPI was built from source. Any help is appreciated, thank you.

Hi @russellmatt66,

see HDF5 Info test fails with ompi-main · Issue #12742 · open-mpi/ompi · GitHub. It's not entirely clear yet, but it looks like some of the rules surrounding MPI Info objects may have changed with the MPI 4 standard, and that's breaking some assumptions we were making. While we figure this out on our end, I've been using OpenMPI 5.0.3, which doesn't appear to have this issue.
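Roughly, the failing check has the shape of the sketch below. This is an illustration, not the actual t_ph5basic.c code, and the Info key is a placeholder: an fapl stores an MPI_Info, the fapl is duplicated, and the key counts of the original and returned Info objects are compared.

~~~
/* Illustration only: the shape of the mpiodup key-count check. */
#include <stdio.h>
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Build an Info object with a placeholder key/value pair. */
    MPI_Info info_in;
    MPI_Info_create(&info_in);
    MPI_Info_set(info_in, "placeholder_key", "placeholder_value");

    /* Store the communicator and Info in a file-access property list. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, info_in);

    /* Duplicate the fapl and read the communicator/Info back out. */
    hid_t fapl_copy = H5Pcopy(fapl);
    MPI_Comm comm_out;
    MPI_Info info_out;
    H5Pget_fapl_mpio(fapl_copy, &comm_out, &info_out);

    /* The test asserts that the key counts match. */
    int nkeys_in, nkeys_out;
    MPI_Info_get_nkeys(info_in, &nkeys_in);
    MPI_Info_get_nkeys(info_out, &nkeys_out);
    printf("nkeys: original=%d duplicated=%d -> %s\n", nkeys_in, nkeys_out,
           nkeys_in == nkeys_out ? "equal" : "MISMATCH");

    /* Free the copies HDF5 hands back, then the originals. */
    MPI_Comm_free(&comm_out);
    MPI_Info_free(&info_out);
    MPI_Info_free(&info_in);
    H5Pclose(fapl_copy);
    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
~~~

With OpenMPI 5.0.4 the returned Info apparently comes back with a different key count, which is what the VRFY at line 104 flags.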


@jhenderson Thanks for the advice and for pointing me to this conversation! I'll try 5.0.3 as indicated.

Update: OpenMPI v5.0.3

All tests passed!