Parallel HDF5 make check errors (test cache)

System: x86_64 GNU/Linux, Ubuntu 18.04. Compiling with MPICH2 (mpicc).

export CC=mpicc
./configure --enable-parallel --enable-shared --prefix=/usr/local/hdf5
make
make check

I have checked, and there are no other HDF5 installations in the environment (there is just hdf5-helper).
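For completeness, the kind of check I mean is roughly the following (the exact commands aren't important, this is just a sketch):

# look for HDF5 packages, libraries, and wrappers already on the system
dpkg -l | grep -i hdf5        # only hdf5-helper shows up
ldconfig -p | grep libhdf5    # no system libhdf5
which h5cc h5pcc              # nothing on PATH
mpicc -show                   # confirms which backend compiler the mpicc wrapper calls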
Things I have tried (rough configure sketches for these follow the list):
- the latest version (1.10.5, which is what the log below is from), as well as 1.10.3 and 1.8.18
- gcc/g++ 7 as the backend compiler for mpicc
- no optimization (-O0)
- compiling the serial version
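The variant builds were configured roughly as follows (approximate reconstructions from memory, not copied from my shell history; in particular the MPICH_CC/MPICH_CXX variables for picking the backend compiler are my best recollection):

# parallel build without optimization
export CC=mpicc
./configure --enable-parallel --enable-shared --prefix=/usr/local/hdf5 CFLAGS="-O0"

# forcing gcc/g++ 7 behind the mpicc wrapper before configuring
export MPICH_CC=gcc-7 MPICH_CXX=g++-7

# plain serial build for comparison
export CC=gcc
./configure --enable-shared --prefix=/usr/local/hdf5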

All yield the same error below.

============================
Testing cache

cache Test Log

=========================================
Internal cache tests
express_test = 1

Testing smoke check #1 -- all clean, ins, dest, ren, 4/2 MB cache FAILED
smoke_check_1(): failure_mssg = "error in H5C_insert().".
Testing smoke check #2 -- ~1/2 dirty, ins, dest, ren, 4/2 MB cache FAILED
smoke_check_2(): failure_mssg = "error in H5C_insert().".
Testing smoke check #3 -- all clean, ins, dest, ren, 2/1 KB cache FAILED
smoke_check_3(): failure_mssg = "error in H5C_protect().".
Testing smoke check #4 -- ~1/2 dirty, ins, dest, ren, 2/1 KB cache FAILED
smoke_check_4(): failure_mssg = "error in H5C_protect().".
Testing smoke check #5 -- all clean, ins, prot, unprot, AR cache 1 PASSED
Testing smoke check #6 -- ~1/2 dirty, ins, prot, unprot, AR cache 1 PASSED
Testing smoke check #7 -- all clean, ins, prot, unprot, AR cache 2 PASSED
Testing smoke check #8 -- ~1/2 dirty, ins, prot, unprot, AR cache 2 PASSED
Testing smoke check #9 -- all clean, ins, dest, ren, 4/2 MB, corked FAILED
smoke_check_9(): failure_mssg = "error in H5C_protect().".
Testing smoke check #10 -- ~1/2 dirty, ins, dest, ren, 4/2 MB, corked FAILED
smoke_check_10(): failure_mssg = "error in H5C_insert().".
Testing write permitted check -- 1/0 MB cache FAILED
write_permitted_check(): failure_mssg = "error in H5C_protect().".
Testing H5C_insert_entry() functionality PASSED
Testing H5C_flush_cache() functionality FAILED
check_flush_cache(): failure_mssg = "flush with flags 0x0 failed in flush op test #6.".
Testing H5C_get_entry_status() functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_get_entry_status(): failure_mssg = "file_ptr NULL from setup_cache.".
Testing H5C_expunge_entry() functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_expunge_entry(): failure_mssg = "H5Fcreate() failed.".
Testing multiple read only protects on a single entry HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_multiple_read_protect: failure_mssg = "H5Fcreate() failed.".
Testing H5C_move_entry() functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_move_entry(): failure_mssg = "H5Fcreate() failed.".
Testing H5C_pin_protected_entry() functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_pin_protected_entry(): failure_mssg = "file_ptr NULL from setup_cache.".
Testing entry resize functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_resize_entry(): failure_mssg = "file_ptr NULL from setup_cache.".
Testing evictions enabled/disabled functionality HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
setup_cache: H5Fcreate() failed.
FAILED
check_evictions_enabled(): failure_mssg = "file_ptr NULL from setup_cache.".
Testing flush cache with protected entry error HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1531 in H5F_open(): unable to truncate a file which is already open
major: File accessibilty
minor: Unable to open file
Command terminated by signal 11
15.28user 62.67system 3:33.57elapsed 36%CPU (0avgtext+0avgdata 256932maxresident)k
0inputs+0outputs (0major+97924minor)pagefaults 0swaps
make[4]: *** [Makefile:3143: cache.chkexe_] Error 1
make[4]: Leaving directory ‘/home/sdtran/hdf5/hdf5-1.10.5/test’
make[3]: *** [Makefile:3129: build-check-s] Error 2
make[3]: Leaving directory ‘/home/sdtran/hdf5/hdf5-1.10.5/test’
make[2]: *** [Makefile:3123: test] Error 2
make[2]: Leaving directory ‘/home/sdtran/hdf5/hdf5-1.10.5/test’
make[1]: *** [Makefile:2906: check-am] Error 2
make[1]: Leaving directory ‘/home/sdtran/hdf5/hdf5-1.10.5/test’
make: *** [Makefile:654: check-recursive] Error 1

config log here (pastebin link)
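In case it helps with reproducing, I believe the failing test can also be run by itself from the build tree once make has finished (the location of the binary is an assumption on my part):

cd test
./cache    # run just the cache test, outside of make check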

Oddly enough, when I built HDF5 1.6.7 from source, it passed make check without errors. Unfortunately, h5py requires 1.8.4 or newer.
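For reference, once a working h5py install exists, a one-liner along these lines should show which HDF5 it was built against (h5py.version.hdf5_version is the attribute I have in mind):

python3 -c "import h5py; print(h5py.version.hdf5_version)"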

Update: 1.8.4 passes the cache test but, like the other versions I've tried (e.g., 1.8.18), it fails dt_arith on my machine:

Testing soft unsigned long -> long double conversions FAILED
elmt 126:
src = ff ff ff ff ff ff ff ff 18446744073709551615
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff ff 18446744073709551615.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 127:
src = ff ff ff ff ff ff ff fe 18446744073709551614
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff fe 18446744073709551614.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 128:
src = ff ff ff ff ff ff ff fc 18446744073709551612
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff fc 18446744073709551612.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 129:
src = ff ff ff ff ff ff ff f8 18446744073709551608
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff f8 18446744073709551608.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 130:
src = ff ff ff ff ff ff ff f0 18446744073709551600
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff f0 18446744073709551600.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 131:
src = ff ff ff ff ff ff ff e0 18446744073709551584
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff e0 18446744073709551584.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 132:
src = ff ff ff ff ff ff ff c0 18446744073709551552
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff c0 18446744073709551552.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 133:
src = ff ff ff ff ff ff ff 80 18446744073709551488
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff 80 18446744073709551488.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 134:
src = ff ff ff ff ff ff ff 00 18446744073709551360
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff ff 00 18446744073709551360.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 135:
src = ff ff ff ff ff ff fe 00 18446744073709551104
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff fe 00 18446744073709551104.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
elmt 136:
src = ff ff ff ff ff ff fc 00 18446744073709550592
dst = 00 00 00 00 00 00 40 3e ff ff ff ff ff ff fc 00 18446744073709550592.000000
ans = 00 00 00 00 00 00 40 3f 80 00 00 00 00 00 00 00 18446744073709551616.000000
***** 11 FAILURES! *****
Command exited with non-zero status 1

That dt_arith failure appears to be a known, benign issue, according to this thread.

But it also does not pass t_posix_compliant:

Purpose:
This tests if the file system is posix compliant when POSIX and MPI IO APIs
are used. This is for information only and always exits with 0 even when
non-compliance errors are encounter. This is to prevent this test from
aborting the remaining parallel HDF5 tests unnecessarily.
Process 0: testfile=posix_test

Testing size 1024
Testing allwrite_allread_blocks with MPI IO Process 1: testfile=posix_test
Process 2: testfile=posix_test
PASSED
Testing allwrite_allread_interlaced with MPI IO
Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 1: [2, -1]
Arrays do not match! Prcoess 0, element 1: [3, -1]
Arrays do not match! Prcoess 0, element 1: [4, -1]
Arrays do not match! Prcoess 0, element 1: [5, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 1: [2, -1]
Arrays do not match! Prcoess 1, element 1: [3, -1]
Arrays do not match! Prcoess 1, element 1: [4, -1]
Arrays do not match! Prcoess 1, element 1: [5, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 1: [2, -1]
Arrays do not match! Prcoess 2, element 1: [3, -1]
Arrays do not match! Prcoess 2, element 1: [4, -1]
Arrays do not match! Prcoess 2, element 1: [5, -1]
Printed 5 errors. Omitting the rest
Testing allwrite_allread_overlap with MPI IO PASSED
Testing onewrite_allread_blocks with MPI IO PASSED
Testing onewrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_blocks with POSIX IO Arrays do not match! Prcoess 2, element 1: [1, 0]
Arrays do not match! Prcoess 2, element 2: [2, 0]
Arrays do not match! Prcoess 2, element 3: [3, 0]
Arrays do not match! Prcoess 2, element 4: [4, 0]
Arrays do not match! Prcoess 2, element 5: [5, 0]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_interlaced with POSIX IO PASSED

Testing size 4096
Testing allwrite_allread_blocks with MPI IO PASSED
Testing allwrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_overlap with MPI IO
Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 3: [3, -1]
Arrays do not match! Prcoess 0, element 5: [5, -1]
Arrays do not match! Prcoess 0, element 7: [7, -1]
Arrays do not match! Prcoess 0, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 3: [3, -1]
Arrays do not match! Prcoess 1, element 5: [5, -1]
Arrays do not match! Prcoess 1, element 7: [7, -1]
Arrays do not match! Prcoess 1, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 3: [3, -1]
Arrays do not match! Prcoess 2, element 5: [5, -1]
Arrays do not match! Prcoess 2, element 7: [7, -1]
Arrays do not match! Prcoess 2, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with MPI IO PASSED
Testing onewrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_interlaced with POSIX IO PASSED

Testing size 16384
Testing allwrite_allread_blocks with MPI IO PASSED
Testing allwrite_allread_interlaced with MPI IO
Arrays do not match! Prcoess 1, element 2: [0, -1]
Arrays do not match! Prcoess 1, element 0: [1, -1]
Arrays do not match! Prcoess 1, element 2: [1, -1]
Arrays do not match! Prcoess 1, element 0: [2, -1]
Arrays do not match! Prcoess 1, element 2: [2, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 2, element 2: [0, -1]
Arrays do not match! Prcoess 2, element 0: [1, -1]
Arrays do not match! Prcoess 2, element 2: [1, -1]
Arrays do not match! Prcoess 2, element 0: [2, -1]
Arrays do not match! Prcoess 2, element 2: [2, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 0, element 2: [0, -1]
Arrays do not match! Prcoess 0, element 0: [1, -1]
Arrays do not match! Prcoess 0, element 2: [1, -1]
Arrays do not match! Prcoess 0, element 0: [2, -1]
Arrays do not match! Prcoess 0, element 2: [2, -1]
Printed 5 errors. Omitting the rest
Testing allwrite_allread_overlap with MPI IO Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 3: [3, -1]
Arrays do not match! Prcoess 1, element 5: [5, -1]
Arrays do not match! Prcoess 1, element 7: [7, -1]
Arrays do not match! Prcoess 1, element 9: [9, -1]
Printed 5 errors. Omitting the rest

Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 3: [3, -1]
Arrays do not match! Prcoess 0, element 5: [5, -1]
Arrays do not match! Prcoess 0, element 7: [7, -1]
Arrays do not match! Prcoess 0, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 3: [3, -1]
Arrays do not match! Prcoess 2, element 5: [5, -1]
Arrays do not match! Prcoess 2, element 7: [7, -1]
Arrays do not match! Prcoess 2, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with MPI IO PASSED
Testing onewrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_interlaced with POSIX IO PASSED

Testing size 65536
Testing allwrite_allread_blocks with MPI IO PASSED
Testing allwrite_allread_interlaced with MPI IO Arrays do not match! Prcoess 1, element 2: [0, -1]
Arrays do not match! Prcoess 1, element 0: [1, -1]
Arrays do not match! Prcoess 1, element 2: [1, -1]
Arrays do not match! Prcoess 1, element 0: [2, -1]
Arrays do not match! Prcoess 1, element 2: [2, -1]
Arrays do not match! Prcoess 2, element 2: [0, -1]
Arrays do not match! Prcoess 2, element 0: [1, -1]
Arrays do not match! Prcoess 2, element 2: [1, -1]
Arrays do not match! Prcoess 2, element 0: [2, -1]
Arrays do not match! Prcoess 2, element 2: [2, -1]
Printed 5 errors. Omitting the rest

Arrays do not match! Prcoess 0, element 2: [0, -1]
Arrays do not match! Prcoess 0, element 0: [1, -1]
Arrays do not match! Prcoess 0, element 2: [1, -1]
Arrays do not match! Prcoess 0, element 0: [2, -1]
Arrays do not match! Prcoess 0, element 2: [2, -1]
Printed 5 errors. Omitting the rest
Printed 5 errors. Omitting the rest
Testing allwrite_allread_overlap with MPI IO
Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 3: [3, -1]
Arrays do not match! Prcoess 0, element 5: [5, -1]
Arrays do not match! Prcoess 0, element 7: [7, -1]
Arrays do not match! Prcoess 0, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 3: [3, -1]
Arrays do not match! Prcoess 1, element 5: [5, -1]
Arrays do not match! Prcoess 1, element 7: [7, -1]
Arrays do not match! Prcoess 1, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 3: [3, -1]
Arrays do not match! Prcoess 2, element 5: [5, -1]
Arrays do not match! Prcoess 2, element 7: [7, -1]
Arrays do not match! Prcoess 2, element 9: [9, -1]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with MPI IO PASSED
Testing onewrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_blocks with POSIX IO Arrays do not match! Prcoess 2, element 5376: [5376, 0]
Arrays do not match! Prcoess 2, element 5377: [5377, 0]
Arrays do not match! Prcoess 2, element 5378: [5378, 0]
Arrays do not match! Prcoess 2, element 5379: [5379, 0]
Arrays do not match! Prcoess 2, element 5380: [5380, 0]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_interlaced with POSIX IO PASSED

Testing size 262144
Testing allwrite_allread_blocks with MPI IO PASSED
Testing allwrite_allread_interlaced with MPI IO Arrays do not match! Prcoess 2, element 1: [0, -1]
Arrays do not match! Prcoess 2, element 2: [0, -1]
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 2: [1, -1]
Arrays do not match! Prcoess 2, element 1: [2, -1]
Printed 5 errors. Omitting the rest

Arrays do not match! Prcoess 0, element 1: [0, -1]
Arrays do not match! Prcoess 0, element 2: [0, -1]
Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 2: [1, -1]
Arrays do not match! Prcoess 0, element 1: [2, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 1: [0, -1]
Arrays do not match! Prcoess 1, element 2: [0, -1]
Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 2: [1, -1]
Arrays do not match! Prcoess 1, element 1: [2, -1]
Printed 5 errors. Omitting the rest
Testing allwrite_allread_overlap with MPI IO
Arrays do not match! Prcoess 0, element 1: [1, -1]
Arrays do not match! Prcoess 0, element 5: [5, -1]
Arrays do not match! Prcoess 0, element 7: [7, -1]
Arrays do not match! Prcoess 0, element 11: [11, -1]
Arrays do not match! Prcoess 0, element 13: [13, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 1: [1, -1]
Arrays do not match! Prcoess 1, element 5: [5, -1]
Arrays do not match! Prcoess 1, element 7: [7, -1]
Arrays do not match! Prcoess 1, element 11: [11, -1]
Arrays do not match! Prcoess 2, element 1: [1, -1]
Arrays do not match! Prcoess 2, element 5: [5, -1]
Arrays do not match! Prcoess 2, element 7: [7, -1]
Arrays do not match! Prcoess 2, element 11: [11, -1]
Arrays do not match! Prcoess 2, element 13: [13, -1]
Printed 5 errors. Omitting the rest
Arrays do not match! Prcoess 1, element 13: [13, -1]
Printed 5 errors. Omitting the rest
Testing onewrite_allread_blocks with MPI IO PASSED
Testing onewrite_allread_interlaced with MPI IO PASSED
Testing allwrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_blocks with POSIX IO PASSED
Testing onewrite_allread_interlaced with POSIX IO PASSED

Summary:
Process 0: encountered 374218 mismatches.
Process 1: encountered 374218 mismatches.
Process 2: encountered 374546 mismatches.
2.64user 4.76system 0:03.15elapsed 234%CPU (0avgtext+0avgdata 22616maxresident)k
0inputs+0outputs (0major+74380minor)pagefaults 0swaps

Finished testing t_posix_compliant
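For what it's worth, I believe this test can also be re-run on its own with a few MPI ranks; the exact invocation below is my guess:

cd testpar
mpiexec -n 3 ./t_posix_compliant    # 3 ranks, to match the three processes in the log above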

Nor does it pass the phdf5 tests:

MPI-process 2. hostname=LAPTOP-KJ7BN7BP
For help use: /home/sdtran/hdf5/hdf5-1.8.4-patch1/testpar/.libs/testphdf5 -help
Linked with hdf5 version 1.8 release 4
MPI-process 0. hostname=LAPTOP-KJ7BN7BP

For help use: /home/sdtran/hdf5/hdf5-1.8.4-patch1/testpar/.libs/testphdf5 -help
Linked with hdf5 version 1.8 release 4
MPI-process 1. hostname=LAPTOP-KJ7BN7BP

For help use: /home/sdtran/hdf5/hdf5-1.8.4-patch1/testpar/.libs/testphdf5 -help
Linked with hdf5 version 1.8 release 4
*** Hint ***
You can use environment variable HDF5_PARAPREFIX to run parallel test files in a
different directory or to add file type prefix. E.g.,
HDF5_PARAPREFIX=pfs:/PFS/user/me
export HDF5_PARAPREFIX
*** End of Hint ***
Test filenames are:
ParaTest.h5
Testing -- fapl_mpio duplicate (mpiodup)
Test filenames are:
ParaTest.h5
Testing -- fapl_mpio duplicate (mpiodup)
Test filenames are:
ParaTest.h5
Testing -- fapl_mpio duplicate (mpiodup)
Testing -- fapl_mpiposix duplicate (posixdup)
Testing -- fapl_mpiposix duplicate (posixdup)
Testing -- fapl_mpiposix duplicate (posixdup)
Testing -- dataset using split communicators (split)
Testing -- dataset using split communicators (split)
Testing -- dataset using split communicators (split)
Testing -- dataset independent write (idsetw)
Testing -- dataset independent write (idsetw)
Testing -- dataset independent write (idsetw)
Testing -- dataset independent read (idsetr)
Testing -- dataset independent read (idsetr)
Testing -- dataset independent read (idsetr)
Testing -- dataset collective write (cdsetw)
Testing -- dataset collective write (cdsetw)
Testing -- dataset collective write (cdsetw)
Testing -- dataset collective read (cdsetr)
Testing -- dataset collective read (cdsetr)
Testing -- dataset collective read (cdsetr)
Testing -- extendible dataset independent write (eidsetw)
Testing -- extendible dataset independent write (eidsetw)
Testing -- extendible dataset independent write (eidsetw)
Testing -- extendible dataset independent read (eidsetr)
Testing -- extendible dataset independent read (eidsetr)
Testing -- extendible dataset independent read (eidsetr)
Testing -- extendible dataset collective write (ecdsetw)
Testing -- extendible dataset collective write (ecdsetw)
Testing -- extendible dataset collective write (ecdsetw)
Testing -- extendible dataset collective read (ecdsetr)
Testing -- extendible dataset collective read (ecdsetr)
Testing -- extendible dataset collective read (ecdsetr)
Testing -- extendible dataset independent write #2 (eidsetw2)
Testing -- extendible dataset independent write #2 (eidsetw2)
Testing -- extendible dataset independent write #2 (eidsetw2)
Testing -- chunked dataset with none-selection (selnone)
Testing -- chunked dataset with none-selection (selnone)
Testing -- chunked dataset with none-selection (selnone)
Testing -- parallel extend Chunked allocation on serial file (calloc)
Testing -- parallel extend Chunked allocation on serial file (calloc)
Testing -- parallel extend Chunked allocation on serial file (calloc)
Testing -- parallel read of dataset written serially with filters (fltread)
Testing -- parallel read of dataset written serially with filters (fltread)
Testing -- parallel read of dataset written serially with filters (fltread)
Testing -- compressed dataset collective read (cmpdsetr)
Testing -- compressed dataset collective read (cmpdsetr)
Testing -- compressed dataset collective read (cmpdsetr)
Testing -- multiple datasets write (ndsetw)
Testing -- multiple datasets write (ndsetw)
Testing -- multiple datasets write (ndsetw)
Testing -- multiple groups write (ngrpw)
Testing -- multiple groups write (ngrpw)
Testing -- multiple groups write (ngrpw)
Testing -- multiple groups read (ngrpr)
Testing -- multiple groups read (ngrpr)
Testing -- multiple groups read (ngrpr)
Testing -- compact dataset test (compact)
Testing -- compact dataset test (compact)
Testing -- compact dataset test (compact)
Testing -- collective group and dataset write (cngrpw)
Testing -- collective group and dataset write (cngrpw)
Testing -- collective group and dataset write (cngrpw)
Testing -- independent group and dataset read (ingrpr)
Testing -- independent group and dataset read (ingrpr)
Testing -- independent group and dataset read (ingrpr)
Testing -- big dataset test (bigdset)
Testing -- big dataset test (bigdset)
Testing -- big dataset test (bigdset)
Proc 2: *** PHDF5 ERROR ***
Assertion (H5Fclose succeeded) failed at line 457 in t_mdset.c
aborting MPI process
Proc 0: *** PHDF5 ERROR ***
Assertion (H5Fclose succeeded) failed at line 457 in t_mdset.c
aborting MPI process
Proc 1: *** PHDF5 ERROR ***
Assertion (H5Fclose succeeded) failed at line 457 in t_mdset.c
aborting MPI process

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 19104 RUNNING AT LAPTOP-KJ7BN7BP
= EXIT CODE: 139
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
Command exited with non-zero status 139
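Per the hint in the testphdf5 output above, the parallel test files can apparently be redirected to a different directory with HDF5_PARAPREFIX before re-running the checks; something like this (the directory itself is just an example):

mkdir -p /tmp/hdf5-partest
export HDF5_PARAPREFIX=/tmp/hdf5-partest
make check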