Segfault from h5py in create_dataset with >4 MPI processes

Hello,

I’ve been trying to work with HDF5 via h5py and seem to have run into what is probably an HDF5 bug.

My minimal failing example is shown below - the segfault boolean can be toggled to cause a segfault (or not), which shows that the crash depends on whether maxshape is set for the new dataset.

The script is MPI-enabled and is run with, e.g., mpirun -np 5 python test.py

It works fine with 1, 2, 3, or 4 MPI processes, but segfaults with 5 or more.

I’ve tried this with HDF5 1.10.1 (from the official openSUSE repos) and 1.10.4 (compiled myself from the source on the HDF5 website), and with h5py 2.8.0 and 2.9.0 on Python 2.7 (Anaconda). I’m using Open MPI 1.10.7.
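For reference, the versions above can be confirmed from inside Python; here is a quick check using h5py’s and mpi4py’s built-in version reporting (nothing in it is specific to my setup):

from mpi4py import MPI
import h5py

# Summary of the h5py, HDF5, Python and numpy versions as h5py sees them.
print(h5py.version.info)
# MPI standard version supported by the library, e.g. (3, 0).
print(MPI.Get_version())
# Implementation banner (requires an MPI-3 library), e.g. "Open MPI v1.10.7 ...".
print(MPI.Get_library_version())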

Has anyone seen this before?

Thank you,
Chris

from mpi4py import MPI
import h5py
import numpy as np

# Open the file collectively with the MPI-IO driver.
with h5py.File('test.h5', 'w', driver='mpio', comm=MPI.COMM_WORLD) as myfile:
    shape_tuple = (10, 1)
    max_shape_tuple = (None, 1)  # unlimited along the first axis

    segfault = True

    if segfault:
        # Resizable dataset (maxshape set): segfaults with >4 processes.
        new_array_piece = myfile.create_dataset(
            "1", shape_tuple, maxshape=max_shape_tuple
        )
    else:
        # Fixed-shape dataset (no maxshape): runs cleanly.
        new_array_piece = myfile.create_dataset("1", shape_tuple)
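To reproduce: with segfault = True, mpirun -np 5 python test.py crashes for me inside the create_dataset call; with segfault = False, the identical run completes.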