External datastore accessed relative to cwd

Hello,

When accessing a dataset stored in an external file (referenced by a
relative path), whether the dataset can be accessed depends on the
current working directory (cwd). This is a problem for us, and HDF5
tools like h5dump suffer from it as well:
$ h5dump data1.h5
works fine, but
$ h5dump data/data1.h5
fails if the external files are also in data/. All groups and attributes
are printed correctly, but the external dataset(s) cannot be accessed:
h5dump error: unable to print data
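
For reference, here is a minimal sketch of how such a file could be
created; the dataset name, the external file name data1.raw and the
sizes are purely illustrative, not what our data products use:

#include <hdf5.h>

int main(void)
{
    int     buf[10] = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };
    hsize_t dims[1] = { 10 };

    /* Store the dataset's raw data in an external file, referenced by a
     * relative path. */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_external(dcpl, "data1.raw", 0, sizeof buf);

    hid_t space = H5Screate_simple(1, dims, NULL);
    hid_t file  = H5Fcreate("data1.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t dset  = H5Dcreate2(file, "dset", H5T_NATIVE_INT, space,
                             H5P_DEFAULT, dcpl, H5P_DEFAULT);
    H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    H5Dclose(dset);
    H5Fclose(file);
    H5Sclose(space);
    H5Pclose(dcpl);
    return 0;
}

Moving both data1.h5 and data1.raw into data/ and then running
$ h5dump data/data1.h5
from the parent directory reproduces the error above.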

We are developing a thin layer on top of libhdf5
(https://github.com/nextgen-astrodata/DAL) that implements data product
specifications for the LOFAR radio astronomy telescope.
To avoid the external dataset access problem, we could change the cwd
just before libhdf5 opens an external file (and restore it when the
function returns).
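
Concretely, the workaround we have in mind looks roughly like the
sketch below (the helper name and the assumption that H5Dread() is the
call that triggers the open are ours, not libhdf5's):

#include <hdf5.h>
#include <libgen.h>     /* dirname() */
#include <limits.h>     /* PATH_MAX */
#include <string.h>
#include <unistd.h>     /* getcwd(), chdir() */

/* Temporarily chdir() to the directory containing the .h5 file, so that a
 * relative external file path resolves, then restore the original cwd. */
static herr_t read_in_file_dir(const char *h5path, hid_t dset, hid_t memtype,
                               hid_t memspace, hid_t filespace, void *buf)
{
    char saved[PATH_MAX];
    char copy[PATH_MAX];
    herr_t err;

    if (getcwd(saved, sizeof saved) == NULL)
        return -1;

    strncpy(copy, h5path, sizeof copy - 1);
    copy[sizeof copy - 1] = '\0';
    if (chdir(dirname(copy)) != 0)      /* dirname() may modify its argument */
        return -1;

    err = H5Dread(dset, memtype, memspace, filespace, H5P_DEFAULT, buf);

    if (chdir(saved) != 0)              /* restore the original cwd */
        return -1;

    return err;
}

Doing this around every access is of course ugly, which leads to the
questions below.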

To perform 'matrix I/O' on a dataset, we make a series of HDF5 calls,
roughly (sketched below):
H5Dget_space() to retrieve the dataset's dataspace
H5Sselect_hyperslab() on that dataspace
H5Screate_simple() to create the memspace
H5Sselect_hyperslab() on the memspace
H5Dread() or H5Dwrite()
The last call appears to fail for external datasets if the external
file(s) cannot be found relative to the cwd.
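
For example (the names and extents are made up for illustration):

#include <hdf5.h>

/* Read an nrows x ncols block starting at (row0, col0) from an open
 * 2-D float dataset 'dset'. */
static herr_t read_block(hid_t dset, hsize_t row0, hsize_t col0,
                         hsize_t nrows, hsize_t ncols, float *buf)
{
    hsize_t start[2]  = { row0, col0 };
    hsize_t count[2]  = { nrows, ncols };
    hsize_t mstart[2] = { 0, 0 };
    herr_t  err;

    hid_t filespace = H5Dget_space(dset);    /* dataspace of the dataset */
    H5Sselect_hyperslab(filespace, H5S_SELECT_SET, start, NULL, count, NULL);

    hid_t memspace = H5Screate_simple(2, count, NULL);   /* memspace */
    H5Sselect_hyperslab(memspace, H5S_SELECT_SET, mstart, NULL, count, NULL);

    /* This is the call that fails for external datasets when the external
     * file(s) cannot be found relative to the cwd. */
    err = H5Dread(dset, H5T_NATIVE_FLOAT, memspace, filespace, H5P_DEFAULT, buf);

    H5Sclose(memspace);
    H5Sclose(filespace);
    return err;
}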

Does this mean that for every H5Dread()/H5Dwrite() access, libhdf5
opens and closes the external file, and thus that we have to chdir()
around every such call? Or does it keep external files open, perhaps
with a timeout?

(Why doesn't libhdf5 interpret a relative path to an external file as
relative to the location of the .h5 file?)

Regards,
Alexander van Amesfoort