HDF5 file corruption (no concurrent access)

Hi there,

I have the following scenario:

HDF5 1.8.8, compiled from the source tarball.
hdf-java 2.8, compiled from source and linked against the above.

A Java process opens an HDF5 file (creating one if necessary) and writes data into compound-type datasets that have one dimension and are extensible (H5S_UNLIMITED) in that dimension.
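
The creation step looks roughly like this (a simplified sketch, not the actual code: the dataset name, chunk size and type id are placeholders, and the imports are ncsa.hdf.hdf5lib.H5 and ncsa.hdf.hdf5lib.HDF5Constants):

   // Sketch: create a 1-D, extensible, chunked dataset of a compound type.
   // "fileTypeId" is a previously built H5T_COMPOUND type id.
   static int createSamplesDataset(int fileId, int fileTypeId) throws Exception {
     long[] dims    = { 0 };
     long[] maxdims = { HDF5Constants.H5S_UNLIMITED };
     long[] chunk   = { 64 };  // chunking is mandatory for UNLIMITED dims

     int spaceId = H5.H5Screate_simple(1, dims, maxdims);
     int dcplId  = H5.H5Pcreate(HDF5Constants.H5P_DATASET_CREATE);
     H5.H5Pset_chunk(dcplId, 1, chunk);

     int dsetId = H5.H5Dcreate(fileId, "/samples", fileTypeId, spaceId,
                               HDF5Constants.H5P_DEFAULT, dcplId,
                               HDF5Constants.H5P_DEFAULT);
     H5.H5Pclose(dcplId);
     H5.H5Sclose(spaceId);
     return dsetId;
   }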

Initial runs of the Java process produce the expected results: typically a dataset ends up with 300-750 elements of the compound type written to it. Once the process has terminated, the data can be opened in HDFView and looks as expected.

The problem occurs on the next run that has data available to write, usually only a single element of the compound type. The write apparently succeeds, as no error messages are produced. The handles and the file are closed, and from that point on the entire dataset appears to be corrupted.

I still need to repeat the experiment to confirm that the corruption occurs only after a single-element write to the dataset.
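
The append step that fails is essentially an extend-then-hyperslab write. Simplified (names are placeholders; the data is packed into a byte[] that matches the file layout):

   // Sketch: grow the dataset, select the new tail region, and write
   // "count" packed elements into it.
   static void appendElements(int fileId, int memTypeId,
                              byte[] packedBytes, long count) throws Exception {
     int dsetId = H5.H5Dopen(fileId, "/samples", HDF5Constants.H5P_DEFAULT);

     long[] dims = new long[1];
     int fspace = H5.H5Dget_space(dsetId);
     H5.H5Sget_simple_extent_dims(fspace, dims, null);
     H5.H5Sclose(fspace);

     H5.H5Dset_extent(dsetId, new long[] { dims[0] + count });

     fspace = H5.H5Dget_space(dsetId);  // re-read the space after the extend
     long[] start = { dims[0] };
     long[] cnt   = { count };
     H5.H5Sselect_hyperslab(fspace, HDF5Constants.H5S_SELECT_SET,
                            start, null, cnt, null);
     int mspace = H5.H5Screate_simple(1, cnt, null);

     H5.H5Dwrite(dsetId, memTypeId, mspace, fspace,
                 HDF5Constants.H5P_DEFAULT, packedBytes);

     H5.H5Sclose(mspace);
     H5.H5Sclose(fspace);
     H5.H5Dclose(dsetId);
   }

Note the file dataspace is re-acquired after H5Dset_extent; a dataspace handle obtained before the extend still carries the old extent.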

The member types of the compound datatype look like this:

   // Per-member types as stored in the file.
   private static final int[] fileDatatypes = new int[] {
     HDF5Constants.H5T_UNIX_D64LE,
     HDF5Constants.H5T_IEEE_F64LE,
     HDF5Constants.H5T_IEEE_F64LE
   };

   // Per-member types used in memory when writing.
   private static final int[] memoryDatatypes = new int[] {
     HDF5Constants.H5T_UNIX_D64LE,
     HDF5Constants.H5T_NATIVE_DOUBLE,
     HDF5Constants.H5T_NATIVE_DOUBLE
   };
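
The file type is assembled from those members along these lines (simplified; the real field names differ, and the fields are assumed to be packed back to back):

   // Sketch: pack the members above into an H5T_COMPOUND type.
   // Field names here are illustrative only.
   static int buildFileType() throws Exception {
     long size = 0;
     for (int t : fileDatatypes)
       size += H5.H5Tget_size(t);

     int typeId = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, (int) size);
     String[] names = { "timestamp", "value1", "value2" };
     long offset = 0;
     for (int i = 0; i < fileDatatypes.length; i++) {
       H5.H5Tinsert(typeId, names[i], offset, fileDatatypes[i]);
       offset += H5.H5Tget_size(fileDatatypes[i]);
     }
     return typeId;
   }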

I would appreciate any pointers on where to start debugging this kind of problem; without posting the entire code, which is quite large, I expect it will be difficult to assist much further.

I am suspicious of the H5T_NATIVE_DOUBLE usage, as I have seen at least one post about corruption related to it. The process terminates cleanly in each case, with the file handle and the handles for every object in flight being closed (to my knowledge).
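
As a first check on that assumption, I plan to ask the library for the open-handle count just before closing the file, and to verify the superblock signature afterwards; roughly:

   // Sketch: count identifiers still open against the file before closing.
   // The file id itself accounts for one of them.
   static void closeFileChecked(int fileId, String path) throws Exception {
     long open = H5.H5Fget_obj_count(fileId, HDF5Constants.H5F_OBJ_ALL);
     if (open > 1)
       System.err.println("Leaked HDF5 handles: " + (open - 1));

     H5.H5Fflush(fileId, HDF5Constants.H5F_SCOPE_GLOBAL);
     H5.H5Fclose(fileId);

     // After closing, the file signature can be checked from Java as well:
     System.err.println("is HDF5: " + H5.H5Fis_hdf5(path));
   }

Beyond that, h5dump and h5debug from the 1.8.8 build look like the obvious tools to inspect the damaged file with, but I would welcome other suggestions.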

Regards...Jeremy