HDF5: infinite loop closing library (v1.10.6)

Recently I started getting this error with HDF5 v1.10.6.

HDF5: infinite loop closing library
      L,T_top,P,P,Z,FD,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E,E

Similar errors have been reported before with much older versions of the library, but no answer gives a generic strategy for investigating this issue. What would you recommend? Is this supposed to happen in the first place, or should it be considered an HDF5 bug?

We get this error when a C++ exception is thrown, so it may be related to the order in which the handles are closed. The serial (not MPI-IO) version of the library is used. Let me know if you need more context…
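
In case it helps, this is the kind of check I am planning to sprinkle around the failure point, to see which handles are still open when the exception unwinds (a rough sketch, not my actual code; dump_open_handles is just a placeholder name):

#include "hdf5.h"
#include <cstdio>

/* Print how many HDF5 identifiers are still open, per object type,
   across all open files. */
static void dump_open_handles()
{
    ssize_t files     = H5Fget_obj_count((hid_t)H5F_OBJ_ALL, H5F_OBJ_FILE);
    ssize_t datasets  = H5Fget_obj_count((hid_t)H5F_OBJ_ALL, H5F_OBJ_DATASET);
    ssize_t groups    = H5Fget_obj_count((hid_t)H5F_OBJ_ALL, H5F_OBJ_GROUP);
    ssize_t datatypes = H5Fget_obj_count((hid_t)H5F_OBJ_ALL, H5F_OBJ_DATATYPE);
    ssize_t attrs     = H5Fget_obj_count((hid_t)H5F_OBJ_ALL, H5F_OBJ_ATTR);

    std::fprintf(stderr,
                 "open ids: files=%zd datasets=%zd groups=%zd datatypes=%zd attributes=%zd\n",
                 files, datasets, groups, datatypes, attrs);
}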

The context is a minimum working example; an MWE is required to confirm/reject your finding. I have encountered exactly this printout while developing H5CPP; I just don't quite recall what resource was left open.

For me it was never in the HDF5 C library, but in my own work (I made sure it never made it to the public code base). When using/developing C++ there are restrictions on error handling callbacks.
steve

Hi Steve, thanks for looking into this,

The context is a minimum working example;

Obviously :slight_smile: but that is a lot of work. So before diving into that, if someone could give me a clue where to look, I would appreciate it!

just don’t quite recall what resource was left open.

So leaving a resource open could lead to "infinite loop closing library"? All my resources are RAII-managed, so even in the presence of C++ exceptions I am pretty confident that I am not leaking any.
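
Something along these lines, just to illustrate what I mean by RAII here (a minimal, hypothetical sketch of the pattern, not the actual classes):

#include "hdf5.h"

/* The handle is released in the destructor, so stack unwinding during an
   exception closes it automatically. */
class dataset_handle
{
public:
    explicit dataset_handle(hid_t id) : id_(id) {}
    ~dataset_handle() { if (id_ >= 0) H5Dclose(id_); }

    dataset_handle(const dataset_handle&) = delete;
    dataset_handle& operator=(const dataset_handle&) = delete;

    hid_t get() const { return id_; }

private:
    hid_t id_;
};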

When using/developing C++ there are restrictions on error handling callbacks.

I am using the default error reporting mechanism, but I would be interested to know what these restrictions are.
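
My guess so far is that the callback is invoked from inside the C library, so no C++ exception may be allowed to escape from it. A sketch of how a custom handler would be installed with H5Eset_auto2 (the handler itself is hypothetical):

#include "hdf5.h"
#include <cstdio>

/* Print the error stack, but never throw from here: the call comes from C code. */
static herr_t my_error_handler(hid_t estack, void* /*client_data*/)
{
    return H5Eprint2(estack, stderr);
}

static void install_error_handler()
{
    /* H5E_DEFAULT is the default error stack. */
    H5Eset_auto2(H5E_DEFAULT, my_error_handler, nullptr);
}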

Do you know how the lifetime of a dataset relates to its parent (file or group)? Can I close a file while keeping a handle on a dataset and continue to write to it? It works, but I wonder if it’s allowed…

#include "hdf5.h"

int main(void)
{
    herr_t status;
    int i, j, dset_data[4][6];

    /* Initialize the dataset. */
    for (i = 0; i < 4; i++)
        for (j = 0; j < 6; j++)
            dset_data[i][j] = i * 6 + j + 1;

    /* Create a new file. */
    hid_t file_id = H5Fcreate("foo.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    /* Create the data space for the dataset. */
    hsize_t dims[] = {4, 6};
    hid_t dataspace_id = H5Screate_simple(2, dims, NULL);

    /* Create the dataset. */
    hid_t dataset_id = H5Dcreate2(file_id, "/bar", H5T_STD_I32BE, dataspace_id, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* The dataspace is no longer needed once the dataset exists. */
    status = H5Sclose(dataspace_id);

    /* Close the file. */
    status = H5Fclose(file_id);

    /* Write the dataset. */
    status = H5Dwrite(dataset_id, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, dset_data);

    /* Close the dataset. */
    status = H5Dclose(dataset_id);

    return 0;
}

I guess the same question could be asked for attributes?
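
The analogous pattern I have in mind would be something like this (untested sketch; the attribute name and value are made up):

#include "hdf5.h"

/* Attach an attribute to the root group, close the file, then keep writing
   through the surviving attribute handle. */
static void attribute_outlives_file()
{
    int value = 42;

    hid_t file_id  = H5Fcreate("foo.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space_id = H5Screate(H5S_SCALAR);
    hid_t attr_id  = H5Acreate2(file_id, "answer", H5T_STD_I32LE, space_id,
                                H5P_DEFAULT, H5P_DEFAULT);

    H5Sclose(space_id);
    H5Fclose(file_id);                          /* file handle gone...       */

    H5Awrite(attr_id, H5T_NATIVE_INT, &value);  /* ...is this still allowed? */
    H5Aclose(attr_id);
}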

See the table in the documentation of H5Pset_fclose_degree. Does that answer your question?
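
For illustration, the degree is set on the file access property list before the file is created or opened; a minimal sketch:

#include "hdf5.h"

static void create_with_strong_close()
{
    /* With H5F_CLOSE_STRONG, H5Fclose also closes every object still open in
       the file. The default, H5F_CLOSE_DEFAULT, resolves to H5F_CLOSE_WEAK
       for the serial drivers. */
    hid_t fapl_id = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fclose_degree(fapl_id, H5F_CLOSE_STRONG);

    hid_t file_id = H5Fcreate("foo.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id);
    H5Pclose(fapl_id);

    /* ... create groups, datasets, attributes ... */

    H5Fclose(file_id);   /* sweeps up any handles left open in this file */
}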

Let’s be careful with terms such as parent, etc. Other than in a metaphorical sense, there is no such thing in HDF5.

G.

Thank you @gheber, and sorry if I used the wrong terminology. I was not aware of this file close degree policy; with H5F_CLOSE_WEAK being the default for my use case, that explains why it works and confirms that this usage is indeed supported.
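
In case anyone else wants to double-check, the effective degree can be queried back from an open file's access property list (sketch; file_id is assumed to be a handle to an already open file):

#include "hdf5.h"

static H5F_close_degree_t query_close_degree(hid_t file_id)
{
    hid_t fapl_id = H5Fget_access_plist(file_id);

    H5F_close_degree_t degree = H5F_CLOSE_DEFAULT;
    H5Pget_fclose_degree(fapl_id, &degree);
    H5Pclose(fapl_id);

    return degree;   /* H5F_CLOSE_WEAK in my (serial, default) case */
}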

I am having a hard time extracting a repro for my original issue.