Flushing files sometimes corrupts them

The HDF5 version is 1.10.2. We build with GCC 4.8.4 with C++11 enabled, on RHEL 6.9. Access is serial only; no MPI.

To reduce the risk of file corruption due to crashes, I wanted to experiment with flushing the cache immediately after writing datasets. When I add a call to DataSet::flush() (we’re using the C++ API where possible) immediately after the call to DataSet::write(), the resulting HDF5 files appear to be corrupted. Without explicit flushes, the files are fine. Also, if instead of flushing immediately after writing a dataset, I flush just before the file object goes out of scope, there’s no problem.

The only non-default property we set on the datasets is for link creation: we set the link-name character encoding to H5T_CSET_UTF8. The file is opened in H5F_ACC_RDWR mode with no non-default properties set. No errors appear to occur while the HDF5 file is being written to or flushed.
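For concreteness, here is a minimal sketch of the pattern described above, written against the underlying C API (our application uses the C++ wrappers where possible, but they sit on these calls). The file name, dataset name, and data are illustrative, not taken from our application:

```c
/* Minimal sketch of the write-then-flush pattern that seems to trigger
 * the problem. Requires HDF5 >= 1.10 for H5Dflush(). */
#include "hdf5.h"

int main(void)
{
    /* Open an existing file read-write (H5F_ACC_RDWR). */
    hid_t file = H5Fopen("example.h5", H5F_ACC_RDWR, H5P_DEFAULT);

    /* The one non-default property: a link creation property list
     * requesting UTF-8 encoding for the link name. */
    hid_t lcpl = H5Pcreate(H5P_LINK_CREATE);
    H5Pset_char_encoding(lcpl, H5T_CSET_UTF8);

    hsize_t dims[1] = {4};
    hid_t space = H5Screate_simple(1, dims, NULL);
    hid_t dset  = H5Dcreate2(file, "data", H5T_NATIVE_INT, space,
                             lcpl, H5P_DEFAULT, H5P_DEFAULT);

    int buf[4] = {1, 2, 3, 4};
    H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    /* Flushing here, immediately after the write, is what appears to
     * corrupt the file; flushing only just before close does not.
     * (DataSet::flush() in the C++ API maps onto this.) */
    H5Dflush(dset);

    H5Dclose(dset);
    H5Sclose(space);
    H5Pclose(lcpl);
    H5Fclose(file);
    return 0;
}
```

This sketch has not reproduced the corruption on its own; so far only our full application shows it.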

Am I doing something wrong, or have I perhaps found a bug?


It sounds like an issue in HDF5. Would it be possible for you to post an example that reproduces the problem or send it to help@hdfgroup.org?

Thank you!


Hi Elena,

Thanks for your reply. I’ve been trying without success to create a small, self-contained example that produces the error. So far it’s only our main application, which is unwieldy to build, that is having a problem. I’ll reply again when I can provide an example. :frowning:

Thank you!

I am wondering… Are you running a multi-threaded C++ application?


It is not multi-threaded (no POSIX threads, no OpenMP, nothing like that).

Without my making any changes that seem relevant, the problem mysteriously went away this morning. As I said, if I manage to reproduce this in a manageable example, I'll let you know. For now it seems to be something of a heisenbug.