I am working with HDF5, writing data to an HDF5 file. I extend the dataset and write through a hyperslab selection on every write operation. Everything works fine while debugging the application, but when I close the dataset and all the other objects associated with it I get errors such as:
1. H5Dclose() can't free
2. Unable to flush cached dataset
As a result, all the lower-level API calls fail as well.
I am attaching a screenshot of the error logs.
I suspect that some of the objects associated with the dataset are not being closed properly, and consequently neither is the dataset itself.
Can you please help me solve this issue?
Thanks in advance.