Closing a dataset

Hi,

I am working with HDF5, writing data to an HDF5 file. I am using a hyperslab selection and extending the dataset on every write operation (roughly the pattern sketched below). Everything appears to work correctly (when I debug the application), but while closing the dataset and all the other objects associated with it I get errors such as:

1. H5Dclose() can't free

2. Unable to flush cached dataset

...

and as a result all the lower-layer API calls fail as well.
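Roughly, the pattern looks like the following sketch (the dataset name, sizes, and buffer contents are placeholders rather than my actual code, and error checking is omitted for brevity):

#include "hdf5.h"

int main(void)
{
    hsize_t dims[1]    = {0};              /* current extent: starts empty        */
    hsize_t maxdims[1] = {H5S_UNLIMITED};  /* unlimited, so the dataset can grow  */
    hsize_t chunk[1]   = {4};              /* chunking is required to extend      */
    hsize_t count[1]   = {4};              /* elements appended per write         */
    int     buf[4]     = {1, 2, 3, 4};     /* placeholder data                    */

    hid_t file  = H5Fcreate("append.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(1, dims, maxdims);
    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, chunk);
    hid_t dset  = H5Dcreate2(file, "data", H5T_NATIVE_INT, space,
                             H5P_DEFAULT, dcpl, H5P_DEFAULT);    /* 1.8 API */

    for (int i = 0; i < 3; i++) {
        hsize_t start[1] = {dims[0]};      /* append after the current end        */

        dims[0] += count[0];
        H5Dset_extent(dset, dims);         /* grow the dataset before the write   */

        hid_t fspace = H5Dget_space(dset); /* fresh file dataspace after extend   */
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
        hid_t mspace = H5Screate_simple(1, count, NULL);

        H5Dwrite(dset, H5T_NATIVE_INT, mspace, fspace, H5P_DEFAULT, buf);

        H5Sclose(mspace);                  /* close the per-write identifiers     */
        H5Sclose(fspace);                  /* before the next iteration           */
    }

    H5Sclose(space);
    H5Pclose(dcpl);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}

In this sketch the memory and file dataspace identifiers opened for each write are closed at the end of every iteration, before the dataset, property list, and file themselves are closed.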

I am attaching a screenshot of the error logs.

I suspect something is wrong: some of the objects associated with the dataset are not being closed, and so the dataset itself cannot be closed either.

Can you please help me solve this issue?

Thanks in advance.

Regards,
Santosh Darekar.

Hi Santosh,


On Sep 18, 2009, at 6:00 AM, Santosh Darekar wrote:


  Hmm, I don't see anything obvious from your screenshot, but upgrading to the 1.8.3 release may help.
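As a rough diagnostic for the "objects not getting closed" suspicion, a sketch along these lines reports what is still open on a file just before H5Fclose(); the `file` variable name and buffer size here are assumptions, not taken from the application in question:

#include <stdio.h>
#include <stdlib.h>
#include "hdf5.h"

/* Count and list the identifiers still open on `file`.  Note that dataspace
   and property-list identifiers are not tied to a file and do not show up
   here; files, groups, datasets, named datatypes, and attributes do. */
static void report_open_objects(hid_t file)
{
    ssize_t n = H5Fget_obj_count(file, H5F_OBJ_ALL);
    printf("identifiers still open on this file: %ld\n", (long)n);

    if (n > 0) {
        hid_t  *ids   = (hid_t *) malloc((size_t)n * sizeof(hid_t));
        ssize_t found = H5Fget_obj_ids(file, H5F_OBJ_ALL, (size_t)n, ids);

        for (ssize_t i = 0; i < found; i++) {
            char name[256] = "(no name)";
            if (H5Iget_type(ids[i]) == H5I_FILE)
                H5Fget_name(ids[i], name, sizeof(name));   /* file name         */
            else
                H5Iget_name(ids[i], name, sizeof(name));   /* path of the object */
            printf("  id %lld: %s\n", (long long)ids[i], name);
        }
        free(ids);
    }
}

The count returned for H5F_OBJ_ALL includes the file identifier itself, so a value of 1 immediately before H5Fclose() means every other identifier has been released.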

    Quincey