Assertion fail on write

Hi,

On certain datasets, I receive the following error on a write call:

Assertion failed: entry_ptr->size < ((size_t)(10*1024*1024)), file H5C.c,
line 10619

Looking at the header file where H5C_MAX_ENTRY_SIZE is defined, I see a
comment that the user can increase or decrease it as appropriate.
This error happens on a dataset that writes 240 elements of 8 bytes each
at a time. The assertion error occurs on the 1035th write, so the size of
the dataset on disk should be around 2 MB.
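
(As a rough check, assuming each write is a single 1x240 row of 8-byte
elements: 240 * 8 = 1920 bytes per write, and 1035 * 1920 = 1,987,200
bytes, i.e. just under 2 MB.)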

Can someone explain when this kind of error shows up? Does increasing
the size resolve the issue, and are there any side effects?

Thanks,
Anal Patel
Software Engineer
Raytheon, Inc.

Hi,

On certain datasets, I receive the following error on a write call:

Assertion failed: entry_ptr->size < ((size_t)(10*1024*1024)), file H5C.c, line 10619

Looking at the header file where H5C_MAX_ENTRY_SIZE is defined, I see a comment that the user can increase or decrease it as appropriate.
This error happens on a dataset that writes 240 elements of 8 bytes each at a time. The assertion error occurs on the 1035th write, so the size of the dataset on disk should be around 2 MB.

  That's odd - what version of the HDF5 library are you using?

Can someone explain when this kind of error shows up? Does increasing the size resolve the issue, and are there any side effects?

  The metadata cache isn't directly involved with writing dataset elements, so I'm not certain that increasing the size is the correct action here. Is the dataset contiguous or chunked (or compact)? Is this occurring during the dataset creation or when writing elements to the dataset?
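
  For reference, the layout of an existing dataset can be checked with something along these lines (a minimal sketch using the HDF5 C++ API; "dataset" here stands for an already-open H5::DataSet):

      // Query the dataset creation property list for its storage layout
      H5::DSetCreatPropList cparms = dataset.getCreatePlist();
      H5D_layout_t layout = cparms.getLayout();
      // layout will be H5D_CONTIGUOUS, H5D_CHUNKED, or H5D_COMPACT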

  Quincey

···

On Feb 26, 2009, at 9:50 AM, Anal K Patel wrote:

Hi Quincey,

We are using version 1.8 of the HDF5 libraries.

The error occurs when writing the elements to the dataset using the C++
API. The dataset is created with chunking enabled (chunk dimensions of
1x240).

So the flow is: (a) collect 240 elements of data, (b) create a dataset
with chunk properties, (c) write the dataset, (d) collect another 240
elements, (e) extend the dataset (it becomes 2x240), (f) write the
dataset.
Steps (d), (e), and (f) continue to extend and write successfully 1034
times. On the 1035th iteration, the extend succeeds, but the write call
fails with the assertion error.
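
Roughly, the loop looks like the following (a minimal standalone sketch
using the HDF5 C++ API; the file name, dataset name, element type, and
iteration count are placeholders rather than our actual code):

    #include "H5Cpp.h"
    #include <vector>

    int main()
    {
        const hsize_t NCOLS = 240;

        H5::H5File file("example.h5", H5F_ACC_TRUNC);

        // (b) create a 1x240 dataset, unlimited along the first dimension,
        //     chunked as 1x240
        hsize_t dims[2]    = {1, NCOLS};
        hsize_t maxdims[2] = {H5S_UNLIMITED, NCOLS};
        H5::DataSpace initialSpace(2, dims, maxdims);

        H5::DSetCreatPropList dcpl;
        hsize_t chunkDims[2] = {1, NCOLS};
        dcpl.setChunk(2, chunkDims);

        H5::DataSet dset = file.createDataSet("data",
            H5::PredType::NATIVE_DOUBLE, initialSpace, dcpl);

        std::vector<double> buffer(NCOLS);

        // (c)-(f) write one 1x240 row per iteration, extending as needed
        for (hsize_t row = 0; row < 1200; ++row) {
            // (a)/(d) collect 240 elements (dummy values here)
            for (hsize_t i = 0; i < NCOLS; ++i)
                buffer[i] = static_cast<double>(row * NCOLS + i);

            if (row > 0) {
                // (e) extend the dataset by one more row
                hsize_t newDims[2] = {row + 1, NCOLS};
                dset.extend(newDims);
            }

            // (f) select the newly added row in the file and write to it
            H5::DataSpace fileSpace = dset.getSpace();
            hsize_t offset[2] = {row, 0};
            hsize_t count[2]  = {1, NCOLS};
            fileSpace.selectHyperslab(H5S_SELECT_SET, count, offset);

            hsize_t memDims[1] = {NCOLS};
            H5::DataSpace memSpace(1, memDims);

            dset.write(&buffer[0], H5::PredType::NATIVE_DOUBLE,
                       memSpace, fileSpace);
        }

        return 0;
    }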

Thanks for your prompt reply and help.

- Anal Patel


Hi Anal,

Hi Quincey,

We are using version 1.8 of the HDF5 libraries.

The error occurs when writing the elements to the dataset using the C++ API. The dataset is created with chunking enabled (chunk dimensions of 1x240).

So the flow is: (a) collect 240 elements of data, (b) create a dataset with chunk properties, (c) write the dataset, (d) collect another 240 elements, (e) extend the dataset (it becomes 2x240), (f) write the dataset.
Steps (d), (e), and (f) continue to extend and write successfully 1034 times. On the 1035th iteration, the extend succeeds, but the write call fails with the assertion error.

  Hmm, I can't see any particular problem with that sequence, in general. Can you send a standalone test program that duplicates the error?

  Thanks,
    Quincey

···

On Feb 26, 2009, at 11:14 AM, Anal K Patel wrote:
