"No space available for allocation" Error in H5Dwrite function

Hi Anupam,

···

--
Pieter van der Meer
Software Engineer / Scientist
E-mail: P.van.der.Meer@sron.nl
Tel. +31 (0) 30 - 2535725

SRON Netherlands Institute for Space Research
http://www.sron.nl

"Gopal, Anupam (GE Infra, Energy)" <anupam.gopal@ge.com> 06/02/08 3:38 PM >>>

HDF5 comes up with an error message saying "No space available for allocation" during the H5Dwrite() function.

We had similar experiences in the past. With the HDF5-1.6 library there was no error message at all, but with HDF5-1.8 we saw what is perhaps a similar error (an address overflow). The precise message is listed below for the sake of completeness. Could you also please mail your exact error message?

Typically, the file contains 8 datasets, with one of them being of size (13000*5*26000). Initially the dataset has size (13000*5*1) and grows in the Z direction.

We are also working with many big chunked datasets in one file. I assume you are also using arrays, not tables? Data corruption set in after a while. Our arrays were 2D, typically ~1000 floats wide and many tens of thousands of rows long. We have now tried additional tests with just one array in the HDF5 file. That works absolutely fine: the array can grow very big without any problems. But with a few dozen arrays in an HDF5 file the problem always seems to occur.

I hope this helps. Maybe others are experiencing the same problem too.

-- start of error message --

The program which fills the database gives the following stack trace:
HDF5-DIAG: Error detected in HDF5 (1.8.0) thread 3073001152:
#000: H5Dio.c line 441 in H5Dwrite(): can't write data
   major: Dataset
   minor: Write failed
#001: H5Dio.c line 740 in H5D_write(): can't write data
   major: Dataset
   minor: Write failed
#002: H5Dio.c line 1841 in H5D_chunk_write(): unable to read raw data chunk
   major: Low-level I/O
   minor: Read failed
#003: H5Distore.c line 1906 in H5D_istore_lock(): unable to read raw data chunk
   major: Low-level I/O
   minor: Read failed
#004: H5F.c line 2974 in H5F_block_read(): file read failed
   major: Low-level I/O
   minor: Read failed
#005: H5FD.c line 2061 in H5FD_read(): driver read request failed
   major: Virtual File Layer
   minor: Read failed
#006: H5FDsec2.c line 725 in H5FD_sec2_read(): addr overflow
   major: Invalid arguments to routine
   minor: Address overflowed

-- end of error message --

With kind regards,

Pieter van der Meer


Hi Pieter and Anupam,

On Jun 2, 2008, at 10:48 AM, Pieter van der Meer wrote:

"Gopal, Anupam (GE Infra, Energy)" <anupam.gopal@ge.com> 06/02/08 3:38 PM >>>

HDF5 comes up with an error message saying "No space available for allocation" during the H5Dwrite() function.

We had similar experiences in the past. With the HDF5-1.6 library there was no error message at all, but with HDF5-1.8 we saw what is perhaps a similar error (an address overflow). The precise message is listed below for the sake of completeness. Could you also please mail your exact error message?

Typically, the file contains 8 datasets, with one of them being of size (13000*5*26000). Initially the dataset has size (13000*5*1) and grows in the Z direction.

We are also working with many big chunked datasets in one file. I assume you are also using arrays, not tables? Data corruption set in after a while. Our arrays were 2D, typically ~1000 floats wide and many tens of thousands of rows long. We have now tried additional tests with just one array in the HDF5 file. That works absolutely fine: the array can grow very big without any problems. But with a few dozen arrays in an HDF5 file the problem always seems to occur.

  Hmm, we've got plenty of users who are storing many large chunked datasets in HDF5 files, so I know it's not a pervasive problem. Can you duplicate this problem with the 1.8.1 release (on its way this week)? If so, can you get us a small C program that shows the issue, so we can debug it?
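
A minimal reproducer in that spirit might look like the sketch below. This is only a sketch: the file name, dataset name, plane size, and plane count are illustrative rather than taken from either application, and it uses the 1.8 API (H5Dcreate2, H5Dset_extent) since 1.8.1 is the release to test against.

/* Sketch: one extendible chunked 3-D dataset, grown one plane at a time
 * along Z, as both reports describe. All sizes are illustrative. */
#include "hdf5.h"
#include <stdlib.h>

#define NX      1000            /* plane width (illustrative) */
#define NY      5               /* plane height (illustrative) */
#define NPLANES 5000            /* planes to append (illustrative) */

int main(void)
{
    hsize_t dims[3]    = {NX, NY, 1};
    hsize_t maxdims[3] = {NX, NY, H5S_UNLIMITED};
    hsize_t chunk[3]   = {NX, NY, 1};           /* one plane per chunk */
    hsize_t count[3]   = {NX, NY, 1};
    float  *plane = calloc(NX * NY, sizeof(float));
    hsize_t z;

    if (!plane)
        return 1;

    hid_t file  = H5Fcreate("repro.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(3, dims, maxdims);
    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 3, chunk);
    hid_t dset  = H5Dcreate2(file, "grow", H5T_NATIVE_FLOAT, space,
                             H5P_DEFAULT, dcpl, H5P_DEFAULT);

    for (z = 0; z < NPLANES; z++) {
        hsize_t newdims[3] = {NX, NY, z + 1};
        hsize_t start[3]   = {0, 0, z};

        H5Dset_extent(dset, newdims);           /* grow along Z */
        hid_t fspace = H5Dget_space(dset);
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
        hid_t mspace = H5Screate_simple(3, count, NULL);
        if (H5Dwrite(dset, H5T_NATIVE_FLOAT, mspace, fspace,
                     H5P_DEFAULT, plane) < 0)
            return 1;   /* the library prints its error stack automatically */
        H5Sclose(mspace);
        H5Sclose(fspace);
    }

    H5Dclose(dset);
    H5Pclose(dcpl);
    H5Sclose(space);
    H5Fclose(file);
    free(plane);
    return 0;
}

Creating several such datasets in one file and interleaving their writes would bring the sketch closer to the multi-dataset scenario both reports describe.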

I hope this helps. Maybe others are experiencing the same problem too.


  Anupam, is your error stack similar? (I'm expecting it's not the same)

    Quincey


Hi Quincey & Pieter,

I just printed out the stack trace for the error message. Does it make sense to you? The error does not show up initially while writing the dataset of size (13000*5*26000); it shows up while writing the 5000th chunk. If we increase the Java max heap size, the error shows up sooner, while writing the 3000th chunk. Each chunk is a plane of size (13000*5), and this chunk is added 26000 times to complete the 3-D dataset. One quick note, although it might not seem that important: the H5 file also has 7 other three-dimensional datasets, which are of smaller size and are written in a similar fashion. However, we do not experience any problems while writing those. Thanks for your help; we look forward to hearing from you.
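
For scale: assuming 4-byte floats (the element type is not stated anywhere in the thread), one 13000*5 plane is 13000 x 5 x 4 = 260,000 bytes, about 0.25 MB, so a failure at the 5000th chunk means roughly 1.3 GB of raw data had passed through the library by that point, against a reported total JVM heap of only about 45 MB.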

Total memory is: 47083520
Initial free memory: 28906904
free memory: 40158512
DatasetRdWt.H5Dwrite_wrap() with HDF5Exception: No space available for allocation
ncsa.hdf.hdf5lib.exceptions.HDF5ResourceUnavailableException: No space available for allocation
release\hdf5_vs6\hdf5\src\H5Dio.c line 958 in H5D_write(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #002: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 1996 in H5D_chunk_write(): optimized write failed
    major(15): Dataset interface
    minor(25): Write failed
  #003: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dselect.c line 654 in H5D_select_write(): write error
    major(14): Dataspace interface
    minor(25): Write failed
  #004: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 2175 in H5D_istore_writevv(): unable to read raw data chunk
    major(05): Low-level I/O layer
    minor(25): Write failed
  #005: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 1576 in H5D_istore_lock(): memory allocation failed for raw data chunk
    major(02): Resource unavailable
    minor(06): No space available for allocation
HDF5-DIAG: Error detected in HDF5 library version: 1.6.5 thread 0. Back trace follows.
  #000: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 587 in H5Dwrite(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #001: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 958 in H5D_write(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #002: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 1996 in H5D_chunk_write(): optimized write failed
    major(15): Dataset interface
    minor(25): Write failed
  #003: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dselect.c line 654 in H5D_select_write(): write error
    major(14): Dataspace interface
    minor(25): Write failed
  #004: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 2175 in H5D_istore_writevv(): unable to read raw data chunk
    major(05): Low-level I/O layer
    minor(25): Write failed
  #005: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 1576 in H5D_istore_lock(): memory allocation failed for raw data chunk
    major(02): Resource unavailable
    minor(06): No space available for allocation
HDF5-DIAG: Error detected in HDF5 library version: 1.6.5 thread 0. Back trace follows.
  #000: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 587 in H5Dwrite(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #001: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 958 in H5D_write(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #002: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 1996 in H5D_chunk_write(): optimized write failed
    major(15): Dataset interface
    minor(25): Write failed
  #003: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dselect.c line 654 in H5D_select_write(): write error
    major(14): Dataspace interface
    minor(25): Write failed
  #004: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 2175 in H5D_istore_writevv(): unable to read raw data chunk
    major(05): Low-level I/O layer
    minor(25): Write failed
  #005: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Distore.c line 1576 in H5D_istore_lock(): memory allocation failed for raw data chunk
    major(02): Resource unavailable
    minor(06): No space available for allocation
HDF5-DIAG: Error detected in HDF5 library version: 1.6.5 thread 0. Back trace follows.
  #000: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 587 in H5Dwrite(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #001: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 958 in H5D_write(): can't write data
    major(15): Dataset interface
    minor(25): Write failed
  #002: D:\fangguo\uptodate_test\hdf516_release\hdf5_vs6\hdf5\src\H5Dio.c line 1996 in H5D_chunk_write(): optimized write failed
    major(15): Dataset interface

Anupam Gopal
Energy Application & Systems Engineering
GE Energy
T 518-385-4586
F 518-385-5703
E anupam.gopal@ge.com
http://www.gepsec.com
General Electric International, Inc.


Hi Anupam,

On Jun 3, 2008, at 10:07 AM, Gopal, Anupam (GE Infra, Energy) wrote:

Hi Quincey & Pieter,

I just printed out the stack trace for the error message. Does it make sense to you? The error does not show up initially while writing the dataset of size (13000*5*26000); it shows up while writing the 5000th chunk. If we increase the Java max heap size, the error shows up sooner, while writing the 3000th chunk. Each chunk is a plane of size (13000*5), and this chunk is added 26000 times to complete the 3-D dataset. One quick note, although it might not seem that important: the H5 file also has 7 other three-dimensional datasets, which are of smaller size and are written in a similar fashion. However, we do not experience any problems while writing those. Thanks for your help; we look forward to hearing from you.

  Hmm, I can't think of anything really obvious that this error stack is showing, except that your process seems to be running out of memory somehow. If you can boil this down to a simple C program that shows the issue, we can determine whether this is a bug and fix it if so. I do find it curious that the file paths in the error stack seem to be from version 1.6.x, while you mentioned that you are using version 1.8.x when you get the error. Are you sure you've got everything configured correctly?
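
One quick way to see which library is actually loaded at run time is to ask it directly. The sketch below uses the C API (H5get_libversion and H5check_version; the output formatting is illustrative), and the Java wrapper exposes an equivalent H5get_libversion call:

/* Sketch: report the HDF5 library version actually linked at run time. */
#include "hdf5.h"
#include <stdio.h>

int main(void)
{
    unsigned maj, min, rel;

    if (H5get_libversion(&maj, &min, &rel) < 0)
        return 1;
    printf("Runtime HDF5 library: %u.%u.%u\n", maj, min, rel);

    /* Aborts with a diagnostic if the headers this program was compiled
     * against disagree with the library it is running against. */
    H5check_version(H5_VERS_MAJOR, H5_VERS_MINOR, H5_VERS_RELEASE);
    return 0;
}

If a stale 1.6.5 DLL is being picked up from the library path, the printed version will show it immediately.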

  Unfortunately, intensive debugging of user applications is beyond the effort we can expend over forum e-mail. If you'd like to sign up for one of our service contracts, we can work with you more intensively to resolve difficult application issues:

  http://www.hdfgroup.org/services/index.html

  Regards,
    Quincey

