H5Dump dataset retrieve error

Hi All,

Currently on Unix, using the hdf5-1.8.14 package (standard installation).

I have a .gh5 file with many datasets in this format:
                        DATATYPE H5T_IEEE_F32LE
                        DATASPACE SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
                        STORAGE_LAYOUT {
                           CHUNKED ( 4096 )
                           SIZE 3406 (156.200:1 COMPRESSION)
                        }
                        FILTERS {
                           COMPRESSION DEFLATE { LEVEL 4 }
                        }

I'm simply trying to extract a dataset using the command "./tools/h5dump/h5dump -d", but I keep getting this error: "h5dump error: unable to print data". The output file is created, but the data area is empty:

DATASET "/MEASURED" {
   DATATYPE H5T_IEEE_F32LE
   DATASPACE SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
   DATA {
   }
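
(For reference, the full command has this general shape; the dataset name comes from the dump above, and the file name is only a placeholder:)

    ./tools/h5dump/h5dump -d /MEASURED -o measured.txt myfile.gh5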

I have been able to run other variations and commands without any issues (h5repack, h5stat, h5dump -a/-H/-n, etc.).

When checking with "--enable-error-stack", this is the output:

*** glibc detected *** /training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump: double free or corruption (!prev): 0x0000000001a9e430 ***
======= Backtrace: =========
/lib64/libc.so.6[0x34e4275e76]
/lib64/libc.so.6[0x34e42789b3]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x407ef1]
/lib64/libc.so.6(__libc_start_main+0xfd)[0x34e421ed5d]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x4053f9]
======= Memory map: ========
...
Aborted (core dumped)

I can create a brand new file with no compression, copy the dataset from the file I can't extract from into the new file, and then use h5dump on the new file (so I don't think it's memory related). I'm leaning towards something with the original file's compression. I'm also unable to remove the compression/filter from the file; I receive this error for each dataset: "file cannot be read, deflate filter is not available".

Any help/direction/insight is much appreciated.

Thank you

Patrick,


On Jan 16, 2015, at 3:29 PM, Patrick Weinandy <pweinandy@gmail.com> wrote:

I'm simply trying to extract a dataset using the command: "./tools/h5dump/h5dump -d", however I keep getting this error: h5dump error: unable to print data.

This error usually occurs when a compression filter is not available to HDF5. The deflate filter (zlib compression) is enabled by default and is configured in if libz.* libraries are present on the build system in the /usr/lib directory. (I am assuming you are on a UNIX system.)
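
(If zlib is installed somewhere other than the default search path, configure can also be pointed at it explicitly; a sketch, with placeholder paths:

    ./configure --with-zlib=/path/to/zlib/include,/path/to/zlib/lib

A single-directory form, --with-zlib=DIR, works when the headers and libraries share one prefix.)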

Could you please check the libhdf5.settings file found under the lib directory of the HDF5 installation point? This is a text file. Check whether the line shown below contains "deflate(zlib)":
        I/O filters (external): deflate(zlib),szip(encoder)

If "deflate" is not there, you will need to rebuild HDF5 to get your data. But first, please make sure that you have zlib on your system.
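
(A quick check from the shell; the prefix here is just an example, so adjust it to your actual installation point:

    grep "I/O filters" /usr/local/hdf5/lib/libhdf5.settings

When zlib support was compiled in, the "I/O filters (external)" line will include deflate(zlib).)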

When checking with "--enable-error-stack", this is the output:

*** glibc detected *** /training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump: double free or corruption (!prev): 0x0000000001a9e430 ***
Unfortunately, I cannot reproduce this error when I use HDF5 built without zlib. Could you please contact our Helpdesk (help@hdfgroup.org) and send us your file for further investigation?

Thank you!

Elena


Yes, sorry I'm on Red Hat Enterprise Linux Server release 6.5.

I did not find any zlib/deflate text in the libhdf5.settings file. I will work on adding zlib and rebuilding HDF5.

Features:
---------
Parallel HDF5: no
High Level library: yes
Threadsafety: no
Default API Mapping: v18
With Deprecated Public Symbols: yes
I/O filters (external):
I/O filters (internal): shuffle,fletcher32,nbit,scaleoffset
MPE: no
Direct VFD: no
dmalloc: no
Clear file buffers before write: yes
Using memory checker: no
Function Stack Tracing: no
Strict File Format Checks: no
Optimization Instrumentation: no
Large File Support (LFS): yes
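
(Before rebuilding, I'll first confirm what zlib pieces are actually on the box; planning to check with something like this on RHEL 6:

    ls /usr/lib64/libz.* 2>/dev/null
    rpm -q zlib zlib-devel

The -devel package is the one the build needs, since configure looks for zlib.h as well as the library.)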

In the meantime I will contact the helpdesk. Appreciate the assistance.

Thanks,
Patrick


Adding the zlib package solved the problem! Thanks for your help.

## Checked the lib directory for libz.*
## /usr/lib64: no libz.* present

## Installed zlib (zlib-devel-1.2.3-29.el6.x86_64)
sudo yum install zlib-devel

## Reconfigured HDF5 (checked: config.log now shows
## "I/O filters (external): deflate(zlib)")
./configure --with-zlib=/usr/lib64
sudo make
sudo make check
sudo make install
sudo make check-install

## Executing ./tools/h5dump/h5dump -d no longer returns the "unable to
## print data" error and writes the data into the output file.

I'm now able to successfully dump the data out of the .gh5 file.
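
(Side note: the earlier attempt to remove the compression, which failed with "deflate filter is not available", should also work now that the filter is present. A sketch with h5repack; file names are illustrative:

    h5repack -f NONE myfile.gh5 uncompressed.h5

The -f NONE option tells h5repack to strip all filters from the datasets while copying them.)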
