Hi All,
I'm currently on Unix, using the hdf5-1.8.14 package (standard installation).
I have a .gh5 file with many datasets in this format:
DATATYPE H5T_IEEE_F32LE
DATASPACE SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
STORAGE_LAYOUT {
CHUNKED ( 4096 )
SIZE 3406 (156.200:1 COMPRESSION)
}
FILTERS {
COMPRESSION DEFLATE { LEVEL 4 }
}
I'm simply trying to extract a dataset with the command
"./tools/h5dump/h5dump -d", but I keep getting this error:

h5dump error: unable to print data

The output file is created, but the data area is empty:
DATASET "/MEASURED" {
DATATYPE H5T_IEEE_F32LE
DATASPACE SIMPLE { ( 133004 ) / ( H5S_UNLIMITED ) }
DATA {
}
}
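For reference, the full invocation looks roughly like this (the file name
here is a stand-in for the real one):

./tools/h5dump/h5dump -d /MEASURED -o measured.txt file.gh5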
I have been able to run other variations and commands without any issues
(h5repack, h5stat, h5dump -a/-H/-n, etc.).
When I check with "--enable-error-stack", this is the output:
*** glibc detected ***
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump: double free
or corruption (!prev): 0x0000000001a9e430 ***
======= Backtrace: =========
/lib64/libc.so.6[0x34e4275e76]
/lib64/libc.so.6[0x34e42789b3]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x407ef1]
/lib64/libc.so.6(__libc_start_main+0xfd)[0x34e421ed5d]
/training/config/hdf5/hdf5-1.8.14/tools/h5dump/.libs/lt-h5dump[0x4053f9]
======= Memory map: ========
...
....
Aborted (core dumped)
I can create a brand-new file with no compression, copy the dataset from
the file I can't extract from into the new file, and then run h5dump on the
new file without issue (so I don't think it's memory related). I'm leaning
towards this being something with the original file's compression. I'm
unable to remove the compression/filter from the file; I receive this error
for each dataset:

file cannot be read, deflate filter is not available
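That last error makes me wonder whether my HDF5 build was configured
without zlib, so the deflate filter was never registered. As a sanity check
I put together the small program below (my own sketch, not something from
the docs; compiled with h5cc), which asks the library whether the deflate
filter is available and whether it can decode:

#include <stdio.h>
#include "hdf5.h"

int main(void)
{
    /* Is the deflate (gzip) filter registered in this build at all? */
    htri_t avail = H5Zfilter_avail(H5Z_FILTER_DEFLATE);
    if (avail <= 0) {
        printf("deflate filter is NOT available in this HDF5 build\n");
        return 1;
    }

    /* If it is registered, can it actually encode and decode? */
    unsigned int config = 0;
    if (H5Zget_filter_info(H5Z_FILTER_DEFLATE, &config) < 0) {
        printf("H5Zget_filter_info failed\n");
        return 1;
    }
    printf("deflate available; encode: %s, decode: %s\n",
           (config & H5Z_FILTER_CONFIG_ENCODE_ENABLED) ? "yes" : "no",
           (config & H5Z_FILTER_CONFIG_DECODE_ENABLED) ? "yes" : "no");
    return 0;
}

If this reports the filter as unavailable, I'm assuming I'd need to rebuild
HDF5 against zlib (the --with-zlib configure option) and then retry
h5dump/h5repack.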
Any help/direction/insight is much appreciated.
Thank you