Hi,
After reading this post: "HDF5 Deleting Datasets AND Recovering Space", I thought I'd run h5repack over some of my datasets to see what the effect would be.
However, when I run it over one "test" dataset I get the following assertion:
h5repack: H5Dscatgath.c:501: H5D_scatgath_read: Assertion `(H5S_select_iter_nelmts(&file_iter)) == (nelmts - smine_start)' failed.
Aborted
What's interesting is that this only happens if, before I close the dataset, I call "H5Dset_extent" with the exact number of elements currently in the dataset. If I don't call this function at all, or if I set the extent to the current size + 1, then h5repack runs without error.
My main concern is to make sure I'm not "misusing" the HDF5 format and that this problem only shows up (at the moment) through h5repack.
The command I'm using:
h5repack -v -f GZIP=1 file1.qa file2.qa
And the output of the run is:
Objects to modify layout are...
Objects to apply filter are...
All with GZIP, parameter 1
Making file <file2.qa>...
···
-----------------------------------------
Type Filter (Compression) Name
-----------------------------------------
group /
group /analysis
group /analysis/filenames
dset /analysis/filenames/filenames
group /analysis/locations
h5repack: H5Dscatgath.c:501: H5D_scatgath_read: Assertion
`(H5S_select_iter_nelmts(&file_iter)) == (nelmts - smine_start)' failed.
Aborted
Here's the contents of '/analysis/locations/locations', which, at least from the output above, appears to be where the problem occurs:
GROUP "locations" {
   DATASET "locations" {
      DATATYPE H5T_COMPOUND {
         H5T_STD_U64LE "filename";
         H5T_STD_I32LE "line";
         H5T_STD_I32LE "column";
         H5T_STD_U64LE "incl_from";
         H5T_STD_I32LE "hash_line_file";
         H5T_STD_I32LE "hash_line_line";
         H5T_REFERENCE "properties";
      }
      DATASPACE SIMPLE { ( 1 ) / ( H5S_UNLIMITED ) }
      DATA {
      (0): {
            0,
            184,
            183,
            18446744073709551615,
            -1,
            NULL
         }
      }
   }
}
Thanks for your help.
Regards,
Richard