We are trying to use the h5repack.exe tool to change the chunking of a
large dataset of 6000 x 5 x 14000 floating-point values (roughly 1.7 GB,
assuming 4-byte floats). We ran the following command:
"h5repack -i 01_nerc_c3_creation14000.h5 -o 01_nerc_c3_creation14000R.h5
However, even the plain copy shown first fails with the error:
"Cannot read into memory h5repack: <01_nerc_c3_creation14000.h5>: Could
not copy data to: 01_nerc_c3_creation14000R.h5".
On the other hand, the same command works fine on smaller datasets.
Is there a workaround for this problem, or is it a known bug? It looks
like the tool cannot handle datasets beyond a certain size.
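In case it helps with diagnosis, the source dataset's header and storage
properties can be listed with h5dump, and, if our h5repack build supports
it, we can rerun with --enable-error-stack to surface the underlying HDF5
error:

"h5dump -p -H 01_nerc_c3_creation14000.h5"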
Any help is appreciated.
Energy Application & Systems Engineering
General Electric International, Inc.