All,
We are trying to use the h5repack.exe tool to change the chunking of a
large dataset comprising (6000*5*14000) floating point numbers. We used
the following command:
"h5repack -i 01_nerc_c3_creation14000.h5 -o 01_nerc_c3_creation14000R.h5
-l 01_nerc_c3_creation14000GRP/Voltage_Results:CHUNK=1x1x14000"
However, the tool fails with the error:
"Cannot read into memory h5repack: <01_nerc_c3_creation14000.h5>: Could
not copy data to: 01_nerc_c3_creation14000R.h5".
On the other hand, it worked fine with smaller datasets.
Is there a workaround for this problem, or is it a known bug? It looks
as though the tool cannot handle datasets beyond a certain size.
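For reference, the manual rechunk we would otherwise attempt looks roughly like this (a sketch assuming the h5py Python package is available; the hypothetical `rechunk` helper copies one slab at a time so the whole dataset never has to fit in memory):

```python
import h5py

def rechunk(src_path, dst_path, dset_path, chunks):
    """Copy one dataset into a new file with new chunking, slab by slab."""
    with h5py.File(src_path, "r") as fin, h5py.File(dst_path, "w") as fout:
        src = fin[dset_path]
        dst = fout.create_dataset(dset_path, shape=src.shape,
                                  dtype=src.dtype, chunks=chunks)
        # Copy along the first axis so only one (5 x 14000) slab is
        # resident in memory at a time.
        for i in range(src.shape[0]):
            dst[i] = src[i]

# e.g. rechunk("01_nerc_c3_creation14000.h5", "01_nerc_c3_creation14000R.h5",
#              "01_nerc_c3_creation14000GRP/Voltage_Results", (1, 1, 14000))
```

This is only a fallback; we would still prefer h5repack to handle it directly.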
Any help is appreciated.
Regards,
Anupam Gopal
Energy Application & Systems Engineering
GE Energy
T518-385-4586
F 518-385-5703
E anupam.gopal@ge.com
http://www.gepsec.com
General Electric International, Inc.