H5repack File Size Limits

Hello,

Is there a file size limit for h5repack? I am trying to repack 62 GB files using the command: h5repack -f SHUF -f GZIP=6 <file1> <file2>. I have tried GZIP level 1 and it still does not work. The Linux operating system (or h5repack, I am not sure which) is killing the process before it finishes. I have used h5repack on smaller files successfully.

Is this a memory issue? The machine I am working on generally has about 40-90 GB of memory free at any time.
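
I am not sure how to confirm which one is killing it; assuming it is the kernel's out-of-memory (OOM) killer, I would expect something like the following to show it in the kernel log (a guess on my part, not something I have verified yet):

    dmesg | grep -iE 'out of memory|killed process'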

Thanks,
Rob

Rob,

There shouldn't be a size limit for h5repack. Unfortunately, it is hard to say what could be wrong without information about the objects in the file. Could you please describe your data and its characteristics (dataset sizes, for example)?

You may try specifying the chunk size using the -l option and see what happens.
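
For example, something along these lines (the file names and the 512x512x10 chunk below are only placeholders for illustration; you will want a chunk shape that matches how the data is accessed):

    h5repack -l CHUNK=512x512x10 -f SHUF -f GZIP=6 input.h5 output.h5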

Elena


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Hi Elena,

Thank you for the prompt response. The files contain a series of 2D (512 x 512 points) and 3D (512 x 512 x 180 points) datasets that were created using parallel HDF5 (256 processors). h5repack seems to work fine, but then it suddenly gets killed after about 5 minutes (the smaller files that succeed take about that long).
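
(For scale, assuming 4-byte floats, each 3D dataset is 512 x 512 x 180 x 4 bytes, roughly 189 MB uncompressed, so a 62 GB file would hold on the order of 300 of them.)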

I will try the -l CHUNK option and let you know how it goes.
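
Concretely, I am planning something like this, with the chunk shape guessed from the 512 x 512 x 180 datasets (one 2D slab per chunk; I may need to tune it):

    h5repack -l CHUNK=512x512x1 -f SHUF -f GZIP=6 <file1> <file2>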

Thank you,
Rob
