Extracting a binary file from an HDF5 file using h5dump results in a file 3x too large

I have an HDF5 data file containing a single dataset called "phi", which has dimensions 350x350x450 and 32-bit float values. I am trying to use h5dump to dump the data as a binary file.

507:[mjackson@Mine:MURI_Dendrite_Dataset]$ h5dump -d /H -o H.bin dendrite_0008.h5
HDF5 "dendrite_0008.h5" {
DATASET "/H" {
   DATATYPE H5T_IEEE_F32LE
   DATASPACE SIMPLE { ( 350, 350, 450 ) / ( 350, 350, 450 ) }
   DATA {
   }
}
}

The resulting file size is 625,492,563 bytes, which is way too large; it should be 220,500,000 bytes. I am sure this is a "user error", but I cannot see what I am doing wrong. Any help would be great. The original HDF5 file is about 220 MB (which seems correct).
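For reference, the expected raw size works out to:

    350 x 350 x 450 values x 4 bytes per H5T_IEEE_F32LE value = 220,500,000 bytes

so the dump is almost 3x the size of the raw data.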

Thanks
Mike Jackson

Mike,

If you check, you will see that -o on its own created an ASCII (text) dump of the data: each float is written out as formatted text rather than as 4 raw bytes, which is why the output is roughly 3x larger than expected.

Please use the -b option with LE or BE as shown:

h5dump -d /H -b LE -o H.bin dendrite_0008.h5
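
To double-check the result, you can read the raw file back and reshape it (a minimal sketch, assuming numpy and the H.bin output path from the command above):

    import numpy as np

    # H5T_IEEE_F32LE -> little-endian 32-bit floats; h5dump -b writes the
    # dataset in row-major (C) order, so the reshape matches the dataspace.
    phi = np.fromfile("H.bin", dtype="<f4").reshape((350, 350, 450))
    print(phi.nbytes)  # expect 220500000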

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Ah hah, thanks. I knew it was something simple I was missing.

Mike J.
