I am trying to dump a data set from one of my HDF5 files. Here is the output from my command line invocation:
503:[mjackson@mine:~]$ h5dump -d /VoxelDataContainer/CELL_DATA/GrainIds -o /Users/Shared/Data/Ang_Data/Small_IN100_Output/GrainIds.bin /Users/Shared/Data/Ang_Data/Small_IN100_Output/Small_IN100.dream3d
HDF5 "/Users/Shared/Data/Ang_Data/Small_IN100_Output/Small_IN100.dream3d" {
DATASET "/VoxelDataContainer/CELL_DATA/GrainIds" {
   DATATYPE  H5T_STD_I32LE
   DATASPACE  SIMPLE { ( 4444713 ) / ( 4444713 ) }
   DATA {
   }
   ATTRIBUTE "NumComponents" {
      DATATYPE  H5T_STD_I32LE
      DATASPACE  SIMPLE { ( 1 ) / ( 1 ) }
      DATA {
      (0): 1
      }
   }
   ATTRIBUTE "ObjectType" {
      DATATYPE  H5T_STRING {
         STRSIZE 19;
         STRPAD H5T_STR_NULLTERM;
         CSET H5T_CSET_ASCII;
         CTYPE H5T_C_S1;
      }
      DATASPACE  SCALAR
      DATA {
      (0): "DataArray<int32_t>"
      }
   }
}
}
Looking at the output file "GrainIds.bin", the file size is 26,800,344 bytes, which is confusing: taking the DATASPACE size from above (4,444,713 elements) and multiplying by 4 (the number of bytes in an H5T_STD_I32LE) gives 17,778,852 bytes. This is on hdf5-1.8.10-patch1, compiled on OS X 10.8.2 using the latest Xcode installation from Apple.
Thoughts?
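For reference, the size arithmetic in the question can be written out as a quick sanity check (the element count comes from the DATASPACE line above; H5T_STD_I32LE is a 4-byte integer):

```python
# Expected size of a raw binary dump of the dataset.
# Both numbers are taken from the h5dump header above.
n_elements = 4444713          # DATASPACE SIMPLE { ( 4444713 ) }
bytes_per_element = 4         # H5T_STD_I32LE is a 4-byte integer
expected_size = n_elements * bytes_per_element
print(expected_size)          # 17778852, not the 26,800,344 observed
```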
···
___________________________________________________________
Mike Jackson Principal Software Engineer
BlueQuartz Software Dayton, Ohio
mike.jackson@bluequartz.net www.bluequartz.net
I used this:
h5dump --dataset=/VoxelDataContainer/CELL_DATA/GrainIds --output=/Users/Shared/Data/Ang_Data/Small_IN100_Output/GrainIds.bin --binary=NATIVE /Users/Shared/Data/Ang_Data/Small_IN100_Output/Small_IN100.dream3d
HDF5 "/Users/Shared/Data/Ang_Data/Small_IN100_Output/Small_IN100.dream3d" {
DATASET "/VoxelDataContainer/CELL_DATA/GrainIds" {
   DATATYPE  H5T_STD_I32LE
   DATASPACE  SIMPLE { ( 4444713 ) / ( 4444713 ) }
   DATA {
   }
}
}
and got what I expected. Not really sure what the difference was other than the --binary=NATIVE argument, but I would have thought that was implied by the -o option?
Sorry for the noise.
···
On Feb 20, 2013, at 4:35 PM, Michael Jackson <mike.jackson@bluequartz.net> wrote:
Using --binary dumps the data in binary format, so you will get the exact size. Using -o alone dumps the data as text, which contains extra information (indices, delimiters, etc.), so the size will be different.
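A NATIVE binary dump like the one above can be read back directly to verify it. This is only a sketch, not part of the h5dump tooling: it assumes the file contains nothing but little-endian 32-bit integers (true for an H5T_STD_I32LE dataset dumped with --binary=NATIVE on a little-endian machine), and the GrainIds.bin path in the comment is the one from this thread.

```python
import struct

def read_int32_dump(path):
    """Read a raw binary dump of little-endian 32-bit integers,
    e.g. one produced by h5dump --binary=NATIVE, into a list."""
    with open(path, "rb") as f:
        raw = f.read()
    count = len(raw) // 4  # H5T_STD_I32LE: 4 bytes per element
    return list(struct.unpack("<%di" % count, raw))

# Hypothetical usage with the file from this thread:
# grain_ids = read_int32_dump(
#     "/Users/Shared/Data/Ang_Data/Small_IN100_Output/GrainIds.bin")
# len(grain_ids) should then equal 4444713
```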
···
On 2/20/2013 3:46 PM, Michael Jackson wrote:
_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org