Hello HDF users, I hope someone can help me with this issue.
I am new to HDF5 but am an experienced netCDF user (in case that matters). I am trying to read in a dataset and then write that data out in binary format. I am comparing my output to h5dump's binary output, but I cannot use that utility exclusively because I will need to do other things with this data before writing it out. For now this is a straight read followed by a binary write.
Running od on my output binary file gives data that looks like this:
000000 062632 000000 062763 000000 063016 000000 063303 000000
000020 063325 000000 063361 000000 063330 000000 063314 000000
000040 063347 000000 063217 000000 063065 000000 063173 000000
Whereas od on the h5dump binary file looks like this:
000000 062632 062763 063016 063303 063325 063361 063330 063314
000020 063347 063217 063065 062173 063053 063011 063163 063143
000040 063267 063506 063140 063124 063316 063011 063126 063211
I see there is a padding feature in HDF5, and I'm wondering whether that is what is causing the issue and, if so, how to get rid of it. The code I've written uses the HDF5 C++ libraries, but the code reading the binary file is in Fortran, and it is reading the data values as 26010, 0, 26099, 0, etc., which is understandable based on the first octal dump. Also, my h5dump binary files are exactly half the size of my binary output, which also makes sense. The datatype is PredType::NATIVE_INT, which says this is a 4-byte integer, but the size of the h5dump file suggests a 2-byte integer.
Can someone explain why this is the case and how to fix it? Do I need to use H5Tconvert before writing my binary file?
Below are a few snippets of my code.
H5File* hfile = new H5File(file, H5F_ACC_RDONLY);
DataSet* mgtdata = new DataSet(mgtgroup->openDataSet(varname.c_str()));
int hdfdata[dim*dim];                         // dims are from the DataSpace using getSimpleExtentDims()
mgtdata->read(hdfdata, PredType::NATIVE_INT); // NATIVE_INT found with getIntType()
// ^^ This data appears to be fine; hdfdata[0] and hdfdata[1] are both non-zero values.
ofstream binfile(outfile.c_str(), ios::out | ios::binary);
binfile.write((const char*) hdfdata, sizeof(hdfdata));
// ... close files and delete ...
I have combed through the archives looking for similar issues. I feel like I'm missing something and I'm hoping someone can help me pinpoint what that is. Thanks for any help you can provide.