I have been using the same set of HDF5 code for years, and I recently came across a data file that is causing some issues. The dataset is a “simple” type with a single point. I have a templated C++ function that just reads a single scalar dataset like this one. I have had no problems with it in the past, but with this particular data file I am getting a rank value of 0. Here is my code:
template <typename T>
static herr_t readScalarDataset(hid_t loc_id,
                                const std::string& dsetName,
                                T& data)
{
  H5SUPPORT_MUTEX_LOCK()
  hid_t did;
  herr_t err = 0;
  herr_t retErr = 0;
  hid_t spaceId;
  hid_t dataType = H5Lite::HDFTypeForPrimitive(data);
  if (dataType == -1)
  {
    return -1;
  }
  /* Open the dataset. */
  did = H5Dopen(loc_id, dsetName.c_str(), H5P_DEFAULT);
  if (did < 0)
  {
    std::cout << "H5Lite.h::readScalarDataset(" << __LINE__ << ") Error opening Dataset at loc_id (" << loc_id << ") with object name (" << dsetName << ")" << std::endl;
    return -1;
  }
  if (did >= 0)
  {
    spaceId = H5Dget_space(did);
    if (spaceId > 0)
    {
      htri_t isSimple = H5Sis_simple(spaceId);
      std::cout << "isSimple: " << isSimple << std::endl;
      H5S_class_t extentType = H5Sget_simple_extent_type(spaceId);
      std::cout << "extentType: " << extentType << std::endl;
      hssize_t npoints = H5Sget_simple_extent_npoints(spaceId);
      std::cout << "npoints: " << npoints << std::endl;
      int32_t rank = H5Sget_simple_extent_ndims(spaceId);
      if (rank > 0)
      {
        std::vector<hsize_t> dims;
        dims.resize(rank); // Allocate enough room for the dims
        err = H5Sget_simple_extent_dims(spaceId, &(dims.front()), nullptr);
        hsize_t numElements = 1;
        for (std::vector<hsize_t>::iterator iter = dims.begin(); iter < dims.end(); ++iter)
        {
          numElements = numElements * (*iter);
        }
        err = H5Dread(did, dataType, H5S_ALL, H5S_ALL, H5P_DEFAULT, &data);
        if (err < 0)
        {
          std::cout << "Error Reading Data at loc_id (" << loc_id << ") with object name (" << dsetName << ")" << std::endl;
          retErr = err;
        }
      }
      else
      {
        std::cout << "Error Reading Data at loc_id (" << loc_id << ") with object name (" << dsetName << ") RANK=0." << std::endl;
        retErr = -100;
      }
      err = H5Sclose(spaceId);
      if (err < 0)
      {
        std::cout << "Error Closing Data Space at loc_id (" << loc_id << ") with object name (" << dsetName << ")" << std::endl;
        retErr = err;
      }
    }
    else
    {
      retErr = spaceId;
    }
    err = H5Dclose(did);
    if (err < 0)
    {
      std::cout << "Error Closing Dataset at loc_id (" << loc_id << ") with object name (" << dsetName << ")" << std::endl;
      retErr = err;
    }
  }
  return retErr;
}
When I execute the line “int32_t rank = H5Sget_simple_extent_ndims(spaceId);” I get a value of 0. Inspecting the data with HDFView, the file does not look any different (though clearly it is) from other data files that I have read in the past. Is there a logic issue in my code? Maybe I am not using the proper API? This is with the HDF 1.8.20 version of the library.
The HDF5 file was given to me by a commercial data acquisition system. HDFView seems to have no problems with the file. In fact, I have another function that is designed to read a “pointer” dataset (really suited to large arrays of data), and using it to read a single value does work. I am guessing I am just using an HDF5 API incorrectly, but I am lost as to which one. Any help is appreciated.
Thanks
Mike Jackson
BlueQuartz Software