HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 140169838303040:


Hi guys,
I am trying to read an HDF5 dataset named mask that is stored in a file created by a Python script.
The relevant Python instructions are:

mask = h5_fid.create_dataset('mask', dtype='u1', shape=dum.shape)
mask[:] = dum[:]

When I try to read this dataset with a C program, I get the following errors:

HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 139713380624192:
#000: H5Dio.c line 199 in H5Dread(): can't read data
major: Dataset
minor: Read failed
#001: H5Dio.c line 467 in H5D__read(): unable to set up type info
major: Dataset
minor: Unable to initialize object
#002: H5Dio.c line 968 in H5D__typeinfo_init(): not a datatype
major: Invalid arguments to routine
minor: Inappropriate type

The code that I am using is the following:

#include "hdf5.h"

#define FILE        "mask_196_130.h5"
#define DATASETNAME "mask"
#define NX_SUB 2    /* hyperslab dimensions */
#define NY_SUB 3
#define NX     2    /* output buffer dimensions */
#define NY     3
#define RANK     2
#define RANK_OUT 2

int
main (void)
{
    hid_t       file, dataset;          /* handles */
    hid_t       datatype, dataspace;
    hid_t       memspace;
    H5T_class_t class;                  /* datatype class */
    H5T_order_t order;                  /* data order */
    size_t      size;                   /* size of the data element
                                           stored in file */
    hsize_t     dimsm[2];               /* memory space dimensions */
    hsize_t     dims_out[2];            /* dataset dimensions */
    herr_t      status;

    int8_t      data_out[NX][NY];       /* output buffer */

    hsize_t     count[2];               /* size of the hyperslab in the file */
    hsize_t     offset[2];              /* hyperslab offset in the file */
    hsize_t     count_out[2];           /* size of the hyperslab in memory */
    hsize_t     offset_out[2];          /* hyperslab offset in memory */
    int         i, j, status_n, rank;

    for (j = 0; j < NX; j++)
        for (i = 0; i < NY; i++)
            data_out[j][i] = 0;

    /*
     * Open the file and the dataset.
     */
    file    = H5Fopen(FILE, H5F_ACC_RDONLY, H5P_DEFAULT);
    dataset = H5Dopen(file, DATASETNAME, H5P_DEFAULT);

    /*
     * Get datatype and dataspace handles and then query
     * dataset class, order, size, rank and dimensions.
     */
    datatype = H5Dget_type(dataset);      /* datatype handle */
    class    = H5Tget_class(datatype);
    if (class == H5T_INTEGER) printf("Data set has type H5T_INTEGER \n");
    order    = H5Tget_order(datatype);
    if (order == H5T_ORDER_LE) printf("Little endian order \n");

    size = H5Tget_size(datatype);
    printf(" Data size is %d \n", (int)size);

    dataspace = H5Dget_space(dataset);    /* dataspace handle */
    rank      = H5Sget_simple_extent_ndims(dataspace);
    status_n  = H5Sget_simple_extent_dims(dataspace, dims_out, NULL);
    printf("rank %d, dimensions %lu x %lu \n", rank,
           (unsigned long)(dims_out[0]), (unsigned long)(dims_out[1]));

    /*
     * Define hyperslab in the dataset.
     */
    offset[0] = 0;
    offset[1] = 0;
    count[0]  = NX_SUB;
    count[1]  = NY_SUB;
    status = H5Sselect_hyperslab(dataspace, H5S_SELECT_SET, offset, NULL,
                                 count, NULL);

    /*
     * Define the memory dataspace.
     */
    dimsm[0] = NX;
    dimsm[1] = NY;
    memspace = H5Screate_simple(RANK_OUT, dimsm, NULL);

    /*
     * Define memory hyperslab.
     */
    offset_out[0] = 0;
    offset_out[1] = 0;
    count_out[0]  = NX_SUB;
    count_out[1]  = NY_SUB;
    status = H5Sselect_hyperslab(memspace, H5S_SELECT_SET, offset_out, NULL,
                                 count_out, NULL);

    /*
     * Read data from hyperslab in the file into the hyperslab in
     * memory and display.
     */
    status = H5Dread(dataset, H5T_INTEGER, memspace, dataspace,
                     H5P_DEFAULT, data_out);
    printf("%d\n", status);
    for (j = 0; j < NX; j++) {
        for (i = 0; i < NY; i++) printf("%d ", data_out[j][i]);
        printf("\n");
    }

    return 0;
}


The array data_out seems to be properly declared, but the dataset is clearly not being read.
Are there any suggestions?
Many thanks in advance.


H5T_INTEGER is not a datatype ID (of type hid_t), but a datatype class (of type H5T_class_t).

That’s what the error message says:

#002: H5Dio.c line 968 in H5D__typeinfo_init(): not a datatype
major: Invalid arguments to routine
minor: Inappropriate type

Pre-defined integer datatypes are listed here.



Many thanks, Gheber. I solved it by reading with H5T_NATIVE_CHAR. Now it works properly.
Best regards :slight_smile:


You are very welcome. I’d recommend you upgrade to HDF5 1.10.9 if you can, or ask the people in charge to do it for you.

Best, G.