Dear All,
I am using parallel HDF5 to write compound datasets from biological
simulations. I am trying to use a compound datatype with an array member
whose size is only known at runtime (I found a similar, unanswered question
<http://hdf-forum.184993.n3.nabble.com/C-or-C-create-a-compound-datatype-of-double-arrays-with-runtime-size-determination-td4025678.html>).
I have searched the mailing list but did not find a solution. I am providing
all the information below, along with a complete example. Please let me know
if I am making a mistake somewhere.
The compound datatype example
<https://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/hdf5-examples/1_8/C/H5T/h5ex_t_cmpd.c>
from the tutorial writes the dataset below (this is a simple *serial* test example):
typedef struct {
    double temperature;
    double pressure;
    char *location;
    int serial_no;
} sensor_t;
I am slightly modifying this example to write:
typedef struct {
    double temperature;
    double pressure;
    int *location;   // note that a compile-time fixed-size array (e.g. int location[38]) works fine!
    int serial_no;
} sensor_t;
Note that the length of the location array is known at runtime and is the
*same for all instances of sensor_t* (so I assume I don't need a
variable-length datatype, which is not supported by parallel HDF5 anyway).
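For reference, the attached program fills the write buffer roughly like this
(a minimal sketch with illustrative values; wdata and nloc are just the names
used here, and malloc needs <stdlib.h> -- the full code is in the attached
h5ex_t_cmpd.c):

int nloc = 4;         /* length of location, determined at runtime */
sensor_t wdata[2];    /* two records, as in the tutorial example */

for (int i = 0; i < 2; i++) {
    wdata[i].temperature = 50.0 + i;
    wdata[i].pressure    = 20.0 + i;
    wdata[i].location    = malloc(nloc * sizeof(int));  /* runtime-sized array */
    for (int j = 0; j < nloc; j++)
        wdata[i].location[j] = 10 * j;
    wdata[i].serial_no   = 1000 + i;
}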
So I create the compound datatype as follows:
// first create a datatype for the array member
locdims[0] = nloc;   // nloc is the runtime size of the location array
strtype = H5Tarray_create(H5T_NATIVE_INT, 1, locdims);

// I add the size of the location array, otherwise I get an error:
// H5T__insert(): member extends past end of compound type
memtype = H5Tcreate(H5T_COMPOUND, sizeof(sensor_t) + nloc * sizeof(int));

status = H5Tinsert(memtype, "Temperature (F)",
                   HOFFSET(sensor_t, temperature), H5T_NATIVE_DOUBLE);
status = H5Tinsert(memtype, "Pressure (inHg)",
                   HOFFSET(sensor_t, pressure), H5T_NATIVE_DOUBLE);
status = H5Tinsert(memtype, "Location",
                   HOFFSET(sensor_t, location), strtype);

/* for the serial_no field I can't use HOFFSET(sensor_t, serial_no)
 * because location is a pointer to a 1-d array, so I calculate the
 * offset manually:
 */
int serial_no_offset = HOFFSET(sensor_t, location) + nloc * sizeof(int);
status = H5Tinsert(memtype, "Serial number",
                   serial_no_offset, H5T_NATIVE_INT);
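For completeness, the dataspace/dataset creation and the write look roughly
like this in my test (again only a sketch; file, wdata and memtype are the
names assumed above, and the dataset is created with the same memtype for
brevity):

hsize_t dims[1] = {2};
hid_t   space   = H5Screate_simple(1, dims, NULL);
hid_t   dset    = H5Dcreate(file, "DS1", memtype, space,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
status = H5Dwrite(dset, memtype, H5S_ALL, H5S_ALL, H5P_DEFAULT, wdata);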
Now, if I write and then read back the above dataset through my program, I
get the correct values! But if I look at the generated HDF5 file with h5dump,
it shows invalid values:
*$ ./a.out *
Dynamic Allocating nlocs 4
DS1[0]:
Serial number : 1153
Location : 0
Location : 10
Location : 20
Location : 30
Temperature (F) : 53.230000
Pressure (inHg) : 24.570000
DS1[1]:
Serial number : 1184
Location : 0
Location : 10
Location : 20
Location : 30
Temperature (F) : 55.120000
Pressure (inHg) : 22.950000
*$ h5dump h5ex_t_cmpd.h5*
HDF5 "h5ex_t_cmpd.h5" {
GROUP "/" {
DATASET "DS1" {
DATATYPE H5T_COMPOUND {
H5T_IEEE_F64LE "Temperature (F)";
H5T_IEEE_F64LE "Pressure (inHg)";
H5T_ARRAY { [4] H5T_STD_I32LE } "Location";
H5T_STD_I32LE "Serial number";
}
DATASPACE SIMPLE { ( 2 ) / ( 2 ) }
DATA {
(0): {
53.23,
24.57,
[ -515879600, 32684, 1153, 32767 ],
687194767
},
(1): {
6.93572e-310,
5.84974e-321,
[ 0, 0, 4, 0 ],
2
}
}
}
}
}
I have attached the test program to this email. The fixed-size array version
works fine; you can compile the attached test as:
gcc h5ex_t_cmpd.c -DSTATIC -I/include path lib_link
*To reproduce the above issue (with the dynamically allocated array):*
gcc h5ex_t_cmpd.c -I/include path lib_link
Any help will be appreciated!
Regards,
Pramod
h5ex_t_cmpd.c (5.58 KB)