Purpose of h5tcreate and h5tinsert

Hi,

I’m still working on reading a dataset created by FLASH. I found the code COMPOUNDEXAMPLE on the bitbucket site, but there are several aspects of it that are confusing to me.
It starts with creating a file (h5fcreate_f) and a dataspace (h5screate_simple_f), then creates a compound datatype. To create the datatype there are calls to h5tget_size_f, which I understand as necessary to get the size of the compound datatype. Then h5tcreate_f is used to create the datatype. That much I think I understand.
Then h5tinsert_f is called to insert fields into the datatype(?). Then h5dcreate_f is called to create the dataset. After that, h5tcreate_f and h5tinsert_f are called again for each field in the compound datatype. Why?

If I’m just wanting to read data do I need to go through all of those steps? From my experimenting, it would seem not, but I’d really like to understand why all those steps are included and whether they all are necessary.

Thanks,
Jon

If you use F2003 instead, it’s easier. H5Tinsert, etc., are needed to describe the derived type to HDF5; otherwise, HDF5 does not know how to read the data into the derived type. You should just need to use H5Tinsert with the offset function (HOFFSET in C; h5offsetof in the Fortran 2003 interface) to create the compound datatype in memory, then use this memory datatype to read the data.
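A minimal sketch of that pattern, assuming a derived type with one 80-character string member and one integer member; the file name, dataset name, member names, and array length here are placeholders, not taken from Jon’s file:

```fortran
! Sketch: read a compound dataset with the F2003 HDF5 interface.
! Error checking is omitted for brevity.
PROGRAM read_compound_sketch
  USE HDF5
  USE ISO_C_BINDING
  IMPLICIT NONE

  TYPE sensor_t                      ! hypothetical record layout
     CHARACTER(LEN=80) :: name
     INTEGER           :: serial_no
  END TYPE sensor_t

  TYPE(sensor_t), DIMENSION(:), ALLOCATABLE, TARGET :: rdata
  INTEGER(HID_T) :: file, dset, memtype, strtype
  TYPE(C_PTR)    :: f_ptr
  INTEGER        :: hdferr

  CALL h5open_f(hdferr)
  CALL h5fopen_f("example.h5", H5F_ACC_RDONLY_F, file, hdferr)
  CALL h5dopen_f(file, "dset", dset, hdferr)
  ALLOCATE(rdata(17))

  ! Fixed-length 80-character string type for the string member.
  CALL h5tcopy_f(H5T_FORTRAN_S1, strtype, hdferr)
  CALL h5tset_size_f(strtype, INT(80, SIZE_T), hdferr)

  ! Compound type describing the derived type in memory; the
  ! distance between two array elements gives the record size,
  ! and h5offsetof gives each member's byte offset.
  CALL h5tcreate_f(H5T_COMPOUND_F, &
       H5OFFSETOF(C_LOC(rdata(1)), C_LOC(rdata(2))), memtype, hdferr)
  CALL h5tinsert_f(memtype, "name", &
       H5OFFSETOF(C_LOC(rdata(1)), C_LOC(rdata(1)%name)), strtype, hdferr)
  CALL h5tinsert_f(memtype, "serial_no", &
       H5OFFSETOF(C_LOC(rdata(1)), C_LOC(rdata(1)%serial_no)), &
       H5T_NATIVE_INTEGER, hdferr)

  ! Read straight into the derived-type array.
  f_ptr = C_LOC(rdata(1))
  CALL h5dread_f(dset, memtype, f_ptr, hdferr)

  CALL h5dclose_f(dset, hdferr)
  CALL h5fclose_f(file, hdferr)
  CALL h5close_f(hdferr)
END PROGRAM read_compound_sketch
```

Note that for reading, only the *memory* datatype is built this way; the second round of h5tcreate_f/h5tinsert_f calls in the write example builds a separate *file* datatype, which a pure reader does not need.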

Are you referring to the h5ex_t_cmpd_F03.f90 example,
https://support.hdfgroup.org/HDF5/examples/api-fortran.html

No, the code I’m referring to is at
https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5/browse/fortran/examples/compound.f90
I’ve tried to follow that example as closely as possible but so far without success. I’m trying to read a compound data type in a file that consists of 80 character strings and standard 4 byte integers. There’s an array of 17 of these. The program segfaults while producing output that appears to show that the variables are the wrong size - each successive string is shifted several spaces. The output from h5dump would indicate that the variable sizes I’m using are the right ones. So I’m at a loss.
https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5/browse/fortran/examples/compound.f90
I’ve tried to follow that example as closely as possible, but so far without success. I’m trying to read a compound datatype in a file that consists of 80-character strings and standard 4-byte integers; there’s an array of 17 of these. The program segfaults while producing output that appears to show that the variables are the wrong size: each successive string is shifted several spaces. The output from h5dump would indicate that the variable sizes I’m using are the right ones, so I’m at a loss.
Why is it so simple in Python but so complex in Fortran? In Python I can just do f = h5py.File(filename, 'r'), dset = f['dset_name'].
I guess no one has written a higher level wrapper to query and then read data in Fortran.

Jon

It’s not a Fortran issue; you would still have to do the same procedure in C. Correct, there are no high-level APIs in C or Fortran that handle compound datatypes. The only thing I can think of, without looking at the code and file, is that you are not correctly reading the C string. You might take a look at this example, h5ex_t_stringC_F03.f90, at the previous link.
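The "each string shifted a few spaces" symptom usually means the string member's size (or a member offset) in the memory compound type doesn't match the derived type, so every later record starts a little off. For a member that h5dump reports as an 80-byte string, the memory string type would be built along these lines (a sketch; the member name and offsets are placeholders):

```fortran
! Sketch: an 80-byte fixed-length string type for a compound
! member. If this size is wrong, every subsequent record in the
! read buffer is shifted by the difference.
INTEGER(HID_T) :: strtype
INTEGER        :: hdferr

CALL h5tcopy_f(H5T_FORTRAN_S1, strtype, hdferr)
CALL h5tset_size_f(strtype, INT(80, SIZE_T), hdferr)
! Then pass strtype to h5tinsert_f at the member's actual
! byte offset within the derived type (via h5offsetof).
```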

It could also be a bug in the Fortran interface.

Well I went back to that example code that you mentioned, h5ex_t_cmpd_F03.F90, and tried again to follow it (I had tried it earlier). This time it worked! So I can now read the data that I want to. Thanks for giving me that impetus.
I still don’t know what went wrong with the other example, but I’ve already spent more time than I wanted to on it.

Regards,
Jon

Please send us (help@hdfgroup.org) your file (or h5dump output that includes datatype) and we will create a Fortran example.

Thank you!

Elena

Hi Elena,

Sure, I could do that. The data file is rather large (1.4M). Given that, would the dump be better? I could edit it to a smaller size.
One thing about the examples: I find there’s a bigger emphasis on writing than on reading. In some cases it’s unclear to me whether all the steps involved in writing the file are also necessary for reading. So it might be nice to have examples aimed just at reading (probably paired with the examples that write the files).

Jon

Hi Jon,

Attached is an example program that reads a compound dataset. Is this helpful?

-Barbara
help@hdfgroup.org

rdcompd.f90 (2.2 KB)
compd.h5 (67.5 KB)

Hi Barbara,

Yes! That’s much simpler than the other way, which also worked for me but with many more calls (h5tcreate_f, h5tset_size_f, h5tinsert_f, …). I wasn’t aware of the h5dget_type_f and h5tget_native_type_f subroutines.
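For anyone finding this thread later, the simpler pattern is roughly the following (a sketch based on the above discussion, not the attached rdcompd.f90 itself; file and dataset names are placeholders):

```fortran
! Sketch: let the library derive the memory type from the file
! type, instead of hand-assembling it with h5tcreate_f/h5tinsert_f.
INTEGER(HID_T) :: file, dset, filetype, memtype
INTEGER        :: hdferr

CALL h5fopen_f("compd.h5", H5F_ACC_RDONLY_F, file, hdferr)
CALL h5dopen_f(file, "dset", dset, hdferr)
CALL h5dget_type_f(dset, filetype, hdferr)
CALL h5tget_native_type_f(filetype, H5T_DIR_ASCEND_F, memtype, hdferr)
! memtype now describes the compound records in native memory
! layout and can be passed straight to h5dread_f.
```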

Thanks,
Jon