Greetings,
I'm working my way through the tutorials, and so far I've been able to create valid files containing a dataset, but I can't seem to create, let alone add, attributes to that dataset.
I'm working on a Scientific Linux system with version 1.8.5.patch1 of the hdf5 package, and version 2.10.1 of the HDFView with Java library package.
The line of code that produced the exception is:
int attribute_id = H5.H5Acreate(dataset_id, attr_name,
HDF5Constants.H5T_STD_U64LE, dataspace_id,
HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);
The exception reported is:
Oct 07, 2014 7:02:17 AM hdf5learn.Hdf5Learn main
SEVERE: null
ncsa.hdf.hdf5lib.exceptions.HDF5ObjectHeaderException: Unable to
initialize object
at ncsa.hdf.hdf5lib.H5._H5Acreate2(Native Method)
at ncsa.hdf.hdf5lib.H5.H5Acreate(H5.java:618)
at hdf5learn.Hdf5Learn.addAttribute(Hdf5Learn.java:100)
at hdf5learn.Hdf5Learn.main(Hdf5Learn.java:71)
I've confirmed that dataset_id and dataspace_id are valid: if I remove the attribute code, the file containing the dataset is written correctly and verifies with HDFView, h5stat, and the Python routines that will use the data.
I suspect my problem is either the data type (I eventually need attributes of double and String types) or the order in which I'm doing things. In pseudocode, the test looks like this:
fileId = H5.H5Fcreate(...)
dataspaceId = H5.H5Screate_simple(...)
if (fileId >= 0 && dataspaceId >= 0)
{
    datasetId = H5.H5Dcreate(...)
    if (datasetId >= 0)
    {
        attributeId = H5.H5Acreate(...)  <-- This is where the exception is thrown
        <fill and write attribute stuff that never gets called because of the exception>
        H5.H5Dwrite(...)
        H5.H5Dclose(datasetId)
        H5.H5Fclose(...)
    }
}
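For reference, here is a fuller sketch of what I'm attempting, written against the 1.8-era ncsa.hdf.hdf5lib API. The file name, dataset name, attribute name, and dimensions are placeholders I made up for this post; the one deliberate difference from my real code is that the attribute gets its own scalar dataspace instead of reusing the dataset's dataspace (I don't know whether that reuse matters, but I wanted to show the full intended flow):

```java
import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;

public class AttrSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder file name and dimensions for this sketch.
        int fileId = H5.H5Fcreate("test.h5", HDF5Constants.H5F_ACC_TRUNC,
                HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);

        long[] dims = { 10 };
        int dataspaceId = H5.H5Screate_simple(1, dims, null);

        int datasetId = H5.H5Dcreate(fileId, "/data",
                HDF5Constants.H5T_STD_U64LE, dataspaceId,
                HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT,
                HDF5Constants.H5P_DEFAULT);

        // A separate scalar dataspace for the attribute, rather than
        // reusing the dataset's dataspace.
        int attrSpaceId = H5.H5Screate(HDF5Constants.H5S_SCALAR);
        int attributeId = H5.H5Acreate(datasetId, "my_attr",
                HDF5Constants.H5T_STD_U64LE, attrSpaceId,
                HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);

        // Write a single value into the attribute, then close
        // everything in reverse order of creation.
        long[] attrValue = { 42L };
        H5.H5Awrite(attributeId, HDF5Constants.H5T_NATIVE_UINT64, attrValue);

        H5.H5Aclose(attributeId);
        H5.H5Sclose(attrSpaceId);
        H5.H5Dclose(datasetId);
        H5.H5Sclose(dataspaceId);
        H5.H5Fclose(fileId);
    }
}
```

(This sketch requires the native HDF5 library on the load path, so I haven't been able to strip it down further; it compiles against HDF-Java 2.10.1.)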
I have also tried creating the attribute after writing the dataset's data, with the same result.
Any hints on what could be wrong or links to documentation or examples I have missed would be greatly appreciated.
Best,
Joe