The documentation indicates that the size limits on an attribute are the same as for a dataset, except that an attribute larger than 64K will not be stored in the object header:
https://support.hdfgroup.org/HDF5/faq/limits.html
However, I am finding that attributes of 64K or larger cannot be created at all using HDF-Java. Are they not stored as "dense" attributes outside the object header? Do I need to create the attribute in a different way when it is 64K or larger?
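In case the answer is that a different creation path is needed, here is my best guess at what that would look like (a minimal sketch, assuming dense attribute storage requires the newer file format and that the Java wrapper exposes H5Pset_libver_bounds; I have not confirmed either assumption):

    import ncsa.hdf.hdf5lib.H5;
    import ncsa.hdf.hdf5lib.HDF5Constants;

    // Guess: request the latest file format so that attributes too large
    // for the object header can be stored densely. H5Pset_libver_bounds
    // is the HDF5 1.8 C call; I am assuming the wrapper exposes it.
    int fapl_id = H5.H5Pcreate(HDF5Constants.H5P_FILE_ACCESS);
    H5.H5Pset_libver_bounds(fapl_id, HDF5Constants.H5F_LIBVER_LATEST,
            HDF5Constants.H5F_LIBVER_LATEST);
    int file_id = H5.H5Fcreate("large_attr.h5", HDF5Constants.H5F_ACC_TRUNC,
            HDF5Constants.H5P_DEFAULT, fapl_id);

Is something along those lines required, or should the library handle this transparently?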
Any attribute of 64K or larger fails to be written, with the following (or similar) stack trace:
ncsa.hdf.hdf5lib.exceptions.HDF5ObjectHeaderException: Unable to initialize object
    at ncsa.hdf.hdf5lib.H5._H5Acreate2(Native Method)
    at ncsa.hdf.hdf5lib.H5.H5Acreate(H5.java:622)
    at ncsa.hdf.object.h5.H5File.writeAttribute(H5File.java:1757)
    at ncsa.hdf.object.h5.H5ScalarDS.writeMetadata(H5ScalarDS.java:1114)
The above was generated by modifying the ArrayAttribute example here:
https://support.hdfgroup.org/ftp/HDF5/hdf-java/current/src/unpacked/src/examples/datatypes/H5ObjectEx_T_ArrayAttribute.java
changing the dimensions to give a 64K attribute:
private static final int DIM0 = 64;
private static final int ADIM0 = 16;
private static final int ADIM1 = 8;
private static final int NDIMS = 2;
(the element size is 8 bytes, so the attribute totals 64 × 16 × 8 elements × 8 bytes = 65,536 bytes = 64K)
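For reference, the attribute-creation path the modified example goes through looks roughly like this (trimmed from the linked example, with error handling and cleanup elided; the file name, "DS1"/"A1", and the scalar dataspace for the dataset are stand-ins for the example's own setup):

    import ncsa.hdf.hdf5lib.H5;
    import ncsa.hdf.hdf5lib.HDF5Constants;

    long[] dims = { DIM0 };          // DIM0 attribute elements
    long[] adims = { ADIM0, ADIM1 }; // each element is an ADIM0 x ADIM1 array

    int file_id = H5.H5Fcreate("repro.h5", HDF5Constants.H5F_ACC_TRUNC,
            HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);
    int scalar_id = H5.H5Screate(HDF5Constants.H5S_SCALAR);
    int dataset_id = H5.H5Dcreate(file_id, "DS1", HDF5Constants.H5T_STD_I32LE,
            scalar_id, HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT,
            HDF5Constants.H5P_DEFAULT);

    // Array datatype of 64-bit integers: DIM0 * ADIM0 * ADIM1 * 8 bytes total.
    int filetype_id = H5.H5Tarray_create(HDF5Constants.H5T_STD_I64LE,
            NDIMS, adims);
    int space_id = H5.H5Screate_simple(1, dims, null);

    // This is the call that throws HDF5ObjectHeaderException once the
    // total attribute size reaches 64K.
    int attr_id = H5.H5Acreate(dataset_id, "A1", filetype_id, space_id,
            HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);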
Modifying the dimensions to give a 63K array (63 × 16 × 8 elements × 8 bytes = 64,512 bytes), the example works as expected:
private static final int DIM0 = 63;
private static final int ADIM0 = 16;
private static final int ADIM1 = 8;
private static final int NDIMS = 2;
Jarom Nelson; x33953
Computer Scientist, NIF, LLNL