64K Limit on Attribute size in HDF-Java?

The documentation indicates that the size limits on an attribute are the same as for a dataset, but that an attribute larger than 64K will not be stored in the object header.
https://support.hdfgroup.org/HDF5/faq/limits.html

However, I am finding that attributes 64K or larger cannot be created using HDF-Java. Are they not stored as "dense" attributes outside the object header? Do I need to create the attribute in a different way if 64K or larger?

Any attribute larger than 64K will fail to be written with the following (or similar) stack trace:
ncsa.hdf.hdf5lib.exceptions.HDF5ObjectHeaderException: Unable to initialize object
                at ncsa.hdf.hdf5lib.H5._H5Acreate2(Native Method)
                at ncsa.hdf.hdf5lib.H5.H5Acreate(H5.java:622)
                at ncsa.hdf.object.h5.H5File.writeAttribute(H5File.java:1757)
                at ncsa.hdf.object.h5.H5ScalarDS.writeMetadata(H5ScalarDS.java:1114)

The trace above was generated from the ArrayAttribute example here:
https://support.hdfgroup.org/ftp/HDF5/hdf-java/current/src/unpacked/src/examples/datatypes/H5ObjectEx_T_ArrayAttribute.java

with the dimensions modified to give a 64K array:
    private static final int DIM0 = 64;
    private static final int ADIM0 = 16;
    private static final int ADIM1 = 8;
    private static final int NDIMS = 2;

(each element is 8 bytes, so the attribute totals 64 × 16 × 8 × 8 = 65,536 bytes)

Modified instead to make a 63K byte array, the example works as expected:
    private static final int DIM0 = 63;
    private static final int ADIM0 = 16;
    private static final int ADIM1 = 8;
    private static final int NDIMS = 2;
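For reference, the two configurations work out to exactly 64 KiB and 63 KiB of attribute payload. A minimal sketch of the arithmetic (the helper name is hypothetical; it just multiplies the dimensions from the example by the 8-byte element size):

```java
public class AttrSize {
    // Total bytes for DIM0 elements of an ADIM0 x ADIM1 array type
    // whose base type is elemBytes bytes wide.
    static long attrBytes(int dim0, int adim0, int adim1, int elemBytes) {
        return (long) dim0 * adim0 * adim1 * elemBytes;
    }

    public static void main(String[] args) {
        System.out.println(attrBytes(64, 16, 8, 8)); // 65536 bytes = 64K: fails
        System.out.println(attrBytes(63, 16, 8, 8)); // 64512 bytes = 63K: works
    }
}
```

This matches the compact-storage limit: an attribute message in the object header may not reach 64K, which is exactly where the example starts failing.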

Jarom Nelson; x33953
Computer Scientist, NIF, LLNL

Hi Jarom,

I entered JAVA-1960 for this issue to be investigated further. It's not clear whether you can create dense attributes with the object package or not.

Dense attributes were new in HDF5-1.8. If you set the library version bounds to use the latest version, you
should be able to create an attribute greater than 64K.

In C, you would call H5Pset_libver_bounds with H5F_LIBVER_LATEST on a file access property list and create the file with that property list, like this:

  fapid = H5Pcreate(H5P_FILE_ACCESS);
  status = H5Pset_libver_bounds(fapid, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);
  file_id = H5Fcreate(FILENAME, H5F_ACC_TRUNC, H5P_DEFAULT, fapid);
  status = H5Pclose(fapid);
  ...

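The C setup above maps to the low-level JHI5 bindings roughly as follows (an untested sketch, assuming the HDF5 native library is loadable and the HDF-Java 2.x ncsa.hdf.hdf5lib package, where identifiers are ints; the file name is arbitrary):

```java
import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;

public class LatestFormatFile {
    public static void main(String[] args) throws Exception {
        // Create a file access property list and request the latest file
        // format, which permits dense attribute storage for attributes >= 64K.
        int fapl = H5.H5Pcreate(HDF5Constants.H5P_FILE_ACCESS);
        H5.H5Pset_libver_bounds(fapl, HDF5Constants.H5F_LIBVER_LATEST,
                                HDF5Constants.H5F_LIBVER_LATEST);

        int fileId = H5.H5Fcreate("latest.h5", HDF5Constants.H5F_ACC_TRUNC,
                                  HDF5Constants.H5P_DEFAULT, fapl);
        H5.H5Pclose(fapl);
        // ... create objects and large attributes against fileId ...
        H5.H5Fclose(fileId);
    }
}
```

The key point is that the file must be *created* with the file access property list already carrying the version bounds; setting the bounds after creation may not change how existing object headers store attributes.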
I did find the H5Pset_libver_bounds API in the Java docs, so this should work with the JHI5.
If you encounter any issues let us know!

-Barbara
help@hdfgroup.org

···

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Nelson, Jarom
Sent: Tuesday, March 07, 2017 2:07 PM
To: hdf-forum@lists.hdfgroup.org
Subject: [Hdf-forum] 64K Limit on Attribute size in HDF-Java?
Adding a call to FileFormat.setLibBounds(int,int) to the ArrayAttribute example:

            file = new H5File(FILENAME, FileFormat.CREATE);
            file.setLibBounds(HDF5Constants.H5F_LIBVER_LATEST, HDF5Constants.H5F_LIBVER_LATEST);
            file.open();

I still get the same behavior: fails with 64K, passes with 63K.

The versions I'm using:
HDF5 1.8.14
HDF-Java 2.11.0

Jarom

···

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Barbara Jones
Sent: Wednesday, March 8, 2017 12:43 PM
To: HDF Users Discussion List <hdf-forum@lists.hdfgroup.org>
Subject: Re: [Hdf-forum] 64K Limit on Attribute size in HDF-Java?
