Hi,
I am using the HDF5 plugin for ImageJ to view large 3D image arrays and ran into a "Length is too large" error. After a bit of digging, it looks like the error comes from the fact that, for one-dimensional arrays, the dimension is narrowed from long to int (see HDF5Utils.java below):
// HDF5Utils.java
static int getOneDimensionalArraySize(final long[] dimensions)
{
    assert dimensions != null;
    if (dimensions.length == 0) // Scalar data space needs to be treated differently
    {
        return 1;
    }
    if (dimensions.length != 1)
    {
        throw new HDF5JavaException("Data Set is expected to be of rank 1 (rank="
                + dimensions.length + ")");
    }
    final int length = (int) dimensions[0];
    if (length != dimensions[0])
    {
        throw new HDF5JavaException("Length is too large (" + dimensions[0] + ")");
    }
    return length;
}
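For illustration, here is a minimal sketch (with a hypothetical dataset size) of how the narrowing cast trips the check whenever the single dimension exceeds Integer.MAX_VALUE (2^31 - 1, about 2.1 billion elements):

// Hypothetical 1D dataset of ~3 billion elements
final long[] dimensions = { 3_000_000_000L };
final int length = (int) dimensions[0];   // narrows to -1294967296 in Java
// length != dimensions[0], so "Length is too large (3000000000)" is thrown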
Would it be possible to switch all dimensionality-related variables to long in jHDF5? Viewing data sets this large is quite common in the tomography field.
Thanks,
Chen