I have code that works really well to write a dataset from a Java array. The problem is that I hit the size limit on an int after about 20 million records and get an error. I would like to keep this code and somehow get around the limit. My dset_data is a byte array wrapped in a ByteBuffer, and a Java byte array has to be sized with an int. One record is 105 bytes, so somewhere past 20 million records the total byte count overflows Integer.MAX_VALUE.
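For context, here is the arithmetic that trips the limit (the 105-byte record size comes from my Demand_Datatype; the 20.5 million record count is roughly where I start failing):

```java
public class IntLimit {
    public static void main(String[] args) {
        final long recordSize = 105L;       // bytes per record, from Demand_Datatype.getDataSize()
        final long records = 20_500_000L;   // roughly where the failure starts
        long totalBytes = records * recordSize; // computed in a long, so no overflow here
        // A Java array length is an int, so new byte[n] caps n at Integer.MAX_VALUE (2,147,483,647).
        System.out.println(totalBytes);
        System.out.println(totalBytes > Integer.MAX_VALUE); // true: 2,152,500,000 does not fit
    }
}
```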
// the cast binds to dims[0], so the multiply is done in int arithmetic and
// silently overflows once dims[0] * 105 exceeds Integer.MAX_VALUE
dset_data = new byte[(int) dims[0] * Demand_Datatype.getDataSize()];
ByteBuffer outBuf = ByteBuffer.wrap(dset_data);
outBuf.order(ByteOrder.nativeOrder());
for (int indx = 0; indx < (int) dims[0]; indx++) {
    object_data[indx].writeBuffer(outBuf, indx * Demand_Datatype.getDataSize());
}
// Write the Data Set
try {
    if ((dataset_id >= 0) && (memtype_id >= 0))
        H5.H5Dwrite(dataset_id, memtype_id, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                HDF5Constants.H5P_DEFAULT, dset_data);
} catch (Exception e) {
    e.printStackTrace();
}