Hi, I am trying to write a dataset into an H5 file with the Java object package, using
long[] CHUNKs = { 1, 1 };
H5File.createScalarDS(name, group, c.getResult(), dims, null, CHUNKs, 9, data);
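If I read the FileFormat API correctly, the arguments to createScalarDS are (name, parentGroup, datatype, dims, maxdims, chunks, gzipLevel, data), i.e. { 1, 1 } is the chunk layout and 9 is the maximum gzip compression level.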
In doing so, I get the following error:
[main] INFO hdf.hdf5lib.H5 - HDF5 library: hdf5_java
[main] INFO hdf.hdf5lib.H5 - successfully loaded from java.library.path
Aug. 18, 2025 1:50:45 PM ....hdf5.func.dataset.creator.scalar.HDF5DatasetCreator simple
SEVERE: null
java.lang.Exception: failed to write to scalar dataset: No space available for allocation
at hdf.object.h5.H5ScalarDS.write(H5ScalarDS.java:900)
at hdf.object.h5.H5ScalarDS.create(H5ScalarDS.java:1865)
at hdf.object.h5.H5File.createScalarDS(H5File.java:1669)
at hdf.object.FileFormat.createScalarDS(FileFormat.java:1261)
at ....hdf5.func.dataset.creator.scalar.HDF5DatasetCreator.simple(HDF5DatasetCreator.java:187)
at ....hdf5.func.dataset.creator.scalar.HDF5DatasetCreator.run(HDF5DatasetCreator.java:126)
Caused by: java.lang.Exception: No space available for allocation
at hdf.object.h5.H5ScalarDS.scalarDatasetCommonIO(H5ScalarDS.java:1159)
at hdf.object.h5.H5ScalarDS.write(H5ScalarDS.java:896)
... 85 more
A similar problem seems to be reported in the HDF Forum threads "“No space available for allocation” Error in H5Dwrite function" (Hdf-forum archives) and "Memory allocation failed for raw data chunk" (HDF5 Library).
Do I have to increase the JVM heap space? Or is this an issue caused by the Windows DLLs? How can I avoid it? The dataset is rather big, but nowhere near the limits of the RAM or disk space available on the system. Can I write the dataset in chunks to avoid this behavior? Are there any examples of doing that?
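From the examples shipped with the object package I understand that a dataset can be created empty (by passing null as the data argument) and then written slice by slice through the selection arrays returned by init()/getStartDims()/getSelectedDims(). Is something like the following sketch the intended way? (The file name, shape, chunk size, gzip level, and fill values are made up for illustration; I have not verified this end to end.)

import java.util.Arrays;

import hdf.object.Dataset;
import hdf.object.Datatype;
import hdf.object.FileFormat;
import hdf.object.Group;
import hdf.object.h5.H5File;

public class ChunkedWriteSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical shape: 8192 x 256 doubles, written 1024 rows at a time.
        long[] dims   = { 8192L, 256L };
        long[] chunks = { 1024L, 256L };   // one chunk = one block of rows, not { 1, 1 }

        H5File file = new H5File("chunked.h5", FileFormat.CREATE);
        file.open();
        Group root = (Group) file.getRootObject();
        Datatype dtype = file.createDatatype(Datatype.CLASS_FLOAT, 8,
                Datatype.NATIVE, Datatype.NATIVE);

        // data == null: the dataset is created, but nothing is written yet.
        Dataset dset = file.createScalarDS("bigdata", root, dtype, dims, null, chunks, 6, null);

        dset.init(); // initializes the selection; the arrays below are live handles into it
        long[] start    = dset.getStartDims();
        long[] selected = dset.getSelectedDims();

        long rowsPerBlock = chunks[0];     // dims[0] is a multiple of this in the example
        selected[0] = rowsPerBlock;
        selected[1] = dims[1];

        double[] block = new double[(int) (rowsPerBlock * dims[1])];
        for (long row = 0; row < dims[0]; row += rowsPerBlock) {
            start[0] = row;
            start[1] = 0;
            Arrays.fill(block, (double) row); // placeholder for the real block data
            dset.write(block);               // writes only the selected hyperslab
        }
        file.close();
    }
}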
I am using the object package from HDFView 3.3.2.
