I have a memory problem when writing large chunked datasets in a loop. The process consumes all available RAM and crashes near the end. This is the output from the HDF5 library:
#000: C:\Data\09_C\hdf5-1.8.20\src\H5Dio.c line 322 in H5Dwrite(): can't prepare for writing data
  minor: Write failed
#001: C:\Data\09_C\hdf5-1.8.20\src\H5Dio.c line 403 in H5D__pre_write(): can't write data
#002: C:\Data\09_C\hdf5-1.8.20\src\H5Dio.c line 846 in H5D__write(): can't write data
#003: C:\Data\09_C\hdf5-1.8.20\src\H5Dchunk.c line 2224 in H5D__chunk_write(): unable to read raw data chunk
  major: Low-level I/O
  minor: Read failed
#004: C:\Data\09_C\hdf5-1.8.20\src\H5Dchunk.c line 3093 in H5D__chunk_lock(): memory allocation failed for raw data chunk
  major: Resource unavailable
  minor: No space available for allocation
I have carefully checked that I close all objects except the dataset, which I keep open until the end of the program.
When I close and reopen the dataset on each iteration the situation improves, but memory still grows on every call to H5Dwrite. I also check H5Fget_obj_count on each iteration; it returns 1 (the root file instance), which is as expected.
The memory is freed only when H5Fclose is called at the end of the program.
Did I miss something?