TL;DR
I am seeing memory leaks from several H5D functions used when I extend and write to a dataset. I assume the problem is on my end, but my code is fairly boilerplate.
Questions:

- Are there returned/duplicated structures I am not cleaning up, or is this a known problem?
- Is this related to my dataset extension/chunking?
- Is this an expected growth of the dataset structure (represented by `dset_id`, which I assume is held in memory) as we extend / add chunks?
Full Description
I have a program that uses the HDF5 C API (local CMake build) on Windows x64, developed with VS 2019 and MSVC.
I have a method that extends and writes to a dataset. However, there appears to be a memory leak stemming from my use of HDF5, specifically these H5D functions:

- `H5Dwrite`
- `H5Dset_extent`
- `H5Dget_space`

Functions from other parts of the library, e.g. `H5Sselect_hyperslab`, do not appear to leak in the same way.
Memory profiling:
Using two snapshots in VS 2019:
The code:

```cpp
// extend dataset along the first dimension
cur_dims.front() += bufLen;
ret = H5Dset_extent(dset_id, cur_dims.data());
assert(ret >= 0);

// get the dataspace of the extended dataset (a new identifier each call)
space_id = H5Dget_space(dset_id);
assert(space_id >= 0);

// select the newly appended hyperslab in the dataset dataspace
std::vector<hsize_t> slab_start(rank, 0);
slab_start.front() = cur_dims.front() - bufLen;
std::vector<hsize_t> slab_count{ bufLen };
slab_count.insert(std::end(slab_count), std::begin(extra_dims), std::end(extra_dims));
ret = H5Sselect_hyperslab(space_id, H5S_SELECT_SET, slab_start.data(), NULL, slab_count.data(), NULL);
assert(ret >= 0);

// write the buffer into the selected region
ret = H5Dwrite(dset_id, mem_datatype, mem_space_id, space_id, H5P_DEFAULT, buf);
assert(ret >= 0);
```