Memory leak in getSpace?

Hi,

I'm using HDF5 1.8.15-patch1 (C++ API; shared library version 10.0.1) on Debian Wheezy.

I wrote a very simple test program to try to isolate a memory leak I'm observing in my code.

#include <H5Cpp.h>
#include <chrono>
#include <thread>

using namespace H5;
using std::this_thread::sleep_for;
using std::chrono::milliseconds;

(...)
H5File file(...);

for (int i = 0; i < 1000; i++)
{
    DataSet ds = file.openDataSet("MyDS"); // (1) intentionally kept inside the loop [see note below]
    DataSpace dataspace(ds.getSpace());    // (2)

    // doSomething

    dataspace.close();                     // (3)
    ds.close();                            // (4)

    sleep_for(milliseconds(20));
}

The program runs for about 20 seconds, and I see its physical memory usage climb steadily the whole time. If line (2) is commented out, the memory footprint stays flat for the entire run.
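
One experiment that might narrow this down (a minimal sketch; checkWithCApi is just an illustrative name): fetch the dataspace handle through the C API with H5Dget_space() and release it with H5Sclose(), bypassing the C++ DataSpace wrapper entirely. If memory stays flat with this variant, the leak would point at the C++ wrapper rather than the underlying C library.

#include <H5Cpp.h>
#include <hdf5.h> // C API: H5Dget_space, H5Sclose

using namespace H5;

void checkWithCApi(H5File& file)
{
    for (int i = 0; i < 1000; i++)
    {
        DataSet ds = file.openDataSet("MyDS");

        // C-level equivalent of ds.getSpace(), returned as a raw hid_t
        hid_t sid = H5Dget_space(ds.getId());
        if (sid >= 0)
        {
            // ... query the dataspace via H5Sget_simple_extent_* as needed ...
            H5Sclose(sid); // release the handle explicitly
        }

        ds.close();
    }
}

If this version also leaks, the problem would be below the C++ layer.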

Notes:

- According to the 1.8.16 release notes (https://www.hdfgroup.org/ftp/HDF5/current/src/unpacked/release_docs/RELEASE.txt), that version removes some memory leaks in the C++ API. Would updating also fix this getSpace() call? (See the version-check sketch after these notes for confirming which library is actually linked.)

- There is an old Stack Overflow post that shows exactly the problem I'm having (http://stackoverflow.com/questions/13443689/why-does-this-code-leak-simple-codesnippet), but it has no conclusive answer.

- I was originally writing data to a dataset inside the loop above. Then I read this page (https://www.hdfgroup.org/HDF5/faq/perfissues.html, third bullet in the "Open Objects" section) and noticed I was not closing the dataspace. Unfortunately, closing it explicitly had no effect; memory still keeps going up.

- If I call file.getObjCount() inside the loop, it returns 1. file.getObjCount(H5F_OBJ_FILE) also returns 1, which suggests the file itself is the only object left open. (See the per-type count sketch after these notes.)

- The snippet above is just an example: the loop body (not the loop itself) actually lives inside a wrapper that abstracts the HDF5 library away from the client, so I don't know whether the wrapper's users will call it inside a loop. That's why I intentionally keep lines (1), (2), (3), and (4) inside the loop.
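
Regarding the 1.8.16 note above, here is a minimal sketch (using the documented H5Library::getLibVersion() call) to confirm which HDF5 library the binary is actually linked against at run time:

#include <H5Cpp.h>
#include <iostream>

int main()
{
    unsigned majnum = 0, minnum = 0, relnum = 0;
    // Report the version of the HDF5 library linked at run time.
    H5::H5Library::getLibVersion(majnum, minnum, relnum);
    std::cout << "Linked HDF5 version: "
              << majnum << "." << minnum << "." << relnum << std::endl;
    return 0;
}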
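
And regarding the getObjCount() note, this is the per-type sweep I could run inside the loop (a sketch; as far as I can tell from the reference manual, dataspace identifiers are not file-attached objects, so a leaked dataspace id would not appear in any of these counts):

ssize_t nFiles = file.getObjCount(H5F_OBJ_FILE);    // expect 1 (the file itself)
ssize_t nSets  = file.getObjCount(H5F_OBJ_DATASET); // expect 0 after ds.close()
ssize_t nGrps  = file.getObjCount(H5F_OBJ_GROUP);
ssize_t nTypes = file.getObjCount(H5F_OBJ_DATATYPE);
ssize_t nAttrs = file.getObjCount(H5F_OBJ_ATTR);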

Could anyone please shed some light on this?

Thanks!
Carlos Rocha