[BUG] Memory Leak when Slicing Dataset


#1

Hello!

To reproduce:

  1. Compile the attached test1176.c.

  2. Run it.
    Memory usage stays reasonable while creating the data file, but explodes
    quickly when reading the data back with stride=2 (see the sketch below
    for the access pattern).

Originally reported in https://github.com/h5py/h5py/issues/1176.
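
In essence, the reproducer does the following (minimal sketch only; the file name,
dataset name, and sizes here are illustrative, and the attached test1176.c is the
authoritative version): write a 2-D dataset of doubles row by row, then read each
row back through a hyperslab selection with stride 2 along the columns. The memory
growth shows up in the strided read loop.

/*
 * Sketch of the reproducer (illustrative names/sizes; see attached test1176.c).
 * Phase 1 writes a 2-D dataset row by row; memory stays flat.
 * Phase 2 reads each row back with stride 2 along the columns; with the
 * affected library versions, resident memory grows on every strided H5Dread.
 */
#include "hdf5.h"

#define NROWS 10000
#define NCOLS 1000

int main(void)
{
    hsize_t dims[2]   = {NROWS, NCOLS};
    hsize_t start[2]  = {0, 0};
    hsize_t stride[2] = {1, 2};
    hsize_t rcount[2] = {1, NCOLS / 2};   /* strided read: every 2nd column */
    hsize_t wcount[2] = {1, NCOLS};       /* full-row write */
    double  wbuf[NCOLS] = {0};
    double  rbuf[NCOLS / 2];
    hid_t   file, dset, fspace, mspace;
    hsize_t i;

    /* Create the file and a 2-D dataset of doubles. */
    fspace = H5Screate_simple(2, dims, NULL);
    file   = H5Fcreate("test1176.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    dset   = H5Dcreate2(file, "data", H5T_NATIVE_DOUBLE, fspace,
                        H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Sclose(fspace);

    /* Phase 1: write row by row -- memory usage stays reasonable here. */
    mspace = H5Screate_simple(2, wcount, NULL);
    for (i = 0; i < NROWS; i++) {
        fspace   = H5Dget_space(dset);
        start[0] = i;
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, wcount, NULL);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, wbuf);
        H5Sclose(fspace);
    }
    H5Sclose(mspace);

    /* Phase 2: read back with stride = 2 -- memory explodes in this loop. */
    mspace = H5Screate_simple(2, rcount, NULL);
    for (i = 0; i < NROWS; i++) {
        fspace   = H5Dget_space(dset);
        start[0] = i;
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, stride, rcount, NULL);
        H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, rbuf);
        H5Sclose(fspace);
    }
    H5Sclose(mspace);

    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}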

Best wishes,
Andrey Paramonov

test1176.c (1.82 KB)


#2

Hi Andrey,

I entered bug HDFFV-10709 for this issue.
You can log in to jira.hdfgroup.org to follow its status.

Thanks!
-Barbara


#3

Please try the code from the hyperslab_updates branch: https://bitbucket.hdfgroup.org/projects/HDFFV/repos/hdf5/browse?at=refs%2Fheads%2Fhyperslab_updates.

Thank you!

Elena


#4

Hi Elena!

Works fine for me, albeit with the following patch (the LocalAlloc result in H5S__hyper_op_gen() was being cast to the wrong pointer type):

diff --git a/src/H5Shyper.c b/src/H5Shyper.c
index d4e7be02fe..70dd1b4fd1 100644
--- a/src/H5Shyper.c
+++ b/src/H5Shyper.c
@@ -546,7 +546,7 @@ H5S__hyper_op_gen(void)
         /* No associated value with current thread - create one */
 #ifdef H5_HAVE_WIN_THREADS
         /* Win32 has to use LocalAlloc to match the LocalFree in DllMain */
-        op_gen = (H5CX_node_t **)LocalAlloc(LPTR, sizeof(uint64_t));
+        op_gen = (uint64_t *)LocalAlloc(LPTR, sizeof(uint64_t));
 #else
         /* Use HDmalloc here since this has to match the HDfree in the
          * destructor and we want to avoid the codestack there.

Thanks for fixing it!

Best wishes,
Andrey Paramonov


#5

Hi Andrey,
Thanks for the patch; I’ve committed it to the branch.

	Quincey