H5 JNI memory leak

Hi,

I use the Java HDF5 interface (JNI), but am struggling with what appears to be some kind of JNI memory leak. My use of the H5 JNI interface is encapsulated in a "writer" class that I have created; this writer class simply creates an H5 file. I am using H5.getOpenIDCount() to test for any leaked ids once I have created my file:

//when I start, I know there are no open ids
Assert.assertEquals(0, H5.getOpenIDCount());

//next, create my h5 file (my use of JNI is encapsulated within this "HdfWriter" class)
HdfWriter writer = HdfWriterFactory.getCsvHdfWriter(
        tempCsvFile.getAbsolutePath(), tempHdfFile.getAbsolutePath());
HdfWriterFactory.createHdfFile(writer);
    
//finally, make sure all ids are closed
Assert.assertEquals(0, H5.getOpenIDCount());

Everything seems fine: I start with 0 open HDF ids, my file is created, and at the end I also have 0 open ids. However, if I run the code above in a loop, my Java process continues to grow in memory until it crashes. I have triple-checked that this is not a heap memory issue: I have watched it in jconsole and examined heap dumps of the process, and I am not getting any java.lang.OutOfMemoryError. The heap never grows above 60 or 70 MB, yet the Java process itself is consuming over 1.1 GB of memory (I am running on Linux). I'm not sure what else could be consuming this memory other than something in the JNI layer.
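For what it's worth, here is roughly how I am observing the growth: I run the writer in a loop and read VmRSS from /proc/self/status (Linux-specific). HdfWriter and HdfWriterFactory are my own classes, and the file paths below are just placeholders:

import java.io.BufferedReader;
import java.io.FileReader;
import ncsa.hdf.hdf5lib.H5;

public class RssLoop {

    // Parse "VmRSS:   12345 kB" out of /proc/self/status
    static long rssKb() throws Exception {
        BufferedReader r = new BufferedReader(new FileReader("/proc/self/status"));
        try {
            String line;
            while ((line = r.readLine()) != null) {
                if (line.startsWith("VmRSS:")) {
                    return Long.parseLong(line.replaceAll("\\D+", ""));
                }
            }
            return -1;
        } finally {
            r.close();
        }
    }

    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 10000; i++) {
            HdfWriter writer = HdfWriterFactory.getCsvHdfWriter(
                    "/tmp/input.csv", "/tmp/output.h5");
            HdfWriterFactory.createHdfFile(writer);
            if (i % 100 == 0) {
                // open id count stays at 0, but RSS keeps climbing
                System.out.println("iter=" + i
                        + " openIds=" + H5.getOpenIDCount()
                        + " rssKb=" + rssKb());
            }
        }
    }
}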

If H5.getOpenIDCount() is returning 0, is it still possible that memory allocated in the H5 JNI layer has not been released back to the OS? I have also tried H5.H5garbage_collect(), but it doesn't appear to have any effect. I am using HDF-Java 2.6 and didn't see any known issues relating to memory leaks. Any suggestions on how to debug this would be greatly appreciated.
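In case it is relevant, the per-iteration cleanup I have experimented with looks like the snippet below; the H5close()/H5open() pair is only an isolation experiment I am considering (to rule out libhdf5's internal free lists), not something I would keep in production:

// run after each file is written and all ids are closed
H5.H5garbage_collect(); // release free-listed memory inside libhdf5
H5.H5close();           // shut the library down completely (experiment only)
H5.H5open();            // reinitialize before the next iteration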

Regards,
Chris

Chris,

Thank you very much for reporting the issue. Before we look into it, we
need more information from you.

Without knowing what is inside HdfWriterFactory.createHdfFile(), it is
hard to know where the memory leak is. It would be great if you could isolate
the code so that we can compile it and reproduce the problem.
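For example, a stripped-down loop along these lines, which uses only the raw JNI wrappers and none of your writer code, would tell us whether a plain create/close cycle already grows on your system (the file path is a placeholder):

import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;

public class LeakRepro {
    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 100000; i++) {
            // create (truncating) and immediately close an empty HDF5 file
            int fid = H5.H5Fcreate("/tmp/repro.h5",
                    HDF5Constants.H5F_ACC_TRUNC,
                    HDF5Constants.H5P_DEFAULT,
                    HDF5Constants.H5P_DEFAULT);
            H5.H5Fclose(fid);
        }
    }
}

If this loop stays flat, the leak is likely inside createHdfFile(); if it grows, we can look at the JNI layer itself.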

Thanks
--pc

