Group Heap Leak in HDF-Java 2.7

Greetings,

I've recently noticed a problem with the group heap size in the current
distribution. I created a sample program that demonstrates the condition
that I am seeing. Here is the write-up of the situation.

The problem doesn't seem to happen in older versions of HDF-Java.
Please let me know if I am missing something in the new API.


-------------------------------
The problem occurs in
- hdf-java-2.7-bin for Windows 64-bit and Java 1.6.0_26-b03 for Windows 64-bit
- hdf-java-2.7-bin for Windows 32-bit and Java jdk1.6.0_25 for Windows 32-bit

Fortunately, it seems that the problem does not occur in
- hdf-java-2.6.1-bin for Windows 64-bit and Java 1.6.0_26-b03 for Windows 64-bit

-------------------------------
I wrote a Java program called H5GroupHeapMemoryLeak that demonstrates this
problem. The demo runs two tests.

Test 1 - working test with _no_ leak
step1. open file
step2. create a unique group
step3. write 20 datasets with 10k values each under the group
step4. close file
step5. repeat steps 1-4 25 times

Test 2 - demonstrates the Group Heap Leak problem
step1. open file
step2. create a group called "levelOneGroup" or get it from the file
step3. create a unique group
step4. write 20 datasets with 10k values each under the group
step5. close file
step6. repeat steps 1-5 25 times

The difference between Test 1 and Test 2 seems to be the reuse of the shared
"levelOneGroup" group in Test 2. In that case, we must call
fileFormat.get(String) to retrieve the group, and it seems that this call is
what causes the heap growth.
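
To make the difference concrete, here is a minimal sketch of the two
group-acquisition paths, distilled from the full program below:

import ncsa.hdf.object.FileFormat;
import ncsa.hdf.object.Group;

public class GroupPathsSketch {

  // Test 1 (no leak): a fresh group is created directly under root
  // on each iteration.
  static Group goodPath(FileFormat testFile, Group root, int i) throws Exception {
    return testFile.createGroup("group" + i, root);
  }

  // Test 2 (leak): the shared parent is re-fetched via fileFormat.get()
  // on each iteration, then the unique group is created underneath it.
  static Group leakPath(FileFormat testFile, Group root, int i) throws Exception {
    Group levelOneGroup = (Group) testFile.get("levelOneGroup");
    if (levelOneGroup == null) {
      levelOneGroup = testFile.createGroup("levelOneGroup", root);
    }
    return testFile.createGroup("group" + i, levelOneGroup);
  }
}
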
-------------------------------
The output of the program is the following:

good file size: 202392
leak file size: 23187400

-------------------------------
The output of h5stat shows the difference in the Group Heap: the B-tree sizes
are comparable, but the heap grows from under 10 KB to over 11 MB.

good file size: 202392
File space information for file metadata (in bytes):
        Groups:
                B-tree/List: 48584
                Heap: 9984

leak file size: 23187400
File space information for file metadata (in bytes):
        Groups:
                B-tree/List: 49456
                Heap: 11544088

-------------------------------
Here is the code for this demonstration. I based the demo on some tests that
were already in the distribution (excuse the lack of documentation).

import java.util.List;

import ncsa.hdf.object.Dataset;
import ncsa.hdf.object.Datatype;
import ncsa.hdf.object.FileFormat;
import ncsa.hdf.object.Group;
import ncsa.hdf.object.h5.H5File;

import com.referentia.liveaction.server.core.nfstore.impl.hdf5.util.HDF5SdfConstants;

/**
 * Implements an example of a memory leak in the groups.
 */
public class H5GroupHeapMemoryLeak {

  private static final boolean USEGET_TO_FIND_GROUP = true;

  public static void main(String args[]) throws Exception {
    // create the file and add groups and datasets into the file
    int repeats = 25;
    int datasets = 20;
    int datasetSize = 10000;

    H5GroupHeapMemoryLeak test = new H5GroupHeapMemoryLeak();
    long goodFileSize = test.runGoodDemo(repeats, datasets, datasetSize);
    long leakFileSize = test.runMemoryLeakDemo(repeats, datasets, datasetSize);

    System.out.println("good file size: " + goodFileSize);
    System.out.println("leak file size: " + leakFileSize);
  }

  private long runMemoryLeakDemo(int repeats, int datasets, int datasetSize) throws Exception {
    String fileName = H5GroupHeapMemoryLeak.class.getSimpleName()
        + "-LEAK" + "-" + System.currentTimeMillis() + "-repeats" + repeats
        + "-datasets" + datasets + "-datasetSize" + datasetSize + ".h5";
    createFile(fileName);
    long finalFileSize = 0;
    for (int i = 0; i < repeats; i++) {
      FileFormat testFile = getFile(fileName);
      testFile.open();

      Group datasetGroup = getGroupWithMemoryLeak(testFile, "group" + i);
      writeToDatasetGroup(testFile, datasetGroup, datasets, datasetSize);

      testFile.close();
      finalFileSize = testFile.length();
      //System.out.println("\tfile length " + i + ": " + finalFileSize);
    }
    return finalFileSize;
  }

  private long runGoodDemo(int repeats, int datasets, int datasetSize) throws Exception {
    String fileName = H5GroupHeapMemoryLeak.class.getSimpleName()
        + "-GOOD" + "-" + System.currentTimeMillis() + "-repeats" + repeats
        + "-datasets" + datasets + "-datasetSize" + datasetSize + ".h5";
    createFile(fileName);
    long finalFileSize = 0;
    for (int i = 0; i < repeats; i++) {
      FileFormat testFile = getFile(fileName);
      testFile.open();

      Group datasetGroup = getGoodGroup(testFile, "group" + i);
      writeToDatasetGroup(testFile, datasetGroup, datasets, datasetSize);

      testFile.close();
      finalFileSize = testFile.length();
      //System.out.println("\tfile length " + i + ": " + finalFileSize);
    }
    return finalFileSize;
  }

  private FileFormat getFile(String fileName) throws Exception {
    // retrieve an instance of H5File
    FileFormat fileFormat = FileFormat.getFileFormat(FileFormat.FILE_TYPE_HDF5);
    if (fileFormat == null) {
      System.err.println("Cannot find HDF5 FileFormat.");
      return null;
    }
    // open the file with read and write access
    FileFormat testFile = (H5File) fileFormat.createFile(fileName, FileFormat.FILE_CREATE_OPEN);

    if (testFile == null) {
      System.err.println("Failed to open file: " + fileName);
      return null;
    }
    return testFile;
  }

  private Group getGoodGroup(FileFormat testFile, String groupName) throws Exception {
    Group root = (Group) ((javax.swing.tree.DefaultMutableTreeNode) testFile.getRootNode()).getUserObject();
    Group datasetGroup = testFile.createGroup(groupName, root);
    return datasetGroup;
  }

  private Group getGroupWithMemoryLeak(FileFormat testFile, String groupName) throws Exception {
    Group root = (Group) ((javax.swing.tree.DefaultMutableTreeNode) testFile.getRootNode()).getUserObject();
    Group levelOneGroup = null;
    if (USEGET_TO_FIND_GROUP) {
      levelOneGroup = (Group) testFile.get("levelOneGroup");
    }
    else {
      List<?> members = root.getMemberList();
      for (Object tryingToFindGroup : members) {
        if (tryingToFindGroup instanceof Group && "levelOneGroup".equals(((Group) tryingToFindGroup).getName())) {
          levelOneGroup = (Group) tryingToFindGroup;
          break;
        }
      }
    }
    if (levelOneGroup == null) {
      levelOneGroup = testFile.createGroup("levelOneGroup", root);
    }
    Group datasetGroup = testFile.createGroup(groupName, levelOneGroup);
    return datasetGroup;
  }

  private void writeToDatasetGroup(FileFormat testFile, Group datasetGroup, int datasets, int datasetSize) throws Exception {
    for (int i = 0; i < datasets; i++) {
      int[] args = new int[] { Datatype.CLASS_INTEGER, 8, Datatype.NATIVE, Datatype.SIGN_NONE };
      Datatype datatype = testFile.createDatatype(args[0], args[1], args[2], args[3]);
      writeDataset("data" + i, datasetSize, testFile, datasetGroup, datatype);
    }
  }

  private void writeDataset(String datasetName, int datasetSize, FileFormat testFile, Group group, Datatype datatype) throws Exception {
    int size = datasetSize;
    long[] initialSize = new long[] { size };
    long[] maxSize = new long[] { Long.MAX_VALUE };
    long[] chunkSize = new long[] { 60000 };
    int gzipCompressionLevel = HDF5SdfConstants.COMPRESSION_LEVEL;
    Dataset dataset = testFile.createScalarDS(datasetName, group, datatype, initialSize, maxSize, chunkSize, gzipCompressionLevel, null);
    dataset.init();
    long[] data = new long[size];
    for (int i = 0; i < size; i++) {
      data[i] = i;
    }
    // we don't have to write data to show the group memory leak
    //dataset.write(data);
    dataset.close(dataset.getFID());
  }

  /**
   * create the file and add groups and datasets into the file, which is the same
   * as javaExample.H5DatasetCreate
   * @see javaExample.H5DatasetCreate
   * @throws Exception
   */
  private void createFile(String fileName) throws Exception {
    // retrieve an instance of H5File
    FileFormat fileFormat = FileFormat.getFileFormat(FileFormat.FILE_TYPE_HDF5);

    if (fileFormat == null) {
      System.err.println("Cannot find HDF5 FileFormat.");
      return;
    }

    // create a new file with a given file name.
    H5File testFile = (H5File) fileFormat.createFile(fileName, FileFormat.FILE_CREATE_OPEN);

    if (testFile == null) {
      System.err.println("Failed to create file: " + fileName);
      return;
    }

    // open the file and retrieve the root group
    testFile.open();
    // close file resource
    testFile.close();
  }
}

thanks, Aaron Kagawa

Hi Aaron,

Thank you very much for reporting the problem. We will look at the issue.

--pc

Hi Peter,

I was just wondering if this issue was confirmed.

thanks, Aaron

Hi Aaron,

Thank you for the reminder.

Yes, we are able to reproduce the problem but have not identified the cause yet.
We will keep you updated.

Thanks
--pc

Hi Aaron,

The group heap problem is very strange. I couldn't figure out the cause.

I created a test program below to isolate the problem. If I use H5.H5Fopen()
(USE_H5=true), everything is fine. The problem is inside testFile.open().

testFile.open() calls H5.H5Fopen() and then calls H5File.depthFirst() to
retrieve the structure of the file. Eventually, the problem lands in obj_info_all()
in h5gImp.c. I wrote and ran a similar program in C (attached). The file size
was correct, so I couldn't reproduce the problem in C. I guess the problem is in
the Java layer, not the C library.

I will continue to look for the source of the problem. If you (or anyone in the forum)
figure it out, please let me know. Thank you very much in advance.

--pc

creategrp.c (1.53 KB)


==================================
    static private void testGroupMemoryLeak(String fname) throws Exception
    {
        int _pid = HDF5Constants.H5P_DEFAULT;
        boolean USE_H5 = false;

        int fid = H5.H5Fcreate(fname, HDF5Constants.H5F_ACC_TRUNC, _pid, _pid);
        int gid = H5.H5Gcreate(fid, "/levelOneGroup", _pid, _pid, _pid);
        H5.H5Gclose(gid);
        H5.H5Fclose(fid);

        for (int i = 0; i < 25; i++) {
            if (USE_H5) {
                fid = H5.H5Fopen(fname, HDF5Constants.H5F_ACC_RDWR, _pid);
            }
            else {
                FileFormat testFile = new H5File(fname, H5File.WRITE);
                fid = testFile.open();
            }

            gid = H5.H5Gcreate(fid, "/levelOneGroup/group" + i, _pid, _pid, _pid);
            H5.H5Gclose(gid);
            H5.H5Fclose(fid);
        }

        System.out.println((new File(fname)).length());
    }
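
For a side-by-side comparison, the same routine can be rewritten with the
USE_H5 toggle lifted into a parameter (a sketch along the same lines as the
test above; the class and file names here are arbitrary):

import java.io.File;

import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;
import ncsa.hdf.object.FileFormat;
import ncsa.hdf.object.h5.H5File;

public class GroupHeapProbe {

    // Same logic as testGroupMemoryLeak(), with USE_H5 as a parameter.
    static long probe(String fname, boolean useH5) throws Exception {
        int _pid = HDF5Constants.H5P_DEFAULT;

        int fid = H5.H5Fcreate(fname, HDF5Constants.H5F_ACC_TRUNC, _pid, _pid);
        int gid = H5.H5Gcreate(fid, "/levelOneGroup", _pid, _pid, _pid);
        H5.H5Gclose(gid);
        H5.H5Fclose(fid);

        for (int i = 0; i < 25; i++) {
            if (useH5) {
                // low-level open: the group heap does not grow
                fid = H5.H5Fopen(fname, HDF5Constants.H5F_ACC_RDWR, _pid);
            }
            else {
                // object-layer open: walks the file via depthFirst()/obj_info_all()
                FileFormat testFile = new H5File(fname, H5File.WRITE);
                fid = testFile.open();
            }
            gid = H5.H5Gcreate(fid, "/levelOneGroup/group" + i, _pid, _pid, _pid);
            H5.H5Gclose(gid);
            H5.H5Fclose(fid);
        }
        return new File(fname).length();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("H5Fopen path file size:       " + probe("probe-h5.h5", true));
        System.out.println("testFile.open path file size: " + probe("probe-obj.h5", false));
    }
}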

Hi Aaron,

We fixed the group heap problem in Java. Basically, we replaced H5Oget_info_by_name()
with H5Oget_info() in obj_info_all() (h5gImp.c). It seems that there may be an issue
in H5Oget_info_by_name(); we need to investigate it further in the C library. For now,
the Java side works fine.

The fix will go into the HDF-Java 2.8 release, which is scheduled for mid-December 2011.

If you need the fix now, you can recompile the JNI C code with the following code in
h5gImp.c:


=========================================================================
herr_t obj_info_all(hid_t loc_id, const char *name, const H5L_info_t *info, void *op_data)
{
    int type = -1;
    hid_t oid = -1;
    herr_t retVal = 0;
    info_all_t* datainfo = (info_all_t*)op_data;
    H5O_info_t object_info;

    /* open the object by name, then query the opened object directly
       instead of calling H5Oget_info_by_name() */
    oid = H5Oopen(loc_id, name, H5P_DEFAULT);
    if (oid >= 0) {
        retVal = H5Oget_info(oid, &object_info);
        H5Oclose(oid);
    }

...

Thanks
--pc
