Problem with property lists

Hello,

I'm trying to make a Fortran code use the MPI-I/O driver of HDF5. The initial code (which works without any problem) looks like the following:

      call h5open_f (ierror)
      call h5fcreate_f (filename,H5F_ACC_TRUNC_F,file_id,ierror)
      ! creating groups, datasets, etc. then writing to them and closing them
      call h5fclose_f (file_id,ierror)
      call h5close_f (ierror)

I changed it to the following:

      call h5open_f (ierror)
      call h5pcreate_f(H5P_FILE_ACCESS_F,access_prop_id,ierror)
      call h5pcreate_f(H5P_FILE_CREATE_F,create_prop_id,ierror)
      call h5pset_fapl_mpio_f(access_prop_id,MPI_COMM_WORLD,MPI_INFO_NULL,ierror)
      call h5fcreate_f(filename,H5F_ACC_TRUNC_F,file_id,ierror,create_prop_id,access_prop_id)
      ! creating groups, datasets, etc. then writing to them and closing them
      call h5fclose_f (file_id,ierror)
      call h5pclose_f (create_prop_id,ierror)
      call h5pclose_f (access_prop_id,ierror)
      call h5close_f (ierror)

and now I have the following error from HDF5:

HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: H5F.c line 2061 in H5Fclose(): decrementing file ID failed
major: Object atom
minor: Unable to close file
#001: H5I.c line 1479 in H5I_dec_app_ref(): can't decrement ID ref count
major: Object atom
minor: Unable to decrement reference count
#002: H5F.c line 1830 in H5F_close(): can't close file, there are objects still open
major: File accessibilty
minor: Unable to close file

I tried closing the property lists before the file, after the file, and not closing them at all; I always get this error.
Any idea where this comes from?

Thank you,

Matthieu Dorier
PhD student at ENS Rennes
http://people.irisa.fr/Matthieu.Dorier

Are you sure you closed all open objects (groups, datasets, etc.) before you closed the file?
The error you get suggests that you have not. The MPIO driver sets the default file close degree to SEMI, meaning that if objects are still open, the file close will fail. All other VFL drivers set the default close degree to WEAK, which delays the actual file close and allows you to have objects open when you call file close. This explains why your program succeeds in the first case but fails in the second.

For more information, look here:
http://www.hdfgroup.org/HDF5/doc/RM/RM_H5P.html#Property-SetFcloseDegree
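
If you need the old behaviour back while you track down the unclosed object, you can also override the close degree on the access property list yourself. Here is a minimal, untested Fortran sketch, assuming the h5pset_fclose_degree_f wrapper and the H5F_CLOSE_WEAK_F constant from the Fortran interface:

      ! relax the MPIO default (SEMI) to WEAK so the file close is
      ! delayed until all objects attached to the file are released
      call h5pcreate_f(H5P_FILE_ACCESS_F, access_prop_id, ierror)
      call h5pset_fapl_mpio_f(access_prop_id, MPI_COMM_WORLD, MPI_INFO_NULL, ierror)
      call h5pset_fclose_degree_f(access_prop_id, H5F_CLOSE_WEAK_F, ierror)

Keep in mind that WEAK only hides the symptom; the object that is left open should still be closed explicitly.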

Thanks,
Mohamad

Indeed, that was the problem: one of the groups was not closed.
Thank you!

Matthieu Dorier
PhD student at ENS Rennes
http://people.irisa.fr/Matthieu.Dorier

Sometimes it can be hard to figure out which HDF doodad is still open. I use this (C code) snippet to point to the culprit:

   /* report every HDF5 object still attached to the file */
   ssize_t norphans = H5Fget_obj_count(file->file_id, H5F_OBJ_ALL);
   if (norphans > 1) { /* expect 1 for the file we have not closed */
       int i;
       H5O_info_t info;
       char name[64];
       hid_t *objects = calloc(norphans, sizeof(hid_t));
       H5Fget_obj_ids(file->file_id, H5F_OBJ_ALL, norphans, objects);
       for (i = 0; i < norphans; i++) {
           H5Oget_info(objects[i], &info);
           H5Iget_name(objects[i], name, 64);
           printf("%d of %zd things still open: %ld with name %s of type %d\n",
                  i, norphans, (long)objects[i], name, info.type);
       }
       free(objects);
   }
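
Since the original code is Fortran, roughly the same check can be done through the Fortran wrappers. This is an untested sketch, assuming the h5fget_obj_count_f, h5fget_obj_ids_f and h5iget_name_f wrappers and the file_id from the code above:

   ! list the names of the objects still attached to file_id
   integer(size_t)             :: nopen, nfound, name_len
   integer(hid_t), allocatable :: objects(:)
   character(len=64)           :: name
   integer                     :: i, ierror

   call h5fget_obj_count_f(file_id, H5F_OBJ_ALL_F, nopen, ierror)
   if (nopen > 1) then   ! expect 1 for the file we have not closed
      allocate(objects(nopen))
      call h5fget_obj_ids_f(file_id, H5F_OBJ_ALL_F, nopen, objects, ierror, nfound)
      do i = 1, int(nfound)
         call h5iget_name_f(objects(i), name, int(64, size_t), name_len, ierror)
         print *, i, ' still open: ', trim(name)
      end do
      deallocate(objects)
   end if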

==rob

--
Rob Latham
Mathematics and Computer Science Division
Argonne National Lab, IL USA