Unable to open dataset collectively in PHDF5

Hi All,

Another thing I have found is that even though the file gets created using
PHDF5, I am unable to open it afterwards. When I run h5dump, I get the
following error.

-bash-3.00$ h5dump smwf.h5
h5dump error: internal error (file h5dump.c:line 4017)

Is there some problem with the file creation? Maybe that is causing the other
problems as well.

Regards,
Nikhil

Hello All,

I am having trouble using Parallel HDF5 in my application.

I followed the tutorial and created an HDF5 file and dataset collectively.
This was the code I used.

CC PARALLEL IO

      CALL h5open_f(error)

c Setup file access property list with parallel I/O access.

      CALL h5pcreate_f(H5P_FILE_ACCESS_F,pcrp_list,error)
      CALL h5pset_fapl_mpio_f(pcrp_list,new_comm,new_info,error)

c Create the file collectively.

      CALL h5fcreate_f(pdsetname,H5F_ACC_TRUNC_F,pfile_id,
     & error,access_prp = pcrp_list)

c Close property list

      CALL h5pclose_f(pcrp_list, error)

      call MPI_Allreduce(numcols,dim,1,
     & MPI_INTEGER,MPI_SUM,new_comm,ierr)

      dims(1) = dim*k1max
      maxdims = (/H5S_UNLIMITED_F/)

      CALL h5screate_simple_f(rank,dims,pdataspace,error,maxdims)
      CALL h5pcreate_f(H5P_DATASET_CREATE_F,pcrp_list,error)

      chunk_dims(1) = dim
      CALL h5pset_chunk_f(pcrp_list,rank,chunk_dims,error)

c Create dataset collectively

      CALL h5dcreate_f(pfile_id,pdsetname,H5T_NATIVE_REAL,
     & pdataspace,pdset_id,error,pcrp_list)

      CALL h5sclose_f(pdataspace, error)
      CALL h5dclose_f(pdset_id, error)
      CALL h5fclose_f(pfile_id, error)

CC PARALLEL IO
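
For reference, here is a minimal self-contained sketch of the creation sequence above, with declarations filled in. The names `facc_list` and `dcrp_list` are assumptions introduced to keep the two property lists distinct (the original reuses `pcrp_list` for both, and the dataset-creation list is never closed before `h5fclose_f`); `new_comm`, `new_info`, `dim`, `k1max`, and `pdsetname` are assumed to be set up as in the original code.

```fortran
c Hedged sketch: collectively create a chunked, extendible dataset.
c Assumes new_comm/new_info are a valid MPI communicator and info
c object, and rank = 1. Unlike the original, the dataset-creation
c property list is explicitly closed before the file.
      USE HDF5
      INTEGER(HID_T)   :: pfile_id, pdataspace, pdset_id
      INTEGER(HID_T)   :: facc_list, dcrp_list
      INTEGER(HSIZE_T) :: dims(1), maxdims(1), chunk_dims(1)
      INTEGER          :: error, rank

      rank = 1
      CALL h5open_f(error)

c File access property list with the MPI-IO driver
      CALL h5pcreate_f(H5P_FILE_ACCESS_F, facc_list, error)
      CALL h5pset_fapl_mpio_f(facc_list, new_comm, new_info, error)

c Create the file collectively, then release the access list
      CALL h5fcreate_f('smwf.h5', H5F_ACC_TRUNC_F, pfile_id,
     &                 error, access_prp = facc_list)
      CALL h5pclose_f(facc_list, error)

c An extendible (unlimited) dataspace requires a chunked layout
      dims(1)    = dim*k1max
      maxdims(1) = H5S_UNLIMITED_F
      CALL h5screate_simple_f(rank, dims, pdataspace, error, maxdims)

      CALL h5pcreate_f(H5P_DATASET_CREATE_F, dcrp_list, error)
      chunk_dims(1) = dim
      CALL h5pset_chunk_f(dcrp_list, rank, chunk_dims, error)

c Create the dataset collectively
      CALL h5dcreate_f(pfile_id, pdsetname, H5T_NATIVE_REAL,
     &                 pdataspace, pdset_id, error, dcrp_list)

c Close everything, including the creation property list
      CALL h5pclose_f(dcrp_list, error)
      CALL h5sclose_f(pdataspace, error)
      CALL h5dclose_f(pdset_id, error)
      CALL h5fclose_f(pfile_id, error)
```

Leaving identifiers open when the file is closed is worth ruling out, since HDF5 only fully closes the file once every object in it has been released.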

This part works fine and the file and dataset get created.

But when I reopen the dataset, I get errors.

This is what I am doing.

      CALL h5pcreate_f(H5P_FILE_ACCESS_F,pcrp_list,error)
      CALL h5pset_fapl_mpio_f(pcrp_list,new_comm,new_info,error)

      CALL h5fopen_f ("smwf.h5",H5F_ACC_RDWR_F,pfile_id,error,pcrp_list)
      if (error.ne.0) then
         write(*,*) 'File open failed'
      endif

c CALL h5pclose_f(pcrp_list,error)

      CALL h5dopen_f(pfile_id,pdsetname,pdset_id, error)
      if (error.ne.0) then
         write(*,*) 'dataset open failed'
      endif

      CALL h5pclose_f(pcrp_list,error)
      CALL h5dclose_f(pdset_id,error)
      CALL h5fclose_f(pfile_id,error)
***********************************************************************

I am just trying to open the file and dataset collectively before I try to
write to it. The file does open, but the h5dopen_f call does not work for
some reason. I assume that since the dataset 'pdset_id' resides in the file
'pfile_id', it must also be set up for parallel access, so the collective
call should work. The h5dopen_f call does not take a property list or a
communicator as a parameter, so does HDF5 get this information from the file?
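
For what it is worth, collective versus independent access is normally requested per transfer, not when the dataset is opened: an open dataset inherits parallel access from the file it lives in, and a dataset-transfer property list passed to h5dwrite_f selects collective I/O for that particular write. A hedged sketch (buffer and dimension names `buf` and `dims` are assumptions):

```fortran
c Hedged sketch: request collective I/O at write time via a
c dataset-transfer property list. pdset_id is the dataset opened
c with h5dopen_f; buf and dims are assumed to match its dataspace.
      INTEGER(HID_T) :: xfer_list

      CALL h5pcreate_f(H5P_DATASET_XFER_F, xfer_list, error)
      CALL h5pset_dxpl_mpio_f(xfer_list, H5FD_MPIO_COLLECTIVE_F,
     &                        error)
      CALL h5dwrite_f(pdset_id, H5T_NATIVE_REAL, buf, dims, error,
     &                xfer_prp = xfer_list)
      CALL h5pclose_f(xfer_list, error)
```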

I am confused as to what is going wrong. If someone can spot an error, please
point it out. The error output I am getting is attached in the file error.log.

If I comment out the calls to h5dopen_f and h5dclose_f, I do not get any errors.

Regards,
Nikhil

----------------------------------------------------------------------
This mailing list is for HDF software users discussion.
To subscribe to this list, send a message to hdf-forum-subscribe@hdfgroup.org.
To unsubscribe, send a message to hdf-forum-unsubscribe@hdfgroup.org.
