Reading from a dataset into an allocatable array in Fortran

I am having trouble reading from a dataset into an allocatable array in
Fortran. Here is an excerpt from my code:

SUBROUTINE read_r1(this,dsetname,data)
      CLASS(HDF5FileType),INTENT(INOUT) :: this
      CHARACTER(LEN=*),INTENT(IN) :: dsetname
      DOUBLE PRECISION,ALLOCATABLE,INTENT(INOUT) :: data(:)
      CHARACTER(LEN=MAX_PATH_LENGTH) :: path

      INTEGER(HSIZE_T),DIMENSION(1) :: dims,maxdims,len1
      INTEGER(HID_T),PARAMETER :: rank=1

      INTEGER(HID_T) :: error,mem
      INTEGER(HID_T) :: dspace_id,dset_id
      ! Get dataset dimensions for allocation
      CALL h5dget_space_f(dset_id,dspace_id,error)
      CALL h5sget_simple_extent_dims_f(dspace_id,dims,maxdims,error)
      ! Allocate to size
      ALLOCATE(data(dims(1)))

      ! Read the dataset
      mem=H5T_NATIVE_DOUBLE
      CALL h5dread_vl_f(dset_id,mem,data,dims,len1,error)
      IF(error /= 0)THEN
        CALL this%e%raiseError(myName//": Failed to read data from dataset.")
      ENDIF
ENDSUBROUTINE

Compiling this gives me:

/home/youngmit/codes/mpact/MPACT_libs/Utils/src/FileType_HDF5.f90:480.57:

      CALL h5dread_vl_f(dset_id,mem,data,dims,len1,error)
                                                         1
Error: There is no specific subroutine for the generic 'h5dread_vl_f' at (1)

I haven't found any examples of people using allocatable arrays, so is it
even possible? How else could I approach extracting variable-sized data in
Fortran?

Thanks,

Mitch

Please take a look at this example

http://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/hdf5-examples/1_8/FORTRAN/H5T/h5ex_t_vlen_F03.f90

The HDF5 library should be configured with the --enable-fortran --enable-fortran2003 flags in order for this example to run, and a Fortran 2003 compiler is required.

To see if this feature is enabled with your current installation, check the libhdf5.settings file under the lib subdirectory of the HDF5 installation directory (see the "Fortran 2003 Compiler" line).
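
In case it helps, the read half of that example boils down to roughly the following pattern. This is a minimal sketch, not the full example: the file name "vlen.h5", dataset name "DS1", and the double-precision base type are placeholders, and it assumes the library was built with the Fortran 2003 interface.

PROGRAM read_vlen_sketch
  USE HDF5
  USE ISO_C_BINDING
  IMPLICIT NONE

  ! Same layout as the hvl_t descriptor used in the linked example:
  ! a length followed by a C pointer to that row's data
  TYPE, BIND(C) :: vl_row_t
     INTEGER(C_SIZE_T) :: len
     TYPE(C_PTR)       :: p
  END TYPE vl_row_t

  INTEGER(HID_T) :: file_id, dset_id, space_id, memtype
  INTEGER(HSIZE_T), DIMENSION(1) :: dims, maxdims
  TYPE(vl_row_t), DIMENSION(:), ALLOCATABLE, TARGET :: rdata
  DOUBLE PRECISION, DIMENSION(:), POINTER :: row
  TYPE(C_PTR) :: f_ptr
  INTEGER :: error, i

  CALL h5open_f(error)
  CALL h5fopen_f("vlen.h5", H5F_ACC_RDONLY_F, file_id, error)
  CALL h5dopen_f(file_id, "DS1", dset_id, error)

  ! Size the descriptor array from the dataspace
  CALL h5dget_space_f(dset_id, space_id, error)
  CALL h5sget_simple_extent_dims_f(space_id, dims, maxdims, error)
  ALLOCATE(rdata(dims(1)))

  ! Read through a C pointer with a variable-length memory datatype
  CALL h5tvlen_create_f(H5T_NATIVE_DOUBLE, memtype, error)
  f_ptr = C_LOC(rdata(1))
  CALL h5dread_f(dset_id, memtype, f_ptr, error)

  ! Each row carries its own length; map it onto a Fortran pointer to use it
  DO i = 1, INT(dims(1))
     CALL C_F_POINTER(rdata(i)%p, row, [INT(rdata(i)%len)])
     PRINT *, row
  END DO

  ! (The full example also frees the library-allocated row buffers; see the link above.)
  CALL h5tclose_f(memtype, error)
  CALL h5sclose_f(space_id, error)
  CALL h5dclose_f(dset_id, error)
  CALL h5fclose_f(file_id, error)
  CALL h5close_f(error)
END PROGRAM read_vlen_sketch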

Elena


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Sorry, I was calling the wrong subroutine. I'm getting a segfault now, but I'll
keep working on it for a while before I post anything.

Sorry again.
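
For completeness: the plain h5dread_f generic is the one that matches a fixed-size double dataset, while h5dread_vl_f is only for variable-length datatypes and takes a per-element length array. Below is a minimal sketch of that kind of read, assuming the dataset has already been opened into dset_id; the subroutine name and stripped-down argument list are illustrative, not from the original post.

SUBROUTINE read_r1_sketch(dset_id,data)
      USE HDF5
      IMPLICIT NONE
      INTEGER(HID_T),INTENT(IN) :: dset_id
      DOUBLE PRECISION,ALLOCATABLE,INTENT(INOUT) :: data(:)

      INTEGER(HSIZE_T),DIMENSION(1) :: dims,maxdims
      INTEGER(HID_T) :: dspace_id
      INTEGER :: error

      ! Size the buffer from the dataspace, as in the original excerpt
      CALL h5dget_space_f(dset_id,dspace_id,error)
      CALL h5sget_simple_extent_dims_f(dspace_id,dims,maxdims,error)
      IF(ALLOCATED(data)) DEALLOCATE(data)
      ALLOCATE(data(dims(1)))

      ! A fixed-size double dataset is read with the plain generic;
      ! no separate length array is needed
      CALL h5dread_f(dset_id,H5T_NATIVE_DOUBLE,data,dims,error)

      CALL h5sclose_f(dspace_id,error)
ENDSUBROUTINE read_r1_sketch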