Reading HDF5 datasets larger than 1 MB

Dear All,

I'm a new user of the HDF5 API. I first tried the C++ API and then the
high-level Lite API, but I always seem to run into the same problem:
when the data set exceeds 1 MB, reading the file fails.

An h5dump of the file I'm trying to read looks like this:

HDF5 "testE5.h5" {
GROUP "/" {
   GROUP "FileType" {
      DATASET "AgilentH5FileType" {
         DATATYPE H5T_STRING {
               STRSIZE 40;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            }
         DATASPACE SCALAR
         DATA {
         (0): "Agilent Waveform"
         }
      }
   }
   GROUP "Frame" {
      DATASET "TheFrame" {
         DATATYPE H5T_COMPOUND {
            H5T_STRING {
               STRSIZE 12;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            } "Model";
            H5T_STRING {
               STRSIZE 12;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            } "Serial";
            H5T_STRING {
               STRSIZE 22;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            } "Date";
         }
         DATASPACE SCALAR
         DATA {
         (0): {
               "DSO90254A",
               "MY51050105",
               "14-Jun-2012 15:54:11"
            }
         }
      }
   }
   GROUP "Waveforms" {
      ATTRIBUTE "NumWaveforms" {
         DATATYPE H5T_STD_I32LE
         DATASPACE SCALAR
         DATA {
         (0): 2
         }
      }
      GROUP "Channel 2" {
         ATTRIBUTE "Count" {
            DATATYPE H5T_STD_I32LE
            DATASPACE SCALAR
            DATA {
            (0): 1
            }
         }
         ATTRIBUTE "MaxBandwidth" {
            DATATYPE H5T_IEEE_F64LE
            DATASPACE SCALAR
            DATA {
            (0): 2e+09
            }
         }
         DATASET "Channel 2 Data" {
            DATATYPE H5T_IEEE_F32LE
            DATASPACE SIMPLE { ( 32400000 ) / ( H5S_UNLIMITED ) }
            DATA {
            (0): 0.00187331, -0.000487326, 0.00105394, 0.00184433,
            (4): 0.00106184, 0.000901129, 0.00140435, 0.00141488,

I have truncated the dump output here... The data set "Channel 2 Data" has
32800000 samples. When I use the following code to read the file:

  hsize_t dims_out = 32800000;

  double *buff = new double[dims_out];
  hid_t file_id = H5Fopen(in_fileName.c_str(), H5F_ACC_RDONLY, H5P_DEFAULT);

  herr_t status = H5LTread_dataset_double(file_id,
      "/Waveforms/Channel 2/Channel 2 Data", buff);

It fails and reports the following:

HDF5-DIAG: Error detected in HDF5 (1.8.4-patch1) thread 140737353955136:
  #000: ../../../src/H5Dio.c line 174 in H5Dread(): can't read data
    major: Dataset
    minor: Read failed
  #001: ../../../src/H5Dio.c line 404 in H5D_read(): can't read data
    major: Dataset
    minor: Read failed
  #002: ../../../src/H5Dchunk.c line 1733 in H5D_chunk_read(): unable to read raw data chunk
    major: Low-level I/O
    minor: Read failed
  #003: ../../../src/H5Dchunk.c line 2742 in H5D_chunk_lock(): data pipeline read failed
    major: Data filters
    minor: Filter operation failed
  #004: ../../../src/H5Z.c line 1017 in H5Z_pipeline(): filter returned failure during read
    major: Data filters
    minor: Read failed
  #005: ../../../src/H5Zdeflate.c line 117 in H5Z_filter_deflate(): inflate() failed
    major: Data filters
    minor: Unable to initialize object

However, if I read the same kind of file that has only 1000000 samples in
said data set, then everything seems fine.
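
For reference, here is roughly how I would restructure the read so that the
extent comes from the file itself (via H5LTget_dataset_info) instead of a
hardcoded count, with the return values checked. This is only a sketch: the
function name is arbitrary, I swapped the raw new[] for a std::vector, and
most error handling is omitted:

  #include "hdf5.h"
  #include "hdf5_hl.h"
  #include <cstdio>
  #include <string>
  #include <vector>

  // Sketch: ask the file for the extent of "Channel 2 Data" instead of
  // hardcoding the sample count, then read it, checking the return values.
  int read_channel2(const std::string &in_fileName)
  {
      hid_t file_id = H5Fopen(in_fileName.c_str(), H5F_ACC_RDONLY, H5P_DEFAULT);
      if (file_id < 0)
          return -1;

      // Query the dataset's current dimensions, class, and element size.
      hsize_t dims[1] = {0};
      H5T_class_t cls;
      size_t type_size;
      if (H5LTget_dataset_info(file_id, "/Waveforms/Channel 2/Channel 2 Data",
                               dims, &cls, &type_size) < 0) {
          H5Fclose(file_id);
          return -1;
      }

      // Allocate exactly what the file claims to contain and read it.
      std::vector<double> buff(dims[0]);
      herr_t status = H5LTread_dataset_double(file_id,
          "/Waveforms/Channel 2/Channel 2 Data", buff.data());
      if (status < 0)
          fprintf(stderr, "reading %llu samples failed\n",
                  (unsigned long long) dims[0]);

      H5Fclose(file_id);
      return status < 0 ? -1 : 0;
  }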

I'm certain that I'm missing something very basic, but I couldn't find
anything in the forums or in the documentation...

Best Regards,

Jón

···


Dear All,

False alarm! It turns out the file I was working with is corrupt. It stops
after 700 ksamples even though the header claims 32 Msamples. Both h5dump and
HDFView choke on the file too. Initially I didn't notice the error message
produced by h5dump because I piped its output through less.
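
In case anyone else runs into the same symptom: the point of failure can be
narrowed down by reading the data set one block at a time through a hyperslab
selection. A rough sketch (dataset path taken from my dump above; the function
name and block size are arbitrary, and error checking of the open calls is
omitted):

  #include "hdf5.h"
  #include <algorithm>
  #include <cstdio>
  #include <vector>

  // Sketch: scan the data set one block at a time via a hyperslab selection
  // and report where the first read failure occurs.
  void scan_channel2(const char *fileName)
  {
      hid_t file   = H5Fopen(fileName, H5F_ACC_RDONLY, H5P_DEFAULT);
      hid_t dset   = H5Dopen2(file, "/Waveforms/Channel 2/Channel 2 Data", H5P_DEFAULT);
      hid_t fspace = H5Dget_space(dset);

      hsize_t total = 0;
      H5Sget_simple_extent_dims(fspace, &total, NULL);

      const hsize_t block = 100000;           // arbitrary scan block size
      std::vector<float> buf(block);
      hid_t mspace = H5Screate_simple(1, &block, NULL);

      for (hsize_t start = 0; start < total; start += block) {
          hsize_t count = std::min(block, total - start);
          hsize_t zero  = 0;
          // Select the next block in the file and the matching region in memory.
          H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &start, NULL, &count, NULL);
          H5Sselect_hyperslab(mspace, H5S_SELECT_SET, &zero,  NULL, &count, NULL);
          if (H5Dread(dset, H5T_NATIVE_FLOAT, mspace, fspace,
                      H5P_DEFAULT, buf.data()) < 0) {
              printf("read failed in the block starting at sample %llu\n",
                     (unsigned long long) start);
              break;
          }
      }

      H5Sclose(mspace);
      H5Sclose(fspace);
      H5Dclose(dset);
      H5Fclose(file);
  }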

It's not an issue with my use of the HDF5 API. Hope I didn't cause too much
confusion.

Best Regards,

Jón

···
