HDF5 large dataset read fails - inflate() failed

I am working on a CentOS 7 Linux machine, trying to read a single band from an HDF5 dataset. The dataset is ~23000 lines, 512 samples, 256 bands. Using the DataSet class method selectHyperslab, I read one band at a time. With the C++ selectHyperslab I can read bands 0 through 39, but I get the following error when reading band 40 and beyond.

Output from the program:
Processing band 38
0 23394
1 512
2 256
Processing band 39
0 23394
1 512
2 256
Processing band 40
0 23394
1 512
2 256
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 0:
#000: H5Dio.c line 199 in H5Dread(): can't read data
major: Dataset
minor: Read failed
#001: H5Dio.c line 601 in H5D__read(): can't read data
major: Dataset
minor: Read failed
#002: H5Dchunk.c line 2229 in H5D__chunk_read(): unable to read raw data chunk
major: Low-level I/O
minor: Read failed
#003: H5Dchunk.c line 3609 in H5D__chunk_lock(): data pipeline read failed
major: Dataset
minor: Filter operation failed
#004: H5Z.c line 1326 in H5Z_pipeline(): filter returned failure during read
major: Data filters
minor: Read failed
#005: H5Zdeflate.c line 123 in H5Z_filter_deflate(): inflate() failed
major: Data filters
minor: Unable to initialize object
terminate called after throwing an instance of 'H5::DataSetIException'
Abort (core dumped)

The class method that performs the dataset read is the following (h5f is the H5::H5File* held by the class)...
void InHdf5::read_band (int band, float *banddat) {
    // Open the dataset.
    DataSet dset = h5f->openDataSet ("/radiance_data") ;

    // Get the file dataspace and report its dimensions (lines, samples, bands).
    DataSpace dspace = dset.getSpace () ;
    hsize_t dims[3] ;
    int ndims = dspace.getSimpleExtentDims (dims, NULL) ;
    for (int i = 0 ; i < ndims ; i++) {
        cout << i << "  " << dims[i] << endl ;
    }

    // Select every line and sample of a single band (the last axis).
    hsize_t count_out[3]  = {dims[0], dims[1], 1} ;
    hsize_t offset_out[3] = {0, 0, (hsize_t) band} ;  // cast avoids a narrowing conversion

    // Memory dataspace matching the selection.
    DataSpace mspace1 (ndims, count_out) ;

    dspace.selectHyperslab (H5S_SELECT_SET, count_out, offset_out) ;
    dset.read (banddat, PredType::NATIVE_FLOAT, mspace1, dspace) ;
}
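In case it is useful, here is a standalone sketch of the same read with the exception caught instead of letting it abort the process; input.h5 is a placeholder file name, and the loop simply reports which bands fail so the scan can continue:

#include <iostream>
#include <vector>
#include "H5Cpp.h"
using namespace H5 ;
using namespace std ;

int main () {
    H5File f ("input.h5", H5F_ACC_RDONLY) ;          // placeholder file name
    DataSet dset = f.openDataSet ("/radiance_data") ;
    DataSpace dspace = dset.getSpace () ;

    hsize_t dims[3] ;
    dspace.getSimpleExtentDims (dims, NULL) ;

    vector<float> band (dims[0] * dims[1]) ;          // one band's worth of pixels
    for (hsize_t b = 0 ; b < dims[2] ; b++) {
        hsize_t count[3]  = {dims[0], dims[1], 1} ;
        hsize_t offset[3] = {0, 0, b} ;
        DataSpace mspace (3, count) ;
        dspace.selectHyperslab (H5S_SELECT_SET, count, offset) ;
        try {
            dset.read (band.data(), PredType::NATIVE_FLOAT, mspace, dspace) ;
        } catch (const Exception &e) {
            cerr << "band " << b << " failed: " << e.getDetailMsg() << endl ;
        }
    }
    return 0 ;
}

H5::Exception::dontPrint() can be called first to silence the automatic HDF5-DIAG output if only the per-band summary is wanted.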
    
Thank you for any help.

Have you tried h5dump on this dataset (use the -d option)? Does it fail too?

Thank you!

Elena

Yes, I just tried it, and it failed about a quarter of the way through the dataset with
h5dump error: unable to print data

The last samples it printed before failing were near (4800, 512, 256), which for float data is near a 2.4 GByte boundary. My program has been working fine on shorter datasets.
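For reference, the arithmetic behind that estimate, assuming row-major layout and 4-byte floats: 4800 × 512 × 256 elements × 4 bytes = 2,516,582,400 bytes, i.e. roughly 2.4 GB.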
Thank you,
harold

You may try h5dump --enable-error-stack -d on the dataset in question to see the error stack while dumping it.
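For example, using the dataset name from the code above (the file name is a placeholder):

h5dump --enable-error-stack -d /radiance_data input.h5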

h5check (https://portal.hdfgroup.org/display/HDF5/h5check) can be used to find out whether the file itself is corrupted.
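For example (again, with a placeholder file name):

h5check input.h5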

Thank you!

Elena