Dear experts:
I have run into a problem when writing a very large dataset into a single HDF5 file. The same code runs fine for smaller test cases, and I have checked the memory usage, which looks fine. Any suggestions are welcome.
Many thanks.
Weiguang
The error information:
HDF5-DIAG: Error detected in HDF5 (1.10.5) MPI-process 0:
  #000: H5Dio.c line 336 in H5Dwrite(): can't write data
    major: Dataset
    minor: Write failed
  #001: H5Dio.c line 798 in H5D__write(): unable to initialize storage
    major: Dataset
    minor: Unable to initialize object
  #002: H5Dint.c line 2245 in H5D__alloc_storage(): unable to initialize contiguous storage
    major: Low-level I/O
    minor: Unable to initialize object
  #003: H5Dcontig.c line 173 in H5D__contig_alloc(): unable to reserve file space
    major: Low-level I/O
    minor: No space available for allocation
  #004: H5MF.c line 851 in H5MF_alloc(): allocation failed from aggr/vfd
    major: Resource unavailable
    minor: Can't allocate space
  #005: H5MFaggr.c line 124 in H5MF_aggr_vfd_alloc(): can't allocate raw data
    major: Resource unavailable
    minor: Can't allocate space
  #006: H5MFaggr.c line 221 in H5MF__aggr_alloc(): 'normal' file space allocation request will overlap into 'temporary' file space
    major: Resource unavailable
    minor: Out of range
The relevant piece of the code:
start[0] = pcsum;
start[1] = 0;
count[0] = pc;
count[1] = get_values_per_blockelement(blocknr);
pcsum += pc;

H5Sselect_hyperslab(hdf5_dataspace_in_file, H5S_SELECT_SET,
                    start, NULL, count, NULL);

dims[0] = pc;
dims[1] = get_values_per_blockelement(blocknr);

/* Note: the inner parentheses must close around the assignment, not the
   comparison; as originally written, hdf5_dataspace_memory received the
   0/1 result of `H5Screate_simple(...) < 0' rather than the dataspace id. */
if((hdf5_dataspace_memory = H5Screate_simple(rank, dims, NULL)) < 0)
  {
    printf("failed to allocate memory for `hdf5 data space memory' (dims[0]: %lld x dims[1]: %lld ~ %g MB).\n",
           (long long) dims[0], (long long) dims[1],
           dims[0] * dims[1] * 4 / (1024.0 * 1024.0));
    report_memory_usage(&HighMark_run, "RUN");
    endrun(1238);
  }

hdf5_status =
  H5Dwrite(hdf5_dataset, hdf5_datatype,
           hdf5_dataspace_memory,
           hdf5_dataspace_in_file, H5P_DEFAULT, CommBuffer);
H5Sclose(hdf5_dataspace_memory);