Hi there,
I am seeing an error when trying to create a dataset with chunks of a particular size. The minimal code to reproduce it is pretty simple:
#include "hdf5.h"
#include "mpi.h"

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    // Set up a file access property list with parallel I/O access
    hid_t plist_id = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(plist_id, MPI_COMM_WORLD, MPI_INFO_NULL);

    // Create a new file collectively and release the property list identifier
    hid_t file_id = H5Fcreate("out.h5", H5F_ACC_TRUNC, H5P_DEFAULT, plist_id);
    H5Pclose(plist_id);

    // Create the dataspace for the dataset
    hsize_t dimsf[3] = {782, 590, 768};
    hid_t filespace = H5Screate_simple(3, dimsf, NULL);

    // Create the dataset creation property list and set the chunk size
    hsize_t chunk[3] = {256, 256, 256};
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 3, chunk);

    // Create the chunked dataset -- this is the call that fails
    hid_t dset_id = H5Dcreate(file_id, "dset", H5T_STD_I32LE, filespace,
                              H5P_DEFAULT, dcpl, H5P_DEFAULT);

    MPI_Finalize();
    return 0;
}
The code above fails at the dataset creation call with the following error:
> ./a.out
mca_fbtl_posix_pwritev: error in writev:File too large
mca_fbtl_posix_pwritev: error in writev:File too large
mca_fbtl_posix_pwritev: error in writev:File too large
mca_fbtl_posix_pwritev: error in writev:File too large
mca_fbtl_posix_pwritev: error in writev:File too large
HDF5-DIAG: Error detected in HDF5 (1.10.5) MPI-process 0:
#000: H5D.c line 145 in H5Dcreate2(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#001: H5Dint.c line 329 in H5D__create_named(): unable to create and link to dataset
major: Dataset
minor: Unable to initialize object
#002: H5L.c line 1557 in H5L_link_object(): unable to create new link to object
major: Links
minor: Unable to initialize object
#003: H5L.c line 1798 in H5L__create_real(): can't insert link
major: Links
minor: Unable to insert object
#004: H5Gtraverse.c line 851 in H5G_traverse(): internal path traversal failed
major: Symbol table
minor: Object not found
#005: H5Gtraverse.c line 627 in H5G__traverse_real(): traversal operator failed
major: Symbol table
minor: Callback failed
#006: H5L.c line 1604 in H5L__link_cb(): unable to create object
major: Links
minor: Unable to initialize object
#007: H5Oint.c line 2453 in H5O_obj_create(): unable to open object
major: Object header
minor: Can't open object
#008: H5Doh.c line 300 in H5O__dset_create(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#009: H5Dint.c line 1278 in H5D__create(): can't update the metadata cache
major: Dataset
minor: Unable to initialize object
#010: H5Dint.c line 977 in H5D__update_oh_info(): unable to update layout/pline/efl header message
major: Dataset
minor: Unable to initialize object
#011: H5Dlayout.c line 508 in H5D__layout_oh_create(): unable to initialize storage
major: Dataset
minor: Unable to initialize object
#012: H5Dint.c line 2335 in H5D__alloc_storage(): unable to initialize dataset with fill value
major: Dataset
minor: Unable to initialize object
#013: H5Dint.c line 2422 in H5D__init_storage(): unable to allocate all chunks of dataset
major: Dataset
minor: Unable to initialize object
#014: H5Dchunk.c line 4402 in H5D__chunk_allocate(): unable to write raw data to file
major: Low-level I/O
minor: Write failed
#015: H5Dchunk.c line 4727 in H5D__chunk_collective_fill(): unable to write raw data to file
major: Low-level I/O
minor: Write failed
#016: H5Fio.c line 165 in H5F_block_write(): write through page buffer failed
major: Low-level I/O
minor: Write failed
#017: H5PB.c line 1028 in H5PB_write(): write through metadata accumulator failed
major: Page Buffering
minor: Write failed
#018: H5Faccum.c line 826 in H5F__accum_write(): file write failed
major: Low-level I/O
minor: Write failed
#019: H5FDint.c line 258 in H5FD_write(): driver write request failed
major: Virtual File Layer
minor: Write failed
#020: H5FDmpio.c line 1876 in H5FD_mpio_write(): file write failed
major: Low-level I/O
minor: Write failed
The “File too large” message doesn’t seem to make sense: the full dataset is 782 × 590 × 768 int32 values, i.e. only about 1.4 GB. Also, I noticed that slightly increasing or decreasing the chunk size can make it work.
I built with gcc 7.5.0 and linked against Open MPI 3.1.3 on Ubuntu 18.04.
Any suggestion or feedback would be most welcome. Thanks!