Memory Increase right after dataset.write

Hi,

I am using the HDF5 C++ API to write some relatively long vectors.

The allocation size of the vector is about 148 MB, and using getrusage() on Linux I confirmed a memory usage of about 148 MB right after the vector is created.
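For reference, this is roughly how I check the memory usage (a simplified sketch; peak_rss_mb is just an illustrative helper name, not the actual code):

#include <sys/resource.h>
#include <cstdio>

// Peak resident set size in MB; on Linux, ru_maxrss is reported in KB.
static double peak_rss_mb()
{
    struct rusage usage;
    getrusage(RUSAGE_SELF, &usage);
    return usage.ru_maxrss / 1024.0;
}

// Called right after the vector allocation and again after dataset.write():
// std::printf("peak RSS: %.1f MB\n", peak_rss_mb());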

When I call dataset.write(myvector, …), the memory immediately increases by a factor of about 35, reaching about 5000 MB.

Even with a smaller vector, with an allocation size of 7 MB, the same phenomenon happens: the memory increases to about 250 MB just after calling dataset.write().

Does anyone know what is going on, or has anyone encountered the same problem?

Here is a small snippet of the code:

// open the file and create a group for the dataset
Group group;
H5File file(file_name.c_str(), H5F_ACC_RDWR);
group = file.createGroup(group_name.c_str());

IntType int_dt(PredType::NATIVE_INT);
FloatType float_dt(PredType::NATIVE_DOUBLE);

// dataspace: 1-D, extendable up to H5S_UNLIMITED
hsize_t dims[1];
dims[0] = n_total; // n_total is the length of the u vector
hsize_t maxdims[1] = {H5S_UNLIMITED};
DataSpace dataspace(1, dims, maxdims);

// dataset creation property list: set chunking for the extendable dataset
hsize_t cdims[1] = {20};
DSetCreatPropList ds_creatplist;
ds_creatplist.setChunk(1, cdims);

// create the dataset and write the whole buffer in one call
DataSet ds_sol_q(group.createDataSet("q", float_dt, dataspace, ds_creatplist));

ds_sol_q.write(u, PredType::NATIVE_DOUBLE);

Best Regards,
Mohammad Alhawwary.