Efficiency - Opening and closing of datasets

All,

I will have around 15–20 datasets, to each of which one value will be written at a time. The data that needs to be written is a C++ object. Which approach is better in terms of performance and efficiency?

1. For each write, open the dataset, write the data, and close the dataset.

2. Open the dataset once before the writes start, write all of the data, and then close the dataset at the end.
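For concreteness, here is roughly what I mean for one dataset (the same pattern would repeat for each of the 15–20 datasets). This is only a sketch: it assumes HDF5 1.8, an already-open file handle, an existing 1-D dataset named "values", and a plain double standing in for the actual C++ object; the names are illustrative and error checking is omitted.

#include <hdf5.h>

// Approach 1: open / write / close around every single value.
void write_one_reopening(hid_t file, hsize_t index, double value)
{
    hid_t dset   = H5Dopen2(file, "values", H5P_DEFAULT);
    hid_t fspace = H5Dget_space(dset);

    hsize_t start[1] = { index };
    hsize_t count[1] = { 1 };
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);

    hid_t mspace = H5Screate_simple(1, count, NULL);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, &value);

    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Dclose(dset);   // dataset identifier released after every value
}

// Approach 2: open once, write all the values, close once at the end.
void write_all_keeping_open(hid_t file, const double *values, hsize_t n)
{
    hid_t dset = H5Dopen2(file, "values", H5P_DEFAULT);

    for (hsize_t i = 0; i < n; ++i) {
        hid_t fspace = H5Dget_space(dset);
        hsize_t start[1] = { i };
        hsize_t count[1] = { 1 };
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);

        hid_t mspace = H5Screate_simple(1, count, NULL);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, &values[i]);

        H5Sclose(mspace);
        H5Sclose(fspace);
    }

    H5Dclose(dset);   // single open/close pair for the whole run
}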

Regards

Ram

Hi Ram,


On Apr 24, 2008, at 4:10 AM, Ramakrishnan Iyer wrote:

All,

I will have around 15–20 datasets, to each of which one value will be written at a time. The data that needs to be written is a C++ object. Which approach is better in terms of performance and efficiency?

1. For each write, open the dataset, write the data, and close the dataset.

2. Open the dataset once before the writes start, write all of the data, and then close the dataset at the end.

  You are likely to get better performance with option #2, but it may take more memory. That said, the performance will probably be fairly similar in many cases, since the HDF5 library caches the metadata needed to open a dataset.
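  Since your data is a C++ object, option #2 also fits naturally into a small RAII-style helper that opens each dataset once, keeps the identifiers (plus a compound memory type matching the struct) alive for the whole run, and closes everything at the end. A minimal sketch, under assumptions that are not in your message (HDF5 1.8, a plain struct with fixed-size members, 1-D datasets that already exist in the file); all names are illustrative and error checking is omitted:

#include <hdf5.h>
#include <string>
#include <vector>

// Stand-in for the C++ object being written (assumed to be a plain struct).
struct Sample {
    double time;
    int    channel;
};

class DatasetWriter {
public:
    DatasetWriter(hid_t file, const std::vector<std::string>& names)
    {
        // Memory datatype describing how Sample is laid out in RAM.
        memtype_ = H5Tcreate(H5T_COMPOUND, sizeof(Sample));
        H5Tinsert(memtype_, "time",    HOFFSET(Sample, time),    H5T_NATIVE_DOUBLE);
        H5Tinsert(memtype_, "channel", HOFFSET(Sample, channel), H5T_NATIVE_INT);

        // Open every dataset once up front (option #2); the identifiers and
        // the cached metadata stay resident until the writer is destroyed.
        for (size_t i = 0; i < names.size(); ++i)
            dsets_.push_back(H5Dopen2(file, names[i].c_str(), H5P_DEFAULT));
    }

    // Write one Sample into element `index` of dataset number `which`.
    void write(size_t which, hsize_t index, const Sample& s)
    {
        hid_t dset   = dsets_[which];
        hid_t fspace = H5Dget_space(dset);

        hsize_t start[1] = { index };
        hsize_t count[1] = { 1 };
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);

        hid_t mspace = H5Screate_simple(1, count, NULL);
        H5Dwrite(dset, memtype_, mspace, fspace, H5P_DEFAULT, &s);

        H5Sclose(mspace);
        H5Sclose(fspace);
    }

    ~DatasetWriter()
    {
        for (size_t i = 0; i < dsets_.size(); ++i)
            H5Dclose(dsets_[i]);            // close once, at the very end
        H5Tclose(memtype_);
    }

private:
    hid_t memtype_;
    std::vector<hid_t> dsets_;
};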

  Quincey
