performance

We are looking at HDF to handle images.
Someone wrote some HDF code as part of a larger Linux image processing package and ran into significant performance problems.

I wrote a stand-alone test (attached) on my Mac that creates a group and writes 200k 2D images under it.
Results are attached for three uniform image sizes (64x64, 128x128, 256x256).
Compared to using fread, HDF was taking 20x longer for the smallest image size; those results are not included.

Any ideas on how to approach this differently?
My next step is to consolidate all the 2D images into a single 3D image stack.
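
Roughly what I have in mind for the 3D stack, as a sketch only (the dataset name, dimensions, and float pixel type are placeholders, not what the attached test uses):

/* Sketch: store all frames in one 3D dataset and write each frame
 * into it with a hyperslab selection (1.8.0-style API). */
#include "hdf5.h"

#define N_IMAGES 200000
#define DIM      64

int main(void)
{
    hsize_t dims[3]  = {N_IMAGES, DIM, DIM};   /* whole stack */
    hsize_t count[3] = {1, DIM, DIM};          /* one frame   */
    hsize_t start[3] = {0, 0, 0};
    float   frame[DIM][DIM];                   /* real pixel data filled in elsewhere */

    hid_t file   = H5Fcreate("stack.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t fspace = H5Screate_simple(3, dims, NULL);   /* file dataspace   */
    hid_t mspace = H5Screate_simple(3, count, NULL);  /* memory dataspace */
    hid_t dset   = H5Dcreate2(file, "/images", H5T_NATIVE_FLOAT, fspace,
                              H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    for (hsize_t i = 0; i < N_IMAGES; i++) {
        start[0] = i;      /* select frame i within the file dataspace */
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
        H5Dwrite(dset, H5T_NATIVE_FLOAT, mspace, fspace, H5P_DEFAULT, frame);
    }

    H5Dclose(dset);
    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Fclose(file);
    return 0;
}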

Matthew Dougherty
713-433-3849
National Center for Macromolecular Imaging
Baylor College of Medicine/Houston Texas USA

HDF standalone test 1.doc (171 KB)

ICtest1.c (2.43 KB)

···

=========================================================================

Hi Matthew,

We are looking at HDF to handle images.
Someone wrote some HDF code as part of a larger Linux image processing package and ran into significant performance problems.

I wrote a stand-alone test (attached) on my Mac that creates a group and writes 200k 2D images under it.
Results are attached for three uniform image sizes (64x64, 128x128, 256x256).
Compared to using fread, HDF was taking 20x longer for the smallest image size; those results are not included.

Any ideas on how to approach this differently?

  I spent a little time looking at your benchmark this morning and made a few tweaks to it:
  - Ported the API calls to account for the changes in the 1.8.0 release (which you'll now need in order to run it)
  - Closed the dangling 'ICgrp' group ID, which was holding the previous files open
  - Moved the datatype & dataspace creation/destruction out of the main loop
  - Turned on the "use the latest version of the format" flag for the file
  - Printed the statistics for the last 10000 operations

  I think the results should vary much less now; can you give this a try?
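
  In outline, the modified test ends up looking roughly like the sketch below; the names and dimensions are placeholders, and the attached ICtest1-QAK1.c has the actual changes.

/* Outline only: 1.8.0-style calls, latest-format flag, datatype/dataspace
 * hoisted out of the main loop, and every ID closed. */
#include "hdf5.h"
#include <stdio.h>

#define N_IMAGES 200000
#define DIM      64

int main(void)
{
    /* "Use the latest version of the format" flag, set on a file access
     * property list before the file is created. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_libver_bounds(fapl, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);

    hid_t file = H5Fcreate("ICtest1.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
    hid_t grp  = H5Gcreate2(file, "ICgrp", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* Datatype & dataspace created once, outside the main loop. */
    hsize_t dims[2] = {DIM, DIM};
    hid_t   space   = H5Screate_simple(2, dims, NULL);
    hid_t   dtype   = H5Tcopy(H5T_NATIVE_FLOAT);
    float   image[DIM][DIM];           /* pixel data would be filled in here */

    for (int i = 0; i < N_IMAGES; i++) {
        char name[32];
        sprintf(name, "img%06d", i);
        hid_t dset = H5Dcreate2(grp, name, dtype, space,
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, image);
        H5Dclose(dset);
    }

    /* Close the group ID so it no longer holds the file open. */
    H5Gclose(grp);
    H5Tclose(dtype);
    H5Sclose(space);
    H5Pclose(fapl);
    H5Fclose(file);
    return 0;
}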

  Quincey

ICtest1-QAK1.c (2.94 KB)

···

On Dec 11, 2007, at 1:36 AM, Dougherty, Matthew T. wrote: