HDF5 File size is too large


I am working with the HDF5 C API (version 1.8.5) and writing data to a
packet table.

After writing the file I observed that the HDF5 file is much larger than
a file written with a plain binary writer for the same amount of data.

I don't want to use compression in the packet table.

While testing my program I found that changing the chunk size changes the
size of the output HDF5 file, but I am not able to decide which chunk size
would suit me best.

After going through the HDF5 documentation I understand that, because
storage is allocated in whole chunks, the total size in the file can be
larger than the size of the data array.

The chunk size I am currently using is 100, and I am not satisfied with
the resulting file size.
Can anybody help me understand the relationship between chunk size and
file size, and suggest a chunk size best suited to my case?


Best Regards,
View this message in context: http://hdf-forum.184993.n3.nabble.com/HDF5-File-size-is-too-large-tp2271605p2271605.html
Sent from the hdf-forum mailing list archive at Nabble.com.