Hi,
I am trying to test compression in HDF5 with a small program that creates a dataset and writes out a single variable. It looks roughly like this:
call h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_id, hstatus)
call h5screate_simple_f(rank, dim_sizes, space_id, hstatus)
call h5pcreate_f(H5P_DATASET_CREATE_F, plist_id, hstatus)
! Deflate requires a chunked layout
call h5pset_chunk_f(plist_id, rank, chunkdim_sizes, hstatus)
! Set the compression level (0-9)
comp_prm(1) = 6  ! not the best, but close (and fast)
call h5pset_deflate_f(plist_id, comp_prm(1), hstatus)
call h5tcopy_f(H5T_NATIVE_REAL, type_id, hstatus)
call h5dcreate_f(file_id, vname, type_id, space_id, sds_id, hstatus, plist_id)
call h5dwrite_f(sds_id, type_id, variable, dim_sizes, hstatus)
! Close everything so the file is flushed to disk
call h5dclose_f(sds_id, hstatus)
call h5sclose_f(space_id, hstatus)
call h5pclose_f(plist_id, hstatus)
call h5fclose_f(file_id, hstatus)
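In case it helps: one way to check whether the deflate filter actually made it into the file is the h5dump tool that ships with HDF5 (the file name here is a placeholder):

```
# Print only the header (-H) plus dataset storage properties (-p);
# under the dataset, look for a FILTERS block mentioning DEFLATE
# and compare the reported storage size to the logical data size.
h5dump -H -p test.h5
```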
Of course, this is in HDF5. I tried the same thing in HDF4, and the sample code looks like this:
sd_id = sfstart(filename,DFACC_CREATE)
sds_id = sfcreate(sd_id,vname,DFNT_FLOAT32, rank, dim_sizes)
status = sfscompress(sds_id,COMP_CODE_DEFLATE,6)
status = sfwdata(sds_id, hstart, hstride, hedges, variable)
status = sfendacc(sds_id)
status = sfend(sd_id)
Both programs worked fine. The only problem is that the HDF5 file was at least five times the size of the HDF4 file. That doesn't make sense; shouldn't they come out about the same? No matter what value I gave for chunkdim_sizes, the result was the same.
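For what it's worth, the chunk shape I would expect to work best on a small dataset is a single chunk covering the whole array, since each chunk is deflated independently and tiny chunks add per-chunk index overhead while giving deflate little data to work with. A sketch of that setting, using the same rank/dim_sizes variables as above (assuming the dataset fits comfortably in one chunk):

```
! Assumption: the dataset is small enough for one chunk.
! Each chunk is compressed on its own, so many tiny chunks
! mean per-chunk overhead and poor deflate ratios.
chunkdim_sizes(1:rank) = dim_sizes(1:rank)
call h5pset_chunk_f(plist_id, rank, chunkdim_sizes, hstatus)
```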
Can anyone please help?
Thanks,
Abhilash Chandy.
----------------------------------------------------------------------
This mailing list is for HDF software users discussion.
To subscribe to this list, send a message to hdf-forum-subscribe@hdfgroup.org.
To unsubscribe, send a message to hdf-forum-unsubscribe@hdfgroup.org.