h5dump - dimensions of a dataset??

Hi everyone,

I'm a newbie to HDF5 and I'm writing a simple parameter ingestion program
for HDF5-formatted images.
I can get almost all the attributes I need using a function that interfaces
with "h5dump", but there's one sticky point: I can't find the raster
dimensions!

What's the easiest way to obtain the dimensions of the attached image, i.e.
the dataset "SBI" in my case? I can *see* the dimensions when I use h5dump
this way:

h5dump -d /S01/SBI/ C*h5 | grep DATASPACE
  DATASPACE SIMPLE { ( 14446, 18643, 2 ) / ( H5S_UNLIMITED, H5S_UNLIMITED, 2 ) }

(the size of the image is 14446 x 18643) - but this has the effect of
printing out the whole dataset, which takes at least a few minutes!
Is there any better way to retrieve these dimensions? Seems like they would
be stored in an attribute for the dataset, but they aren't.

Thanks!!
runway88


--
View this message in context: http://hdf-forum.184993.n3.nabble.com/h5dump-dimensions-of-a-dataset-tp3954909.html
Sent from the hdf-forum mailing list archive at Nabble.com.


h5ls?

- Rhys

I think
h5dump -A -d /S01/SBI/ C*h5
will get you what you want.


Hi,

You can use the -H option with h5dump to display only the header information, without the data.

So for your example:
h5dump -H -d /S01/SBI/ C*h5

This gives you the file name and dataset name along with the layout info, so you can capture them and process them however you want.
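For example, combining -H with the grep from the original post should print just the DATASPACE line shown above, without touching the raw data (a sketch, untested on your particular file):

h5dump -H -d /S01/SBI/ C*h5 | grep DATASPACE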

Thanks,

Jonathan


Thanks people, with your help I found a command that does what I need:

h5ls -r C*h5 | grep SBI

(where SBI is the name of the dataset)
and the output is, for example:

/S01/SBI Dataset {14446/Inf, 18643/Inf, 2}

So I can parse this to get the dimensions...
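For instance, a minimal parsing sketch (assuming the exact h5ls line format shown above; the sed expressions are only illustrative):

# keep only what is inside the braces, then drop the "/Inf" markers and commas
h5ls -r C*h5 | grep SBI | sed -e 's/.*{//' -e 's/}.*//' -e 's|/Inf||g' -e 's/,//g'
# prints: 14446 18643 2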

Cheers, runway88



I'll doubly recommend 'h5ls' over 'h5dump -H' if you just want to
snarf dimensions from textual output. Compare

[5006 rhys@setun Desktop]$ h5dump -H -d /bar_rho_u_u_u sample00000.h5
HDF5 "sample00000.h5" {
DATASET "/bar_rho_u_u_u" {
COMMENT "Mean quantity sample stored using row-major indices (B-spline
coefficient, tensor component, sample number) where the B-spline basis
is defined by /Ny, /breakpoints_y, and /knots"
   DATATYPE H5T_IEEE_F64LE
   DATASPACE SIMPLE { ( 1, 10, 96 ) / ( 1, 10, 96 ) }
}
}

to

[5009 rhys@setun Desktop]$ h5ls sample00000.h5 | grep bar_rho_u_u_u
bar_rho_u_u_u Dataset {1, 10, 96}

where the latter is /much/ easier to parse from a shell script.
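For instance, a quick sketch along those lines (assuming the h5ls line format just above):

# drop the braces and commas, then print the three dimension fields
h5ls sample00000.h5 | grep bar_rho_u_u_u | tr -d '{},' | awk '{print $3, $4, $5}'
# prints: 1 10 96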

- Rhys

This is true if one is only interested in the layout info, without associating file names on the command line.
Also, alternatively, the user can write a quick script that prints the file name first and then its h5ls output.
So I guess it's the user's choice.

Thanks for the reply!

Jonathan


Also, alternatively, the user can write a quick script that prints the file name
first and then its h5ls output.

Very true.

# Dump h5ls output for all files in the current directory
# Each line is prefixed by the relevant filename
for file in *.h5; do h5ls "$file" | awk "{printf \"$file \";print}"; done

- Rhys

If you're into python, I'll heartily recommend h5py. It will let you
loop over all datasets and grab whatever you want in just a few lines,
e.g. all dataset names, data types, shapes, attributes, or whatever.
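For example, a minimal sketch that grabs one dataset's dimensions (assumes h5py is installed; the file name here is just a placeholder):

# print the current shape of the SBI dataset as a Python tuple
python -c "import h5py; print(h5py.File('yourfile.h5', 'r')['/S01/SBI'].shape)"
# prints something like: (14446, 18643, 2)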

Paul
