How to Write Java Objects to HDF5?

Hello.

Bear with me, I am learning HDF5 and I really appreciate any help you can
provide.

Is it possible to write Java objects to an HDF5 file?

My requirement is this: I have a dataset with a complex structure that I
cannot represent with the available HDF5 datatypes (integer, string, byte,
etc.), and at the same time this data needs to fit into an overall dataset.

For example, I have a parent dataset called Parent with children child1 (a
String), child2 (an Integer), and child3 (a custom Java object). How do I
read and write this parent dataset and all of its children using the Java
API for HDF5?

If it is possible, would it also be possible to apply parallel I/O to such
a dataset?

Thank you very much.

Best regards.

Hi!

My name is Jordan, and I work on an open-source project that uses HDF5. Our
project uses the HDF Java Object library to handle reading/writing for
datasets ranging from a couple MB to a hundred MB in size. Although we
don't currently do any parallel IO with HDF5, you might want to check out
our project for examples with the Java Object library: see
github.com/eclipse.ice, in particular the bundles src/org.eclipse.ice.io
and src/org.eclipse.ice.reactor.

Here are a couple of links to some wiki pages we wrote about our use of
HDF5 in our Java project:

http://sourceforge.net/p/niceproject/docs/Challenges%20Using%20HDF5%20and%20Java/
http://sourceforge.net/p/niceproject/docs/NiCE%20and%20HDF5/

These articles are a little old, and since I started, we've shied away from
the Java Object library in favor of the JNI. However, this should hopefully
get you started.

Also, I'd like to add that passing strings between a native (C/C++) library
and Java can be tricky. In my experience, the null terminator can be
particularly troublesome. I think we have examples of using both the Java
Object library and the JNI to handle this...
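To illustrate the null-terminator problem: fixed-length HDF5 strings come back through the JNI as null-padded byte buffers, and a naive `new String(bytes)` keeps all the padding. A minimal sketch (helper name is my own, not from any library) that trims at the first null byte:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class CStringUtil {

    // Convert a C-style, null-padded byte buffer (e.g. a fixed-length
    // HDF5 string read through the JNI) into a clean Java String by
    // stopping at the first '\0'.
    public static String fromCString(byte[] raw) {
        int end = 0;
        while (end < raw.length && raw[end] != 0) {
            end++;
        }
        return new String(raw, 0, end, StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) {
        // Simulate a 10-byte fixed-length HDF5 string holding "hello".
        byte[] buffer = Arrays.copyOf(
                "hello".getBytes(StandardCharsets.US_ASCII), 10);

        System.out.println("[" + fromCString(buffer) + "]"); // prints [hello]

        // The naive conversion keeps the padding: length is 10, not 5.
        System.out.println(
                new String(buffer, StandardCharsets.US_ASCII).length());
    }
}
```

The same trimming is needed in the other direction: when writing, make sure the buffer you hand to the native layer is at least one byte longer than the string so the terminator fits.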

HdfReaderFactory and HdfWriterFactory use the Java Object library for
reading and writing HDF Attributes, including Integers and Strings.
HdfIOFactory uses the JNI to read and write the same types of attributes. I
prefer the JNI-based one, as it has proven more stable and faster in
practice, although I may be biased since I wrote it. :slight_smile:

I'll leave the discussion about performance issues regarding HDF file
structure to more talented experts, but you might want to give a more
detailed example of the problem you are trying to address. Pictures are
always nice! :slight_smile:
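One more thought on your child3 (the custom Java object): a common workaround, independent of HDF5, is to serialize the object to a byte array and store that array as a plain byte dataset, with the class name kept in an attribute. This is just a sketch using standard Java serialization; the HDF5 write/read of the resulting byte[] is omitted:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ObjectBytes {

    // Serialize any Serializable object to a byte[] that could then be
    // written to an HDF5 file as a 1-D byte dataset.
    public static byte[] toBytes(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // Reverse: rebuild the object from bytes read back out of the file.
    public static Object fromBytes(byte[] data)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(data))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // A String stands in for your child3 object here.
        byte[] data = toBytes("a stand-in for your child3 object");
        System.out.println(fromBytes(data));
    }
}
```

The trade-off is that the bytes are opaque to non-Java HDF5 readers and tied to your class version, which defeats much of HDF5's portability; if child3's fields map onto HDF5 types, a compound datatype is usually the better choice.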

Cheers,
Jordan

···

On Mon, Oct 27, 2014 at 4:36 PM, Hdfdev Hdfdev <hdf5dev@gmail.com> wrote:

_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org

http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5