compound data types in Java

Uh oh. I just read the page on using compound data types in Java:
http://www.hdfgroup.org/hdf-java-html/JNI/jhi5/compound.html

Is this the latest summary on the subject? The second option
(effectively not using compound data types, but instead using separate
arrays) will not work for me, as I have to deal with a large number of
preexisting data files. The warning at the end of the first option is
rather scary, and I am looking for some clarification:

The programmer should be aware that the proper layout
of the bytes depends on the platform and C compiler.
It is up to the program to make sure it constructs
the correct records.

Does this mean the Java H5.* libraries have platform-dependent behavior
when dealing with this case? I can parse and manipulate raw byte arrays
manually in Java if I have to, but if the platform dependency is in the
low-level read/write functions that access the HDF5 file, then I'm in
trouble.
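
For what it's worth, here is roughly what I mean by handling the bytes myself: a minimal sketch (the two-field record {int sampleId; double value} and its packed 12-byte layout are made up for illustration; the real layout would have to match whatever my existing files use) that builds one record in a byte array with an explicit byte order, so no C-compiler struct layout is involved.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PackRecordSketch {
    // Hypothetical compound record {int sampleId; double value}, packed with no
    // padding: 4-byte int at offset 0, 8-byte double at offset 4, 12 bytes total.
    static byte[] packRecord(int sampleId, double value, ByteOrder order) {
        ByteBuffer buf = ByteBuffer.allocate(12).order(order);
        buf.putInt(sampleId);   // bytes 0..3
        buf.putDouble(value);   // bytes 4..11
        return buf.array();     // raw record bytes, ready to hand to a write call
    }

    public static void main(String[] args) {
        byte[] record = packRecord(42, 3.14, ByteOrder.LITTLE_ENDIAN);
        System.out.println("record length = " + record.length); // prints 12
    }
}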

···


Wait a minute -- how does the HDFView Java app do it? Which file in the
source code handles compound data types?

···

-----Original Message-----
From: hdf-forum-bounces@hdfgroup.org
[mailto:hdf-forum-bounces@hdfgroup.org] On Behalf Of Jason Sachs
Sent: Tuesday, July 14, 2009 11:06 AM
To: hdf-forum@hdfgroup.org
Subject: [Hdf-forum] compound data types in Java


A few guesses would be:
   - Endianness: Java is big-endian by nature. If you are running on an x86 box there may be endian issues, but you should be able to check whether the data set was stored as big- or little-endian.
   - Padding of fields: some compilers pad fields to align them on certain byte boundaries, which can also affect how the data is laid out. If you treat the compound data type like a big array of bytes and do all the encoding/decoding in Java yourself, you probably will not have any problems with padding or endianness (a sketch of what I mean follows). But these are just my guesses.
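
Here is a rough sketch of the "big array of bytes" idea (the field names, offsets, and 12-byte record size are invented for illustration; in practice you would take the member offsets, sizes, and byte order from the compound datatype stored in the file, not from a C struct). It walks a flat byte array of packed records and decodes each field with an explicit ByteOrder, so neither the JVM's native big-endianness nor any compiler padding comes into play:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class CompoundBytesSketch {
    // Invented example layout: each record is a 4-byte int followed by an
    // 8-byte double, packed with no padding (offsets 0 and 4, 12 bytes total).
    // The real offsets and sizes should come from the file's compound datatype.
    static final int ID_OFFSET = 0;
    static final int VALUE_OFFSET = 4;
    static final int RECORD_SIZE = 12;

    // Decode every record in a raw byte array using the byte order the file used.
    static void decodeAll(byte[] raw, ByteOrder storedOrder) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(storedOrder);
        int nRecords = raw.length / RECORD_SIZE;
        for (int i = 0; i < nRecords; i++) {
            int base = i * RECORD_SIZE;
            int id = buf.getInt(base + ID_OFFSET);
            double value = buf.getDouble(base + VALUE_OFFSET);
            System.out.println("record " + i + ": id=" + id + " value=" + value);
        }
    }

    public static void main(String[] args) {
        // Fake two little-endian records, standing in for bytes read from the file.
        ByteBuffer fake = ByteBuffer.allocate(2 * RECORD_SIZE).order(ByteOrder.LITTLE_ENDIAN);
        fake.putInt(1).putDouble(2.5).putInt(2).putDouble(7.25);
        decodeAll(fake.array(), ByteOrder.LITTLE_ENDIAN);
    }
}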

···


---
Mike Jackson www.bluequartz.net