Writing 16-bit unsigned ints with java-hdf

Hi,

I need to write scientific data to a 16-bit unsigned int multidimensional dataset.
My data in Java is held in memory as a 32-bit signed int multidimensional array.

The in-memory array and the dataset have the same dimensions.

This is how I create my HDF5 dataset:
Datatype dtype = testFile.createDatatype(Datatype.CLASS_INTEGER, 2,
        Datatype.NATIVE, Datatype.SIGN_NONE);

Dataset dataset = testFile.createScalarDS("radiance", g1, dtype, dims2D,
        null, null, 0, null);

I get a conversion exception when I try to write the full array with
dataset.write():

java.lang.ClassCastException: [[I cannot be cast to [I
  at ncsa.hdf.object.Dataset.convertToUnsignedC(Dataset.java:1110)
  at ncsa.hdf.object.h5.H5ScalarDS.write(H5ScalarDS.java:905)
  at eumetsat.sandbox.Sandbox.create_dset(Sandbox.java:234)
  at eumetsat.sandbox.Sandbox.main(Sandbox.java:299)

Could you please let me know what I am doing wrong, or point me to a
code example that does the same thing?

Many thanks
Guillaume


--
View this message in context: http://hdf-forum.184993.n3.nabble.com/writing-16-bits-unsigned-int-with-java-hdf-tp4025219.html
Sent from the hdf-forum mailing list archive at Nabble.com.

When data is written from a 32-bit signed integer array in memory to a 16-bit
unsigned integer dataset in the file, it is converted to unsigned using
Dataset.convertToUnsignedC().

Our current implementation of Dataset.convertToUnsignedC() does not support
multidimensional arrays; for performance reasons it only works on 1-D arrays.
We will add this issue (support for N-D arrays) to our work list.

Reading and writing N-D arrays is not efficient in any case. We encourage users
to read into and write from 1-D arrays, which is much more efficient.
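
For example, a minimal sketch of the 1-D approach (this is not part of the
original example; it assumes the same data, dims, and dataset variables as in
the 2-D example further below) could look like this:

     // Sketch only: flatten the 2-D signed int data into a 1-D short array
     // (row-major order, matching the HDF5 dataset layout) and write that.
     // 1-D arrays do not hit the N-D limitation mentioned above.
     short[] flat = new short[(int) (dims[0] * dims[1])];
     for (int i = 0; i < dims[0]; i++)
         for (int j = 0; j < dims[1]; j++)
             flat[(int) (i * dims[1] + j)] = (short) data[i][j];

     dataset.write(flat);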

As a workaround, you can also convert the data to 16-bit values yourself before
you write it. See the example code below:


==========
     // Requires the ncsa.hdf.object packages (H5File, Dataset, Datatype).
     private static void testH5Write2D(final String filename) throws Exception
     {
         long[] dims = {10, 5};
         int[][] data = new int[(int) dims[0]][(int) dims[1]];

         H5File file = new H5File(filename, H5File.CREATE);
         file.open();

         // Fill the 32-bit signed source array with test values.
         for (int i = 0; i < data.length; i++)
             for (int j = 0; j < data[0].length; j++)
                 data[i][j] = (i + 1) * j;

         // 16-bit (2-byte) unsigned integer datatype.
         Datatype dtype = file.createDatatype(Datatype.CLASS_INTEGER, 2, Datatype.NATIVE, Datatype.SIGN_NONE);
         Dataset dataset = file.createScalarDS("dset", null, dtype, dims, null, null, 0, null);
         dataset.init();

         // Convert the data to 16-bit values yourself before writing.
         short[][] tmp = new short[(int) dims[0]][(int) dims[1]];
         for (int i = 0; i < data.length; i++)
             for (int j = 0; j < data[0].length; j++)
                 tmp[i][j] = (short) data[i][j];

         dataset.write(tmp);

         file.close();
     }
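
Note that the (short) cast in the conversion loop simply keeps the low 16 bits
of each 32-bit value, so anything outside the 0..65535 range is truncated; if
that can happen with your data, range-check the values before converting.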
