Hello,
I am trying to create a large file (~155 MB) using the HDF Java API.
The idea is to create a 3D integer dataset from a set of 2D images; the total dataset size will be 512x512x304, 16 bits per element.
The original data is a set of 2D DICOM images. Each DICOM image is a 512x512, 16-bit slice.
My code is:
// Create the 512x512x304 image
H5File testFile = ...
// Obtain the list of DICOM files
File[] list = ...
// Copy the data into the HDF image, one slice per file
for (int i = 0; i < list.length; i++) {
    short[] array = readDicomFile(list[i]);
    long[] start = dataset.getStartDims();
    long[] stride = dataset.getStride();
    long[] sizes = dataset.getSelectedDims();
    // select the subset: starting at (0, 0, i)
    start[0] = 0;
    start[1] = 0;
    start[2] = i;
    // select the subset: subset size (512, 512, 1)
    sizes[0] = 512;
    sizes[1] = 512;
    sizes[2] = 1;
    // select the subset: set stride to (1, 1, 1)
    stride[0] = 1;
    stride[1] = 1;
    stride[2] = 1;
    // read the current selection, overwrite it with the slice, write it back
    Object data = dataset.read();
    short[] buffer = (short[]) data;
    for (int j = 0; j < buffer.length; j++) {
        buffer[j] = array[j];
    }
    dataset.write(buffer);
}
// Close the file
testFile.close();
The problem is that this code is VERY slow. Is there a faster way to create
the HDF file?
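For context, the inner copy loop above just packs one 2D slice into a buffer. The same packing can be done once for the whole volume in plain Java, so that a single write call could replace the per-slice read/write pair. This is only a sketch with toy dimensions; the `SlicePacker` class and its layout (slice index as the slowest-varying dimension, i.e. 304x512x512 rather than 512x512x304) are illustrative assumptions, not part of the HDF API:

```java
// Sketch: pack N 2D slices into one contiguous 3D buffer (slice-major),
// so the whole volume can be written in one call instead of one
// read + write per slice. Tiny dims stand in for 304 x 512 x 512.
public class SlicePacker {
    // Copy each h*w slice into its position in a (slices.length * h * w) buffer.
    static short[] pack(short[][] slices, int h, int w) {
        int sliceSize = h * w;
        short[] volume = new short[slices.length * sliceSize];
        for (int i = 0; i < slices.length; i++) {
            // Slice i occupies elements [i * sliceSize, (i + 1) * sliceSize).
            System.arraycopy(slices[i], 0, volume, i * sliceSize, sliceSize);
        }
        return volume;
    }

    public static void main(String[] args) {
        // Two 2x2 slices as a stand-in for 304 slices of 512x512 pixels
        short[][] slices = { {1, 2, 3, 4}, {5, 6, 7, 8} };
        short[] volume = pack(slices, 2, 2);
        System.out.println(volume[0] + " " + volume[4] + " " + volume[7]); // 1 5 8
    }
}
```

With the slices packed this way, the `dataset.read()` inside the loop (which re-reads data from the file on every iteration) would no longer be needed.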
Thank you all,
Ramon Moreno
----------------------------------------------------------------------
This mailing list is for HDF software users discussion.
To subscribe to this list, send a message to hdf-forum-subscribe@hdfgroup.org.
To unsubscribe, send a message to hdf-forum-unsubscribe@hdfgroup.org.