I am quite new to HDF5 and was given the task of fixing some code we have that takes in file(s) and wraps them into an HDF5 file. This all worked well until we started getting larger files (> 4 GB), which exceed the maximum size Java can hold in a single byte array. My thought was to read in as much as possible, write that to the HDF5 file, then read in the next chunk and append it. Our code stores all files as one-dimensional arrays of opaque data. It seems I need to use a hyperslab selection to write the files to HDF5 in chunks, so I looked through the various hyperslab examples and tried to incorporate them into our code, but to date all of my attempts have ended in "HDF5ResourceUnavailableException" or "Write Failed" exceptions.
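To illustrate the approach I'm attempting, here is a minimal sketch of how I plan to split a large file into pieces (plain Java, no HDF5 calls, so the names here are just my own illustration): each (offset, count) pair would become the start and count arguments of a 1-D hyperslab selection before the corresponding dataset write.

```java
// Minimal sketch: compute the (offset, count) pairs needed to write a large
// file in fixed-size pieces. Each pair would describe one 1-D hyperslab
// selection (start = offset, count = count) for the matching write call.
public class ChunkPlanner {
    /** Returns {offset, count} pairs covering [0, totalSize) in chunkSize steps. */
    static long[][] plan(long totalSize, long chunkSize) {
        int n = (int) ((totalSize + chunkSize - 1) / chunkSize); // ceiling division
        long[][] pieces = new long[n][2];
        for (int i = 0; i < n; i++) {
            long offset = i * chunkSize;
            pieces[i][0] = offset;
            // The final piece may be shorter than chunkSize.
            pieces[i][1] = Math.min(chunkSize, totalSize - offset);
        }
        return pieces;
    }

    public static void main(String[] args) {
        // e.g. a 10-byte file written in 4-byte pieces
        for (long[] p : ChunkPlanner.plan(10, 4)) {
            System.out.println("offset=" + p[0] + " count=" + p[1]);
        }
        // prints:
        // offset=0 count=4
        // offset=4 count=4
        // offset=8 count=2
    }
}
```

In the real code I would read each piece from the source file into a byte array of that size, select the matching hyperslab in the dataset's file dataspace, and write it, so only one piece is ever in memory at a time.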
Some of the more immediate questions I have are:

1. When creating the opaque datatype, should I set the size to 1 byte or to the entire file size?
2. Is chunking the dataset optional?
3. Does anyone have an example that writes large binary files into an HDF5 file?