We have an old Windows tool that uses HDF5 version 1.2.1 (December 21, 1999) and we’re hoping we can upgrade to the latest HDF5 API. This upgrade came about mainly due to a fatal error coming from HDF5 v1.2.1 that brings down its parent JVM, but only when running on newer Windows (11 in this case).
We interface with our tool via JNI. The error from HDF5 1.2.1 appears to be 0xC0000374 (STATUS_HEAP_CORRUPTION). This fatal error does not bring down the JVM on a legacy WinXP deployment of our tool.
We’re hoping to find the best HDF5 upgrade path and maybe some tips. This is my first exposure to the HDF5 suite. I’m poring over the docs now.
Some questions before we dive into upgrading HDF5:
Do the API Compatibility Macros offer a mapping from our old version (1.2.1) to the latest (1.14.6)?
If not, should we just compile against 1.14.6, see what breaks, and update our calls to the new function signatures?
Should we try older HDF5 versions, hopefully with the same API we’re using, and see if our problem goes away?
Any upgrade path will require work and decisions. There have been significant changes, so you should probably start by reviewing the HISTORY-1_0-1_8_0.txt file in the release_docs folder of the hdf5 repo.
Realistically, the earliest versions worth discussing are 1.6/1.8. Note, however, that at that point the JNI/Java interface lived in a different project, HDFView 2.
Version 1.10 added the Java interface to the hdf5 repo, and HDFView moved away from the JNI with release 3. In addition, all IDs changed from 32-bit to 64-bit (ints to longs). Again, the HISTORY-1_8_0-1_10_0.txt file will be of help.
1.12/1.14 added some new features and major fixes. Again see the HISTORY files.
Now, if you want to skip all that and just use the latest 1.14.6, and you have your application code, then at a minimum you will need to deal with the ID changes. There are API compatibility settings that let you keep using older APIs (like 1.6), but depending on how long you need to maintain this code, you may want to convert to the latest APIs.