I created a conversion application for a client using the HDF5 libraries. This application was working fine for them as of late 2024. I made some updates in 2025. It worked well on my desktop and I was able to convert their data for them, but they were not able to convert it themselves with the same application. They were not getting the output files at all.
I am trying to get this application working on their systems. I updated my HDF5 libraries to 2.0 and updated my HDFView, and I am able to generate complete output. They are now getting an output file, but it contains only a bit of header. After adding many HDF5 return-code prints, I am still not seeing anything. I opened both their output file and mine in hexed.it, and there is a difference in the first couple of lines of bytes. I am guessing it is a DLL version issue somewhere, but I could use some help.
I did find this page, but it really just made it more confusing.
The first 8 bytes should be the HDF5 File Signature and match.
89 48 44 46 0D 0A 1A 0A
The next bytes (bad / good, with comments):
- Version # of Superblock: 02 / 02, match, but the documentation says this should be 0 or 1. Is this the problem?
- Version # of Global Free-space Storage: 08 / 08, match
- Version # of Root Group Symbol Table Entry: 08 / 08, match
- Reserved (zero): 01 / 00, mismatch. Is this the problem?
I am thinking the documentation I found is out of date and that I still have a version mismatch of something somewhere.
You said you are now using HDF5 version 2.0. What is the HDF5 version that your client is using?
Please use HDF5 diagnostic tools to evaluate differences between HDF5 files. Raw hex is very difficult to examine. Do not use hex, except as a last resort. You should have “h5dump” as a command line tool, as part of your HDF5 installed package. To begin, run “h5dump -BH filename.h5” on each file, and compare the outputs. (This works on Linux systems, but the instructions might be slightly different on Windows. I am not a Windows user.)
@julie, thanks for raising this issue. While you may have second thoughts about the HDF5 2.0.0 update, you did what I would recommend to anyone without hesitation. Bravo!
We published an HDF5 2.0.0 Library Migration Guide that starts with four common “myths” about HDF5 2.0.0. What you are seeing, I believe, is a form of what’s referred to as myth no. 3 in the document: “All files written by 2.0.0 are unreadable by 1.x.” The solution provided in the document reads: “If you need older compatibility, explicitly set bounds.” That may sound a little mysterious, but an example is given a few lines earlier, in “Example 3: Creating files that older HDF5 can read.” Would you mind trying that and reporting back?
dave.allured: What is the HDF5 version that your client is using?
They are using the reader and whatever files I put into the installer.
gheber: The upgrade was not a decision. The client needed a few minor fields changed, and the resultant app did not work for them.
Sorry for the long delay, life happened. You both responded quickly and I greatly appreciate it.
I added the lines to set the file access that you directed me to. It worked on my machine. I rebuilt the app and installer and have provided it to my client. We shall see.
Unfortunately, this did not work. I have tried it on a second computer locally (the client is not local), and it fails in the same way.
This was supposed to be just adding a couple of fields that had been removed, a couple-hour job that has turned into a nightmare.
The same app generates a valid output file on my dev system, but nowhere else. I can view the output file generated on my dev system as can the client. Neither of us can open the output file created on another machine.
I tried h5dump on both files. With -V, both report:
h5dump: Version 2.0.0
With -B, the dev system output file generates more output than the original file:
h5dump --superblock JC0001_000000.hdf5 > JC001_superblock.txt
With the output file generated elsewhere:
h5dump --superblock SC0001_000000.hdf5 > SC001_superblock.txt
h5dump error: unable to open file “SC0001_000000.hdf5”
Similarly, h5dump -n gives me a nicely formatted list of contents for the good file versus “unable to open file” for the bad one. The same goes for -H.
The files are similar size, the bad (SC) is slightly smaller as if something is just missing. I am hoping for a response, but in the meantime I am going back to hexedit to see where the difference is.
Julie, are you able to run h5dump on the same machine that created the file in question? If so, please run exactly “h5dump -BH” on each of those two files, capture the outputs, and attach here as two *.txt files. The outputs are large, so please do not insert in-line. H5dump run on different machines, as needed, would be sufficient.
If confidentiality is an issue, please let us know and we will deal with it somehow.
(File attachment is frustratingly obscure on this Discourse system. Inside an edit window, look for an “Upload” icon in the tool bar at the top of the window. It looks like an upward-pointing arrow.)
I attempted this and still got “h5dump error: unable to open file”. Confidentiality is an issue; I will put these two output files on my Google Drive and PM links.
Julie, thanks for providing the two files. Same as you, I can easily read JC0001 with both h5dump versions 1.14.5 and 2.1.0. Both h5dump versions are NOT able to read SC0001.
However, enabling the error stack shows slightly more information about a “file open error”. See the full error stack from h5dump v2.1.0, below. Try it yourself with h5dump’s --enable-error-stack option. The stack from 2.1.0 is shorter and, IMO, more precise and helpful than the long stack from 1.14.5.
With that and a bit of code reading, here are some conclusions.
File SC0001 is confirmed to be a real HDF5 file.
The superblock appears to be valid, by function H5F_open, lines 2115-2117, H5Fint.c v2.1.0.
The root group object seems to be damaged or unreadable for some reason.
The lowest level error is “bad object header version number”. I would not read too much into that. This could easily be broader damage that showed up first as an unexpected number.
In short, this seems to be a damaged file that is not simply missing a small piece. This could be a bug in the generating process, or possibly an actual bug in HDF5. I suggest the best way forward is to look at the code and process that created or broke this file. Is this problem repeatable? Can you produce the code and procedure that generated the problem?
This has become a gnarly problem, beyond my expertise in HDF5. I recommend that you offer the file and code to the HDF5 support team, and let them get into it. Possibly a smart person will recognize a known pattern in this error stack.
HDF5-DIAG: Error detected in HDF5 (2.1.0):
#000: /Users/dallured/hdf5/210/src/src/H5F.c line 821 in H5Fopen(): unable to synchronously open file
major: File accessibility
minor: Unable to open file
#001: /Users/dallured/hdf5/210/src/src/H5F.c line 782 in H5F__open_api_common(): unable to open file
major: File accessibility
minor: Unable to open file
#002: /Users/dallured/hdf5/210/src/src/H5VLcallback.c line 3869 in H5VL_file_open(): open failed
major: Virtual Object Layer
minor: Can't open object
#003: /Users/dallured/hdf5/210/src/src/H5VLcallback.c line 3718 in H5VL__file_open(): open failed
major: Virtual Object Layer
minor: Can't open object
#004: /Users/dallured/hdf5/210/src/src/H5VLnative_file.c line 128 in H5VL__native_file_open(): unable to open file
major: File accessibility
minor: Unable to open file
#005: /Users/dallured/hdf5/210/src/src/H5Fint.c line 2140 in H5F_open(): unable to read root group
major: File accessibility
minor: Unable to open file
#006: /Users/dallured/hdf5/210/src/src/H5Groot.c line 218 in H5G_mkroot(): can't check if symbol table message exists
major: Symbol table
minor: Can't get value
#007: /Users/dallured/hdf5/210/src/src/H5Omessage.c line 791 in H5O_msg_exists(): unable to protect object header
major: Object header
minor: Unable to protect metadata
#008: /Users/dallured/hdf5/210/src/src/H5Oint.c line 1016 in H5O_protect(): unable to load object header
major: Object header
minor: Unable to protect metadata
#009: /Users/dallured/hdf5/210/src/src/H5AC.c line 1303 in H5AC_protect(): H5C_protect() failed
major: Object cache
minor: Unable to protect metadata
#010: /Users/dallured/hdf5/210/src/src/H5Centry.c line 3154 in H5C_protect(): can't load entry
major: Object cache
minor: Unable to load metadata into cache
#011: /Users/dallured/hdf5/210/src/src/H5Centry.c line 1228 in H5C__load_entry(): incorrect metadata checksum after all read attempts
major: Object cache
minor: Read failed
#012: /Users/dallured/hdf5/210/src/src/H5Ocache.c line 185 in H5O__cache_get_final_load_size(): can't deserialize object header prefix
major: Object header
minor: Unable to decode value
#013: /Users/dallured/hdf5/210/src/src/H5Ocache.c line 1101 in H5O__prefix_deserialize(): bad object header version number
major: Object header
minor: Wrong version number
h5dump error: unable to open file "SC0001_000000.hdf5"
Thanks for the response. This issue hasn’t been forgotten, but it was back-burnered. I will attempt to figure out how to contact support; I thought that was what I was doing here.
Also, it is repeatable 100% of the time on most computers, just not my dev system.
Side note to HDF Group. Normally I would try h5debug in a case like this, to examine file blocks symbolically at the lowest level. However, h5debug starts with the same H5Fopen call as h5dump and other tools. This bails immediately with “cannot open file”, and nothing else, for the same reason as with h5dump. There seems to be no way around this. There is not even an option to print the error stack.
It would be helpful if h5debug could provide some kind of bypass to allow examining superblock, root group, etc. in crippled files like this one. An option to print error stack would also be good.