I am just trying to figure out whether I am spinning my wheels or not.
Is it possible to write an array with the 1.10.5 libraries and still be able to read that data with version 1.4.4?
What I am seeing is that basic attributes like floats, integers, and the like are seen as nodes, but the data arrays are not.
I have been using C# with the latest HDF.PInvoke from NuGet (1.10.5),
and Python to C++ with the HL libraries and 1.4.4.
Is there a trick I can use to write data in a special way to allow 1.4.4 to see these data nodes?
I am by no means an expert on this topic, as it requires understanding some internal parts of the library as well as familiarity with C# and PInvoke. OTOH, I will share what I know on the topic, and if no one else pitches in, a totally insane idea is still better than silence.
Will get to “insane” soon, but for now: the property lists do suggest having some control over the library version being used. However, “Currently, high must be set to the pre-defined value H5F_LIBVER_LATEST. H5F_LIBVER_LATEST corresponds to the version of the HDF5 Library in use.” indicates there is no actual way to force the library to a lower version than its configuration.[^1] Within a recent source distribution I only see options for --with-default-api-version=(v16|v18|v110)
Here are two choices to explore – if you feel adventurous:
RPC, or remote procedure call, between the two versions of the library; ZeroMQ and Protocol Buffers might just do… static linking is required on each end.
The insane one, which will probably kill you, or at least leave you mentally crippled: wrap the required function calls with a prefix v14_*, create a library, and combine it with libhdf5-1.4; then with objcopy --keep-global-symbol=v14_call_01 [...] --keep-global-symbol=v14_call_n hide all symbols except the marked public ones. Link against the higher version of the HDF5 library. Enjoy the rest of the week… Did I mention this idea was insane?
Yes, I noticed the C#, and that you are probably a Windows OS user, and the library symbol manipulation is probably a bit of a stretch, but I can think of nothing better at the moment.
best wishes: steve
[^1]: Corrected an outdated link; the updated link points to the latest documentation, which allows setting the versions. @dave.allured
Joseph, one of the newly supported version combinations is low=H5F_LIBVER_EARLIEST, high=H5F_LIBVER_V18. When creating a new file with HDF5 1.10.5, try inserting a call to H5Pset_libver_bounds with these particular settings before calling H5Fcreate, as shown in the code example on the same doc page. Also, when writing to the new file, ensure that you avoid invoking any “new features” introduced after 1.4.x that would require higher-versioned file objects.
I do not know for sure whether low=H5F_LIBVER_EARLIEST will get you all the way back to a 1.4-compatible format, but the documentation implies this might actually work. It is such a simple experiment that it is worth a try. Let us know what happens.
Hi Dave, thanks for correcting it! I updated the post with the correct link to documentation.
So, if I understood it right, low=H5F_LIBVER_EARLIEST can be set to the earliest version the compiled libhdf5.so supports.
In any event, from this older documentation link, “An HDF5 design criterion is that the HDF5 Library is always backwardly compatible. A backward compatibility failure is a bug.” should nail it.
In the doc, that statement “always backwardly compatible” applies only to the ability of newer library versions to read files written by older versions. The preceding paragraph addresses the ability of a newer library to write files that can be read by older libraries, so-called “forward compatibility”. This is not guaranteed unless the writing application sticks to traditional file objects and features, and avoids a number of newer HDF5 features that insert advanced file object versions.
The remainder of that document lists the safe and unsafe features for writing files that can be read by older library versions.
I have updated my experiment, and unfortunately 1.4.4 is still not seeing the “node”.
Perhaps my code will help; maybe I am overlooking something?
```csharp
// Requires the HDF.PInvoke NuGet package.
using System.Runtime.InteropServices;
using HDF.PInvoke;

private static string _fileName = @"C:\Data\TestHDF5.h5";

private static void Main(string[] args)
{
    float[] data = new float[10];
    for (int i = 0; i < data.Length; ++i)
    {
        data[i] = i;
    }

    // Ask the library to write the oldest possible file format.
    var accessPropertyList = H5P.create(H5P.FILE_ACCESS);
    H5P.set_libver_bounds(accessPropertyList, H5F.libver_t.EARLIEST, H5F.libver_t.V18);
    long _fileID = H5F.create(_fileName, H5F.ACC_TRUNC, H5P.DEFAULT, accessPropertyList);

    ulong[] dims = { (ulong)data.Length };
    long dataSpaceID = H5S.create_simple(1, dims, null);
    long dataTypeID = H5T.IEEE_F32LE;
    long memoryTypeId = H5T.IEEE_F32LE;

    var plist = H5P.create(H5P.DATASET_CREATE);
    H5P.set_chunk(plist, 1, dims);
    H5P.set_deflate(plist, 6);

    var datasetID = H5D.create(_fileID, "test", dataTypeID, dataSpaceID,
                               H5P.DEFAULT, plist, H5P.DEFAULT);

    GCHandle pinnedArrayHandle = GCHandle.Alloc(data, GCHandleType.Pinned);
    int answer = H5D.write(datasetID, memoryTypeId, H5S.ALL, H5S.ALL,
                           H5P.DEFAULT, pinnedArrayHandle.AddrOfPinnedObject());
    pinnedArrayHandle.Free(); // unpin the managed array

    H5D.close(datasetID);
    H5S.close(dataSpaceID);   // release the dataspace
    H5P.close(plist);
    H5F.close(_fileID);
    H5P.close(accessPropertyList);
}
```
Gerd, according to the RFC and the function docs, that combination earliest/earliest is not supported. Please let us know if this limitation was changed.
My guess is that Joseph’s most recent example is now limited by accidentally invoking a post-1.4 file feature, not by the compatibility settings. As a next step, I would suggest removing the chunking and compression statements, i.e. passing H5P.DEFAULT as the dataset creation property list so the layout stays contiguous.
> NULL dataspaces (H5Screate)
> The H5S_NULL dataspace class allows an application to use H5Screate to define a dataset or attribute dataspace with no elements.
> Etc.
Does your H5S.create_simple call with a null maxdims argument invoke the NULL dataspace feature?
Dave, you are correct. Setting them both to 0 (earliest) yields this:

```
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 139792772755776:
  #000: H5Pfapl.c line 2448 in H5Pset_libver_bounds(): Invalid (low,high) combination of library version bound
```

Unfortunately, a value of 1 sends you straight to HDF5 1.8.x.
I agree that Joe might be accidentally invoking a post-1.4 feature. G.
I had to write a bit more code, but below is an h5dump from both the 1.10 and the HLHDF versions.
NOTE: the OLD file’s “test” node is found, but the first file’s cannot be.
…back to pulling my hair out…
Officially giving up at this point…
Unless there is an H5Dump option I am missing to help figure it out.
I would say there is something broken in the HLHDF library the old version is using, which I cannot upgrade. So at this point I am punting.
I really do appreciate all the efforts you folks have put in.
Joseph, good job on making the h5dumps match exactly.
My interpretation of all the documentation is that this is supposed to work. With attention to detail, it should be possible to get all the way back to 1.0 forward compatibility for basic data types. I would not blame this on broken HLHDF, not yet. HDF5 contains fine structure that is not rendered with any h5dump option. You could try h5debug, but I have found that tool to be quite tedious.
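One lighter-weight check than h5debug, in case it helps: the byte right after the 8-byte HDF5 signature is the superblock version number. Version 0 (or 1) is the classic format that old readers understand; 2 or 3 indicate the newer layout that pre-1.8 readers reject. A quick shell sketch (assumes `od` is available and TestHDF5.h5 is the file from your example):

```shell
# Inspect the HDF5 superblock version byte at offset 8, directly after
# the 8-byte signature. 0 or 1 = classic format; 2 or 3 = newer format
# that a 1.4-era reader cannot parse.
head -c 8 TestHDF5.h5 | od -An -c    # the \211 H D F \r \n \032 \n signature
od -An -tu1 -j8 -N1 TestHDF5.h5      # the superblock version number
```

If that byte is not 0, the libver bounds did not take effect the way we hoped, regardless of what h5dump shows.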
Would you be interested in sending me copies of TestHDF5.h5 and TestOLD.h5, also your latest test code version? I will take a closer look. Zip or tar file, use the upload button in the forum edit window.
OK, I’m showing my ignorance here (not for the first time) and didn’t catch that in the original post. When you say ‘HLHDF’, what does that mean? What is an HL_Node?