Based on the original example code you posted, your second call to H5Dget_create_plist after you closed and then re-opened the dataset suggests that this is what you want.
H5Dclose is there to make certain I can re-open the dataset from a clean state with
H5Dopen(..., dapl_id). The emphasis is on the dapl_id passed to the
H5Dopen API call, and on how it interacts with
H5Dget_access_plist. Of the two property lists the
dapl is the more interesting one, since it is passed to both API calls, and if they preserved their state – which at this point they don't – then the
H5Dget_access_plist call would return the same property list.
Returning a property list only makes sense if it is the same one you saved. This way one can store raw pointers to objects in the list and retrieve them later in a different call.
To give you an example of how I am using this in H5CPP, one could pass custom data access properties as a list to
auto fd = h5::open("some_file.h5", H5F_ACC_RDWR );
auto ds = h5::open(fd, "dataset",
h5::high_throughput | h5::julia );
to choose the filtering pipeline
(h5::builtin | h5::high_throughput) and to specify how objects such as
std::unordered_map<K,V> should be persisted
(the h5::julia | h5::python | h5::matlab way), so they can be retrieved from the respective systems without additional code. If this worked out, it would open up the possibility of using the HDF5 format for general compiler-assisted persistence in modern C++.
Although in H5CPP I can work around this problem by attaching the
dapl to the h5::ds_t dataset handle, this breaks the contract that H5CPP is binary compatible with the underlying HDF5 C API.
I hope the above argument makes sense to THG.
- Should customized dataset CREATE properties be persisted to the file? I believe this is the root of the problem in your existing code: you open the dataset you previously created and cannot find your custom properties there. Standard CREATE properties certainly are persisted. So I think it is reasonable, if a caller has customized the dataset CREATE properties, to expect those customized properties to be persisted right along with the standard ones.
In my opinion it should not. Attributes exist to color datasets and groups, and their lifespan is indeed tied to their parent. I argue that properties are the 'attributes' of object handles/dataset descriptors, and a property list's lifespan should be tied to its parent object, the dataset descriptor.
For instance, if I want to replace the HDF5-provided filter chain with one of my own – which has different design criteria and performance properties – then the easiest way of tying this experimental pipeline to a dataset is to use a property list. Using the advertised property list interface one can initialize and shut down the object properly. Unfortunately this is not possible with the current implementation.
Calls with H5P_DEFAULT are unaffected by the proposed changes. The code path would only replicate/mirror the existing mechanism, the one already used for dataset CREATE property lists.
Although in H5CPP the
h5::default-s are initialised with sensible values, with good results/acceptance so far, I don't see how that is relevant to the original problem: the HDF5 C API doesn't preserve the data access property list; instead it makes one up on the go.