Refresh object so that external changes are applied

Hi,

Let’s suppose I’m working with HDF5 from Python.
I have opened a group, and at the time I opened it I know there is no data in it.
While this group is open, I add a dataset to it from HDFView and then close HDFView, so the changes should be saved.
Then I go back to Python to get this dataset, but apparently I cannot see it within my group.

Is there a way to “reload” the group so that the external modifications become visible?
I’ve tried H5Orefresh, but it doesn’t seem to help in this context.

Are you keeping the file/group open in Python while modifying it (adding the dataset) in HDFView? If so, that’s asking for trouble: since h5py and HDFView run in different process contexts, and there’s no inter-process communication between them, you end up with two inconsistent in-memory states of the file.
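In other words, the robust pattern is to not keep the handle open across the external edit: close the file in Python first, do the edit in HDFView, then reopen. A minimal sketch with h5py and hypothetical names (“data.h5”, “/results”); the same applies to any binding over the HDF5 C library, including a custom wrapper:

```python
import h5py

f = h5py.File("data.h5", "r")
print(list(f["results"].keys()))   # group is still empty at this point
f.close()                          # release the file before editing it elsewhere

# ... add the dataset to /results in HDFView, then close HDFView ...

f = h5py.File("data.h5", "r")      # reopen: metadata is re-read from disk
print(list(f["results"].keys()))   # the new dataset is now visible
f.close()
```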
Does that make sense? G.


Thank you for the response.

Yes, that is the case I’m worried about.
To clarify a little: I don’t use h5py, but rather a custom C++ wrapper (based on HighFive) that I bound to Python using pybind11.

Do you think that if I reload the module, I will be able to reopen the file/group/dataset and see the modifications made with HDFView?
I think I will try that soon, but just in case you have some thoughts on that…

No, reloading the module will most likely NOT help. The problem is that, unless you flush the changes you’ve made via HDFView, they will not be visible to the OS buffer cache/file system, because some of them are still sitting in the cache of the HDF5 library instance running inside the HDFView process. I suspect, but don’t know for sure, that pushing the “Save” button in HDFView triggers a flush; closing the file certainly will.
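To illustrate that flushing behaviour, here is a hedged h5py sketch where a Python process plays the role of the writer (what the HDFView process would have to do internally) and a second handle plays the reader that opens the file afterwards; the file and object names are hypothetical:

```python
import h5py

# Writer side (stand-in for the HDFView process): new objects and data can
# sit in this process's HDF5 library cache until the file is flushed/closed.
w = h5py.File("data.h5", "a")
w.create_dataset("results/new_data", data=[1, 2, 3])
w.flush()   # H5Fflush: push cached metadata/data out to the file system
w.close()   # closing flushes as well and releases the file

# Reader side: a handle opened *after* the flush/close sees the new dataset;
# a handle that was already open beforehand generally does not.
r = h5py.File("data.h5", "r")
print("new_data" in r["results"])   # -> True
r.close()
```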


Save just does an H5Dwrite; reloading the file in HDFView would flush everything.
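For context on the original H5Orefresh question: refresh calls are really intended for the SWMR (single-writer / multiple-reader) workflow, where the writer flushes explicitly and a concurrent reader refreshes. Note that even there, new objects cannot be created once SWMR mode is enabled, so SWMR would not cover adding a dataset from HDFView. A rough h5py sketch with hypothetical names, shown as two separate processes:

```python
import h5py

# --- writer process ---
w = h5py.File("swmr.h5", "w", libver="latest")
ts = w.create_dataset("ts", shape=(0,), maxshape=(None,), dtype="f8")
w.swmr_mode = True          # from here on, no new objects may be created
ts.resize((1,))
ts[0] = 42.0
ts.flush()                  # make the appended data visible to readers

# --- reader process (runs separately, while the writer stays open) ---
r = h5py.File("swmr.h5", "r", libver="latest", swmr=True)
d = r["ts"]
d.refresh()                 # H5Drefresh: pick up data flushed by the writer
print(d.shape)              # -> (1,)
r.close()
```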
