Hi all,
I want to check the results of my application in real time. To do so, I
use *HDFView*, provided by The HDF Group, to open a single HDF5 file
while my application is still running and writing data in parallel.
In an initialization phase of the code, datasets are generated and
attributes are written. Then the file is opened for parallel access
using *h5pset_fapl_mpio_f*. At this point I can already see the file
on disk: when I open it externally with HDFView, the file structure is
present and all data arrays are initialized with zeros.
In the next step of the code, data is generated and the application
calls *h5dwrite_f* to write it in parallel. When I now open the file
externally with HDFView, I still see only the empty arrays, as I would
expect, since the data does not necessarily have to be written to disk
yet. But when the program finishes, the arrays are incomplete: the
first part of each array still contains only zeros, while the last
part shows the correct data. This behavior is triggered by the
external file access. Is it expected?
Is there a safe way to check the data in real time? I already tried
*h5fflush_f*, but then the file is locked against external access until
the application finishes.
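For reference, here is a minimal sketch of the access pattern I
described (file name, dataset name, and buffer sizes are placeholders,
and error checking is omitted for brevity):

```fortran
program write_sketch
  use mpi
  use hdf5
  implicit none

  integer :: ierr
  integer(hid_t) :: fapl_id, file_id, dset_id
  integer(hsize_t), dimension(1) :: dims = (/100/)
  real(kind=8), dimension(100) :: buffer

  call MPI_Init(ierr)
  call h5open_f(ierr)
  buffer = 1.0d0

  ! Open the previously initialized file for parallel access via MPI-IO.
  call h5pcreate_f(H5P_FILE_ACCESS_F, fapl_id, ierr)
  call h5pset_fapl_mpio_f(fapl_id, MPI_COMM_WORLD, MPI_INFO_NULL, ierr)
  call h5fopen_f("results.h5", H5F_ACC_RDWR_F, file_id, ierr, &
                 access_prp=fapl_id)

  ! Each rank writes its part of the dataset.
  call h5dopen_f(file_id, "data", dset_id, ierr)
  call h5dwrite_f(dset_id, H5T_NATIVE_DOUBLE, buffer, dims, ierr)

  ! Flush so that the data reaches disk for an external reader.
  call h5fflush_f(file_id, H5F_SCOPE_GLOBAL_F, ierr)

  call h5dclose_f(dset_id, ierr)
  call h5pclose_f(fapl_id, ierr)
  call h5fclose_f(file_id, ierr)
  call h5close_f(ierr)
  call MPI_Finalize(ierr)
end program write_sketch
```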
Cheers,
Stefan