I ran into a problem when trying to copy a dataset to a file located on a network drive.
For example (pseudocode; the OS is Windows):
target_file = H5Fcreate(path_to_network_drive, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
H5Ocopy(src_file, ".", target_file, "data", H5P_DEFAULT, H5P_DEFAULT);
The point is that src_file contains one resizable dataset of doubles with a relatively small chunk size (1024 elements). Copying a file with 64 million doubles (~530 MB) through H5Ocopy takes about 80 seconds.
If I instead copy a src_file with a fixed-size dataset (the same number of elements), it takes about 10 seconds.
If I increase the chunk size, the copy finishes faster, but that isn't suitable for my case.
Is this a well-known problem? Is it an issue in the HDF5 library itself, or can it be worked around somehow?