Warm Regards,
Jim
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Elena Pourmal
Sent: Monday, April 21, 2014 10:22 PM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Access violation after deleting link and recreating dataset
Hi Jim,
On Apr 20, 2014, at 11:56 PM, "Rowe, Jim" <J.Rowe@questintegrity.com> wrote:
Thank you, Elena. I think my problem was due to multi-threading issues in my client app. A separate thread was accessing the file through a different file id. I’ve reworked my thread synchronization and am preventing multiple instances from being created, and all is looking good.
Is there a simple way to open the file for exclusive access, similar to the share-exclusive flag for the Windows OpenFile function? It seems that would prevent other threads
and processes from accessing it. Unless I am missing something, the only H5Fopen flags are read-only and read/write.
Unfortunately, there is no simple way to open the file for exclusive access in current HDF5. We contemplated the idea for the next major release, but implemented it only for SWMR (single-writer/multiple-reader) mode.
Preventing file access by multiple writers, or by a single writer and multiple readers, without using SWMR mode would definitely make HDF5 a little more user-friendly ;-)! We may implement it in the future.
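[Editor's note: since HDF5 itself offers no exclusive-open flag here, one cooperative workaround on POSIX systems is to take an advisory flock(2) on the file before opening it with H5Fopen. This is a sketch, not part of the HDF5 API; the helper name acquire_exclusive and the lock path are illustrative.]

#include <stdio.h>
#include <fcntl.h>
#include <sys/file.h>
#include <unistd.h>

/* Try to take an exclusive advisory lock on `path`.
 * Returns a file descriptor holding the lock, or -1 if another
 * cooperating process already holds it (or the open failed). */
int acquire_exclusive(const char *path)
{
    int fd = open(path, O_RDWR | O_CREAT, 0644);
    if (fd < 0)
        return -1;
    if (flock(fd, LOCK_EX | LOCK_NB) != 0) {
        close(fd);          /* someone else holds the lock */
        return -1;
    }
    return fd;              /* safe to call H5Fopen on path now */
}

int main(void)
{
    int fd = acquire_exclusive("dset.h5");
    if (fd >= 0) {
        printf("got exclusive lock\n");
        /* file_id = H5Fopen("dset.h5", H5F_ACC_RDWR, H5P_DEFAULT); ...
         * H5Fclose(file_id); */
        flock(fd, LOCK_UN);
        close(fd);
    } else {
        printf("file is in use\n");
    }
    return 0;
}

Note that flock locks are advisory (every process must check them) and per-process, so this does not protect against multiple threads within one process sharing the same HDF5 file — that still needs in-process synchronization, as in Jim's fix above.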
Thank you!
Elena
Warm Regards,
Jim
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Elena Pourmal
Sent: Thursday, April 17, 2014 6:08 PM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Access violation after deleting link and recreating dataset
Jim,
A short program that demonstrates the issue would be helpful.
I just tried it in C and couldn't make it fail (probably haven't tried hard enough):
#include "hdf5.h"

#define FILE "dset.h5"

int main() {
    hid_t   file_id, dataset_id, dataspace_id;  /* identifiers */
    hsize_t dims[2];
    herr_t  status;

    /* Create the data space for the dataset. */
    dims[0] = 4;
    dims[1] = 6;
    dataspace_id = H5Screate_simple(2, dims, NULL);

    /* Create a new file using default properties. */
    file_id = H5Fcreate(FILE, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    /* Create the dataset. */
    dataset_id = H5Dcreate2(file_id, "/dset", H5T_STD_I32BE, dataspace_id,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    status = H5Dclose(dataset_id);

    /* Delete the dataset. */
    H5Ldelete(file_id, "/dset", H5P_DEFAULT);
    H5Fflush(file_id, H5F_SCOPE_GLOBAL);

    /* Recreate it under the same name. */
    dataset_id = H5Dcreate2(file_id, "/dset", H5T_STD_I32BE, dataspace_id,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    status = H5Dclose(dataset_id);
    status = H5Sclose(dataspace_id);
    status = H5Fclose(file_id);
}
Elena
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
On Apr 17, 2014, at 5:01 PM, "Rowe, Jim" <J.Rowe@questintegrity.com> wrote:
Hello- I am consistently getting access violations after doing the following:
H5L.Delete(_H5FileId, dataSetPath);
H5F.flush(_H5FileId, H5F.Scope.GLOBAL);
// setup for create call omitted
H5D.create( … ); // creates a same-structure dataset at the same dataSetPath
My intent is to completely clear out the dataset by removing and recreating it. Is there a preferred way to do this? Is H5L not the right call?
Any help would be appreciated.
Warm Regards,
Jim
_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org