Access violation after deleting link and recreating dataset

Hello, I am consistently getting access violations after doing the following:

H5L.Delete(_H5FileId, dataSetPath);
H5F.flush(_H5FileId, H5F.Scope.GLOBAL);
// setup for create call omitted
H5D.create( ... ) // creates a dataset with the same structure at the same dataSetPath

My intent is to completely clear out the dataset by removing and recreating it. Is there a preferred way to do this? Is H5L not the right call?

Any help would be appreciated.

Warm Regards,
Jim

Jim,

A short program that demonstrates the issue would be helpful.

I just tried it in C and couldn't make it fail (though I probably didn't try hard enough :-) ).

#include "hdf5.h"
#define FILE "dset.h5"

int main() {

   hid_t file_id, dataset_id, dataspace_id; /* identifiers */
   hsize_t dims[2];
   herr_t status;

   /* Create the data space for the dataset. */
   dims[0] = 4;
   dims[1] = 6;
   dataspace_id = H5Screate_simple(2, dims, NULL);

   /* Create a new file using default properties. */
   file_id = H5Fcreate(FILE, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

   /* Create the dataset. */
   dataset_id = H5Dcreate2(file_id, "/dset", H5T_STD_I32BE, dataspace_id,
                          H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

   status = H5Dclose(dataset_id);

   /* Delete the dataset. */
   H5Ldelete (file_id, "/dset", H5P_DEFAULT);
   H5Fflush(file_id, H5F_SCOPE_GLOBAL);

   dataset_id = H5Dcreate2(file_id, "/dset", H5T_STD_I32BE, dataspace_id,
                          H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
   status = H5Dclose(dataset_id);

   status = H5Sclose(dataspace_id);
   status = H5Fclose(file_id);
}

Elena


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Thank you, Elena. I think my problem was due to multi-threading issues in my client app. A separate thread (or threads) was accessing the file through a different file id. I've reworked my thread synchronization to prevent multiple instances from being created, and all is looking good.

Is there a simple way to just open the file for exclusive access, similar to the share-exclusive flag for the Windows OpenFile function? It seems that would prevent other threads and processes from accessing it. Unless I am missing something, the only H5F open flags are read-only and read/write.

Warm Regards,
Jim


Hi Jim,


Thank you, Elena. I think my problem was due to multi-threading issues in my client app. A separate thread (or threads) was accessing the file through a different file id. I've reworked my thread synchronization to prevent multiple instances from being created, and all is looking good.

[Dana Robinson]
Are you using the thread-safe version of the library? You can't safely access the HDF5 library from multiple threads unless you are using it. I've seen many people try to handle the synchronization primitives at the application level, and it never seems to work out. We don't provide thread-safe binaries for the HDF5 library, so you'll have to build it yourself using CMake.
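A quick way to confirm which build you are linked against: newer HDF5 releases provide H5is_library_threadsafe(), which reports at run time whether the library was built with thread safety enabled (the function is not present in older 1.8 releases, so treat this as an optional check). A minimal sketch:

#include <stdio.h>
#include "hdf5.h"

int main(void) {

   hbool_t is_threadsafe = 0;

   /* Ask the library whether it was built with thread safety enabled
      (--enable-threadsafe / HDF5_ENABLE_THREADSAFE). */
   if (H5is_library_threadsafe(&is_threadsafe) < 0) {
      printf("Could not query the HDF5 library\n");
      return 1;
   }

   printf("Thread-safe HDF5 build: %s\n", is_threadsafe ? "yes" : "no");
   return 0;
}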

Is there a simple way to just open the file for exclusive access, similar to the share-exclusive flag for the Windows OpenFile function? It seems that would prevent other threads and processes from accessing it. Unless I am missing something, the only H5F open flags are read-only and read/write.

[Dana Robinson]
The majority of the virtual file drivers (VFDs) that handle low-level I/O use POSIX I/O calls, even on Windows. We do not have a driver that uses Win32 API calls, though this is in the works. The old "Windows VFD" was simply the POSIX VFD with some Windows-specific #ifdefs and never used Win32.
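The VFD in use is selected through a file access property list. Below is a minimal sketch using the default POSIX ("sec2") driver, which is also what you get on Windows today; the file name is just an example:

#include "hdf5.h"

int main(void) {

   hid_t fapl_id, file_id;

   /* The VFD is chosen through a file access property list;
      H5Pset_fapl_sec2() selects the default POSIX driver. */
   fapl_id = H5Pcreate(H5P_FILE_ACCESS);
   H5Pset_fapl_sec2(fapl_id);

   file_id = H5Fcreate("vfd_example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id);

   H5Fclose(file_id);
   H5Pclose(fapl_id);
   return 0;
}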

Dana



Hi Jim,


Is there a simple way to just open the file for exclusive access, similar to the share-exclusive flag for the Windows OpenFile function? It seems that would prevent other threads and processes from accessing it. Unless I am missing something, the only H5F open flags are read-only and read/write.

Unfortunately, there is no simple way to open the file for exclusive access with the current HDF5. We contemplated the idea for the next major release, but implemented it only for the SWMR (single-writer/multiple-reader) mode.

Preventing file access by multiple writers, or by a single writer and multiple readers, without using SWMR mode would definitely make HDF5 a little more user-friendly ;-)! We may implement it in the future.

Thank you!

Elena


Elena,
Ok, thank you; that sounds like a personals ad. ;-) Can you clarify how SWMR mode is used? Is that part of building the library thread-safe? That functionality would address most of the issues for my app.

Warm Regards,
Jim


Hi Jim,

Elena,
Ok, thank you; that sounds like a personals ad. ;-) Can you clarify how SWMR mode is used?

When an HDF5 file is opened by a writer process for SWMR access, a flag is set in the file by the HDF5 library so that no other writer process can open it. Any number of processes can open the file for reading.

Is that part of building it thread-safe?

No.

That functionality would address most of the issues for my app.

We hope so!

All,

SWMR is still "under construction" (we are looking into a few performance issues, and the documentation needs more work). Below are some pointers to the source and documentation.

SWMR prototype source code is available from ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/SWMR/src/

Please check the README file in the FTP directory. It contains installation instructions, the list of file systems on which SWMR runs, and the programming model to follow.

Documentation is available at http://www.hdfgroup.org/projects/SWMR/.

Please be aware that the SWMR writer creates files that are not readable by the HDF5 1.8.* libraries and tools. One will need to run h5repack from the SWMR distribution to create a 1.8-compatible file; a solution for 1.8/1.10 forward compatibility is in the works.
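For orientation, here is a minimal sketch of the SWMR programming model as it later shipped in HDF5 1.10 (the prototype above may differ in details, and the file name is just an example). The writer creates the file with the latest file format, switches it into SWMR mode, and readers can then open it concurrently:

#include "hdf5.h"

int main(void) {

   hid_t fapl_id, file_id;

   /* Writer: SWMR requires the latest file format. */
   fapl_id = H5Pcreate(H5P_FILE_ACCESS);
   H5Pset_libver_bounds(fapl_id, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);

   file_id = H5Fcreate("swmr_example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl_id);

   /* ... create all groups and datasets here, before enabling SWMR ... */

   /* Switch the open file into SWMR-writer mode; readers may now open
      the file concurrently while this process keeps writing. */
   H5Fstart_swmr_write(file_id);

   /* ... extend datasets, write data, and flush periodically ... */

   H5Fclose(file_id);
   H5Pclose(fapl_id);

   /* Reader (separate process):
      file_id = H5Fopen("swmr_example.h5",
                        H5F_ACC_RDONLY | H5F_ACC_SWMR_READ, H5P_DEFAULT);  */
   return 0;
}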

We are planning to organize a Web seminar on SWMR in June. Please send me an email if you are interested in participating.

Thank you!

Elena
