hdf5dotnet memory leak after setExtent to shorten dataset

Hello, I am perplexed by a bug: after I change a dataset's extents using H5D.setExtent, re-running my program and reading from the dataset produces the dreaded "Fatal Execution Engine Error".

The code to "reduce" the dataset size looks like this (I am always reducing along the first dimension):

// Open the dataset, shrink its first dimension, flush the file, and close.
H5DataSetId setId = H5D.open(_H5FileId, dataSetPath, new H5PropertyListId(H5P.Template.DEFAULT));
long[] newExtents = new long[2] { 135000, 72 };  // new extents: 135000 rows x 72 columns
H5D.setExtent(setId, newExtents);
H5F.flush(setId, H5F.Scope.LOCAL);
H5D.close(setId);
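
If it helps, reading the extents back right after the setExtent is how I would sanity-check the shrink. A minimal sketch, assuming HDF5DotNet's H5S.getSimpleExtentDims returns the dims as a long[]:

// Sketch: re-open the dataset and read its extents back to confirm the new size.
H5DataSetId checkId = H5D.open(_H5FileId, dataSetPath, new H5PropertyListId(H5P.Template.DEFAULT));
H5DataSpaceId checkSpace = H5D.getSpace(checkId);
long[] dims = H5S.getSimpleExtentDims(checkSpace);  // expect { 135000, 72 } after the shrink
H5S.close(checkSpace);
H5D.close(checkId);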

The datasets are chunked into 8K chunks and I read the data using hyperslabs. The fatal execution engine error generally shows up sometime after the reads, and not in a consistent place or on the same line of code (sometimes in HDF5 code, sometimes elsewhere). Here is how I am reading from the dataset:

// Open the dataset, select a hyperslab in the file dataspace, and read it into dataArray.
H5DataSetId setId = H5D.open(_H5FileId, dataSetPath, new H5PropertyListId(H5P.Template.DEFAULT));
H5DataSpaceId spaceId = H5D.getSpace(setId);
H5DataSpaceId memSpaceId = H5S.create_simple(2, dims);  // memory dataspace matching the slab
H5S.selectHyperslab(spaceId, H5S.SelectOperator.SET, offset, dims);
H5DataTypeId typeId = GetH5NativeType(typeof(T));
H5D.read(setId, typeId, memSpaceId, spaceId, new H5PropertyListId(H5P.Template.DEFAULT), new H5Array<T>(dataArray));
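
Since the subject line mentions a leak, I should note that the snippet above never closes its identifiers. Here is roughly the same read with everything released when done; this is a sketch, assuming H5S.close and H5D.close are the applicable HDF5DotNet calls (GetH5NativeType is my own helper):

H5DataSetId setId = H5D.open(_H5FileId, dataSetPath, new H5PropertyListId(H5P.Template.DEFAULT));
H5DataSpaceId spaceId = H5D.getSpace(setId);
H5DataSpaceId memSpaceId = H5S.create_simple(2, dims);
try
{
    H5S.selectHyperslab(spaceId, H5S.SelectOperator.SET, offset, dims);
    H5DataTypeId typeId = GetH5NativeType(typeof(T));
    H5D.read(setId, typeId, memSpaceId, spaceId, new H5PropertyListId(H5P.Template.DEFAULT), new H5Array<T>(dataArray));
}
finally
{
    // Close in reverse order of creation; unclosed ids pin native HDF5 resources.
    H5S.close(memSpaceId);
    H5S.close(spaceId);
    H5D.close(setId);
}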

I ran h5check and the file seems to be fine. Likewise, I can read and write the data in HDFView. I even tried repacking the file and got the same results.
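
For reference, the repack was nothing exotic, just a straight pass through the stock tool along these lines (filenames here are placeholders):

h5repack original.h5 repacked.h5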

I am at a loss as to what could be going on. Any ideas?

Warm Regards,
Jim