MATLAB Unable to delete attribute in dense storage

Dear all,
I am facing a problem when trying to delete an HDF5 attribute stored in
dense storage in MATLAB R2014b. The test code (below) works for a small
data size (10x10) but produces "red errors" for larger data (100x100).

close all;
clear all;
clc;

fileName = 'H5A_delete.h5';
groupName = 'test';
attrName = 'testAttribute';

data = 1.23*ones(100, 100);

if exist(fileName, 'file')
   delete(fileName);
end

faplID = H5P.create('H5P_FILE_ACCESS');
H5P.set_libver_bounds(faplID, 'H5F_LIBVER_18', 'H5F_LIBVER_LATEST');
fileID = H5F.create(fileName, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', faplID);
H5P.close(faplID);
groupID = H5G.create(fileID, groupName, 'H5P_DEFAULT', 'H5P_DEFAULT', 'H5P_DEFAULT');
spaceID = H5S.create_simple(length(size(data)), fliplr(size(data)), []);
% attribute creation
acplID = H5P.create('H5P_ATTRIBUTE_CREATE');
attrID = H5A.create(groupID, attrName, 'H5T_IEEE_F64LE', spaceID, acplID);
H5P.close(acplID);
% writing attribute value
H5A.write(attrID, 'H5T_NATIVE_DOUBLE', data);
H5A.close(attrID);
H5S.close(spaceID);
H5G.close(groupID);
H5F.close(fileID);

clear acplID attrID data faplID fileID groupID spaceID;

% fileattrib(filename, '+w');
fileID = H5F.open(fileName, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
groupID = H5G.open(fileID, groupName);
H5A.delete(groupID, attrName);
H5G.close(groupID);
H5F.close(fileID);
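For scale, a quick arithmetic check (sketched in Python) shows why the 10x10 case works while 100x100 fails: the payload of the larger attribute exceeds the 64 KiB object-header message limit, which — under the assumption that this limit is what separates compact from dense attribute storage — pushes it into dense storage (enabled here by the H5F_LIBVER_18 bound).

```python
# Rough size check: an attribute can live compactly in the object header
# only while its message fits under 64 KiB; larger attributes fall back
# to dense storage. (The exact cutoff also includes metadata overhead.)
LIMIT = 64 * 1024  # 65536-byte object-header message limit

small = 10 * 10 * 8    # 10x10 doubles -> 800 bytes, fits compactly
large = 100 * 100 * 8  # 100x100 doubles -> 80000 bytes, forces dense storage

print(small < LIMIT, large < LIMIT)  # True False
```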

Error messages:

Error using hdf5lib2
The HDF5 library encountered an error and produced the following stack
trace information:

    H5FD_windows_read addr overflow
    H5FD_read driver read request failed
    H5F_accum_read driver read request failed
    H5F_block_read read through metadata accumulator failed
    H5HF_huge_op_real can't read 'huge' object's data from the file
    H5HF_huge_op unable to operate on heap object
    H5HF_op can't operate on 'huge' object from fractal heap
    H5A_dense_delete_bt2_cb heap op callback failed
    H5B2_delete_node iterator function failed
    H5B2_hdr_delete unable to delete B-tree nodes
    H5SL_insert_common can't insert duplicate key
    H5SL_insert can't create new skip list node
    H5FS_sect_link_rest can't insert free space node into merging skip list
    H5FS_sect_link can't add section to non-size tracking data structures
    H5FS_sect_add can't insert free space section into skip list
    H5MF_xfree can't add section to file free space
    H5HF_cache_hdr_dest unable to free fractal heap header
    H5HF_cache_hdr_clear unable to destroy fractal heap header
    H5C_flush_single_entry can't clear entry
    H5C_unprotect Can't flush.
    H5AC_unprotect H5C_unprotect() failed.
    H5HF_hdr_delete unable to release fractal heap header
    H5HF_delete unable to delete fractal heap
    H5C_unprotect Entry already unprotected??
    H5AC_unprotect H5C_unprotect() failed.
    H5HF_delete unable to release fractal heap header
    H5A_dense_delete unable to delete fractal heap
    H5O_ainfo_delete unable to free dense attribute storage
    H5O_delete_mesg unable to delete file space for object header message
    H5O_release_mesg unable to delete file space for object header message
    H5O_msg_remove_cb unable to release message
    H5O_msg_iterate_real iterator function failed

Error in H5A.delete (line 21)
H5ML.hdf5lib2('H5Adelete', loc_id, name);

Error in h5a_delete (line 37)
H5A.delete(groupID, attrName);

Thank you for any help.

Best regards,
Vladimir Sedenka

Hi Vladimir,

I was able to reproduce this issue in a C example, and have entered a bug report for it.
For your reference it is HDFFV-9277.

Thank you for reporting this!

-Barbara
help@hdfgroup.org


From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Vladimír Šedenka
Sent: Monday, March 30, 2015 12:37 PM
To: hdf-forum@lists.hdfgroup.org
Subject: [Hdf-forum] MATLAB Unable to delete attribute in dense storage


Dear Barbara,
I have to keep backward compatibility with this version of MATLAB, so I
will have to check the amount of data to be written and refuse to write it
before any error occurs. I will have to set a threshold: the maximum
allowed amount of data for each dataType. However, the amount of user data
allowed is not exactly 64 K(i)B. I discovered the limits by trial and
error, but string attributes can be tricky (because of the variable char
count per cell). Therefore, I would like to ask you to shed some light on
the 64K limitation.

Here are the limits I discovered for a compound dataType, double, int32,
and reference (8 x uint8). The numbers follow this pattern:
<number_of_cells>*<dataTypes_per_cell>*<bytes_per_dataType> + extra bytes
to reach 65536

complex double (compound dataType, 2x8 bytes per cell)
4084*2*8 + 192

double (8 bytes per cell)
8180*1*8 + 96

int32 (4 bytes per cell)
16365*1*4 + 76

reference (8 bytes per cell)
8183*8*1 + 72

Could you please explain where the extra bytes (192, 96, 76, 72) come
from? The origin of these numbers should help me set a proper threshold
for strings.
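The planned pre-write guard can be sketched as follows (Python for brevity; the 256-byte overhead default is an assumption — the measured extras above range from 72 to 192 bytes and clearly depend on the datatype and attribute name, so the pad is deliberately conservative):

```python
COMPACT_LIMIT = 65536   # 64 KiB object-header message limit
ASSUMED_OVERHEAD = 256  # conservative pad for name/datatype/dataspace metadata

def fits_compact(num_cells, types_per_cell, bytes_per_type,
                 overhead=ASSUMED_OVERHEAD):
    """True if the attribute payload plus the assumed metadata overhead
    stays within the 64 KiB compact-attribute limit."""
    payload = num_cells * types_per_cell * bytes_per_type
    return payload + overhead <= COMPACT_LIMIT

# The measured double limit above (8180 cells with 96 extra bytes) sits
# exactly at the boundary:
assert fits_compact(8180, 1, 8, overhead=96)
assert not fits_compact(8181, 1, 8, overhead=96)
```

With the conservative default overhead the guard refuses a few edge cases that would in fact still fit, which seems preferable to triggering the delete bug later.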

Best regards,
Vladimir


2015-04-08 20:58 GMT+02:00 Barbara Jones <bljones@hdfgroup.org>:


_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5

Hello,
Does anyone know how to determine the size limit for "normal" (non-dense)
attribute storage?

Best regards,
Vladimir


2015-04-23 14:38 GMT+02:00 Vladimír Šeděnka <vladimir.sedenka@gmail.com>:


Hi Vladimir,

Could you please take a look at the H5Tencode (https://www.hdfgroup.org/HDF5/doc/RM/RM_H5T.html#Datatype-Encode) and H5Sencode (https://www.hdfgroup.org/HDF5/doc/RM/RM_H5S.html#Dataspace-Encode) functions to find the size of the datatype and dataspace as they will be stored in the object header. This should hopefully help you decide on the threshold and deal with the 64K limit.

Elena


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

On May 9, 2015, at 5:38 AM, Vladimír Šeděnka <vladimir.sedenka@gmail.com> wrote:

