HDFView crashes without an error message when opening a (corrupt) file

Dear HDF-Forum

I have a file that causes an immediate crash of HDFView (no error message; tested with the newest versions on Windows and Mac OS X).

I have to assume the file is corrupt. It was written with our software (Imaris), which uses regular calls to the official HDF5 library (v1.8.6). Loading the file with our software causes a crash. To figure out what was last written into the file, I opened it in HDFView, but that also crashes (on Windows and Mac OS X). In a debugger on Windows it crashes inside "H5Gopen", but I don't understand what happens there in detail. try/catch(...) doesn't catch it, so it must be a severe memory violation.

Obviously something went wrong when the file was written. Unfortunately I can't tell what exactly happened; a customer was using our software, and "save" completed successfully. I'd like to do two things:

- Understand from the file content what went wrong when the file was written. HDFView would be helpful here if it showed the non-corrupted parts of the file instead of crashing.

- Show a more graceful message when a user tries to open such a file in our software. Ideally "H5Gopen" would return an error code, or at least fail in a way that can be caught.

Is there a common place for the HDF-forum to upload my file? The zipped file size is 1.3 GB.

Please help!

Cheers,
Christoph
This message is intended only for the use of the addressee and may contain information that is confidential and/or subject to copyright. If you are not the intended recipient, you are hereby notified that any dissemination, copying, or redistribution of this message is strictly prohibited. If you have received this message in error please delete all copies immediately. Any views or opinions presented in this email are solely those of the author and do not necessarily represent those of Andor Technology Plc Companies. Andor Technology Plc has taken reasonable precautions to ensure that no viruses are contained in this email, but does not accept any responsibility once this email has been transmitted. Andor Technology PLC is a registered company in Northern Ireland, registration number: NI022466. Registered Office: Andor Technology, 7 Millennium Way, Springvale Business Park, Belfast, BT12 7AL.

Hi Christoph,

> Dear HDF-Forum
>
> I have a file, which causes an immediate crash of the HDF-viewer (no error message, tested with newest versions on Windows and Mac OS X).

HDFView cannot handle big files or files with a lot of objects. This is a known limitation that we are trying to address in the next major release. You will need to use other tools to confirm that the file is indeed corrupted.

> I have to assume the file is corrupt. It was written with our Software (Imaris) which uses regular calls from the official HDF-library (v1.8.6). Loading the file with our Software causes a crash. In order to figure out, what was last written into the file, I opened the file in the HDF-viewer, but there it also crashes (Windows and Mac OS X). In the debugger on Windows it crashes inside "H5Gopen", but I don't understand, what happens there in detail. Try catch(...) doesn't catch - it must be a severe memory violation.

Could you please run the h5check tool to see whether the HDF5 metadata in this file is corrupted? If h5check doesn't find anything, try running h5ls to see how far you can traverse the file. Information about both tools can be found here: http://www.hdfgroup.org/HDF5/doc/RM/Tools.html

> Obviously something went wrong when the file was written. Unfortunately I can't tell, what exactly was done there - it was a customer using our software, and "save" was successful. I'd like to do two things:
>
> - Try to understand from the file content, what went wrong when the file was written. HDF View would be helpful, if it would show the non-corrupted parts of the file (instead of crashing).
> - Show a more graceful message to the user, when he tries to open such a file in our Software. Ideally "H5Gopen" would return some error code, or at least do something catchable.

Agreed on both points.

> Is there a common place for the HDF-forum to upload my file? The zipped file size is 1.3 GB.

I'll check with our sysadmin people. Stay tuned.

Elena

···

On Feb 3, 2014, at 7:50 AM, Christoph Laimer <Christoph@bitplane.com> wrote:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Hi Elena

Thanks for your quick response.

h5check produces the output below and ends with a segmentation fault. There is also an error message, but I don't know how to interpret it; maybe it points at the cause of the subsequent crash. I've also tested other files generated by our software, and h5check doesn't report any errors for those.

```
./h5check --verbose=2 /Volumes/Data/tmp/Cropped_ferret182group2confocalalign.ims
VERBOSE is true:verbose # = 2

VALIDATING /Volumes/Data/tmp/Cropped_ferret182group2confocalalign.ims according to library version 1.8.0

FOUND super block signature
VALIDATING the super block at physical address 0...
Validating version 2 superblock...
INITIALIZING filters ...
VALIDATING the object header at logical address 48...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1382334217...
VALIDATING version 1 object header...
Version 1 object header encountered
VALIDATING the local heap at logical address 1382334801...
FOUND local heap signature.
VALIDATING version 1 btree at logical address 1382334257...
FOUND version 1 btree signature.
VALIDATING the Symbol table node at logical address 1382335675...
FOUND Symbol table node signature.
VALIDATING the object header at logical address 1382334971...
VALIDATING version 1 object header...
Version 1 object header encountered
VALIDATING the local heap at logical address 1382335555...
FOUND local heap signature.
VALIDATING version 1 btree at logical address 1382335011...
FOUND version 1 btree signature.
VALIDATING the Symbol table node at logical address 1382352746...
FOUND Symbol table node signature.
VALIDATING the object header at logical address 1382352042...
VALIDATING version 1 object header...
Version 1 object header encountered
VALIDATING the local heap at logical address 1382352626...
FOUND local heap signature.
VALIDATING version 1 btree at logical address 1382352082...
FOUND version 1 btree signature.
VALIDATING the object header at logical address 1382336003...
VALIDATING version 1 object header...
Version 1 object header encountered
VALIDATING version 1 btree at logical address 1382336275...
FOUND version 1 btree signature.
VALIDATING the object header at logical address 267213...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
***Error***
version 2 Object Header:Couldn't find CONT signature at addr 1351906296
***End of Error messages***
VALIDATING the object header at logical address 267412...
VALIDATING version 1 object header...
***Error***
Object Header:Unable to read object header data at addr 267444
Version 1 Object Header:Bad version number at addr 267412; Value decoded: 120
***End of Error messages***
VALIDATING the object header at logical address 1096354995...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355142...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355289...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355665...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING version 1 btree at logical address 1096358724...
Segmentation fault: 11
```

(I don't expect that our customer stored a lot of HDF5 objects. In situations where our software could otherwise produce tons of "objects", we deliberately use tables to contain those "objects". The file certainly contains 4D image data.)

Cheers,
Christoph

···

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Elena Pourmal
Sent: Mittwoch, 5. Februar 2014 02:38
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Hdf5 view crashes without error message when opening a (corrupt) file


Hi Christoph,

Did the software crash while writing the file?

It looks like the file is missing an object header. Since we use a metadata cache, this can happen if a program crashes while the cache contains unflushed metadata. HDF5 is not (currently) journaled, so we don't order or group metadata writes to avoid this situation (yet).

Dana

···

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Christoph Laimer
Sent: Wednesday, February 05, 2014 5:33 AM
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Hdf5 view crashes without error message when opening a (corrupt) file


Hi Christoph,

h5check shouldn't give a segmentation fault. We need to fix this, and we will need the file. Could you please upload it to ftp://ftp.hdfgroup.uiuc.edu/pub/incoming/epourmal/

Thank you!

Elena

···

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

On Feb 5, 2014, at 6:42 AM, Dana Robinson <derobins@hdfgroup.org> wrote:


Hi Elena, hi Dana,

I've uploaded "Cropped_ferret182group2confocalalign.zip" (it took more than 4 hours).

At the moment I can't tell whether the software crashed while writing - it could even be that the user pulled the power plug of his computer :wink: ... I'm investigating.

Cheers,
Christoph

···

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Elena Pourmal
Sent: Mittwoch, 5. Februar 2014 16:59
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Hdf5 view crashes without error message when opening a (corrupt) file

***End of Error messages***
VALIDATING the object header at logical address 1096354995...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355142...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355289...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING the object header at logical address 1096355665...
VALIDATING version 2 object header ...
FOUND Version 2 object header signature
VALIDATING version 1 btree at logical address 1096358724...
Segmentation fault: 11

(I don't expect that our customer was storing a lot of HDF5 objects. In situations where our software could potentially produce tons of "objects", we consistently use tables that contain these "objects" - and the file certainly contains 4D image data.)

Cheers,
Christoph

From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Elena Pourmal
Sent: Wednesday, 5 February 2014 02:38
To: HDF Users Discussion List
Subject: Re: [Hdf-forum] Hdf5 view crashes without error message when opening a (corrupt) file

Hi Christoph,

On Feb 3, 2014, at 7:50 AM, Christoph Laimer <Christoph@bitplane.com> wrote:

Dear HDF-Forum

I have a file which causes an immediate crash of HDFView (no error message; tested with the newest versions on Windows and Mac OS X).

HDFView cannot handle big files or files with a lot of objects. This is a known limitation that we are trying to address in the next major release. You will need to use other tools to confirm that the file is indeed corrupted.

I have to assume the file is corrupt. It was written with our software (Imaris), which uses regular calls to the official HDF5 library (v1.8.6). Loading the file with our software causes a crash. In order to figure out what was last written into the file, I opened it in HDFView, but it crashes there too (Windows and Mac OS X). In the debugger on Windows it crashes inside "H5Gopen", but I don't understand what happens there in detail. try/catch(...) doesn't catch it - it must be a severe memory violation.

Could you please try running the h5check tool to see if the HDF5 metadata in this file is corrupted? If h5check doesn't find anything, try running h5ls to see how far you can traverse the file. Information about both tools can be found here: http://www.hdfgroup.org/HDF5/doc/RM/Tools.html
Obviously something went wrong when the file was written. Unfortunately I can't tell what exactly happened there - it was a customer using our software, and "save" was successful. I'd like to do two things:

- Try to understand from the file content what went wrong when the file was written. HDFView would be helpful here if it showed the non-corrupted parts of the file (instead of crashing).
- Show a more graceful message to the user when he tries to open such a file in our software. Ideally "H5Gopen" would return some error code, or at least do something catchable.

Agree on both issues.

Is there a common place for the HDF-forum to upload my file? The zipped file size is 1.3 GB.

I'll check with our sysadmin people. Stay tuned.

Elena

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Please help!

Cheers,
Christoph
_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org


Hi all

Just a hint: simply running out of disk space while writing will cause
similar problems. The file looks corrupted because the eoa (end of
allocation) is written when the request is made to allocate space on disk,
not when the disk space is actually allocated. This later causes some
out-of-bounds accesses which crash the calling process. Upon opening, the
file looks corrupted because eoa != eof (end of file). Further assertions
and eventually crashes might occur.

Cheers

Dimitris Servis

···

2014-02-07 Christoph Laimer <Christoph@bitplane.com>:

Hi Elena,

Hi Dana

I've uploaded "Cropped_ferret182group2confocalalign.zip" (it took more than 4 hours).

At the moment I can't tell whether the software crashed while writing - it could even be that the user pulled the power plug of his computer :wink: ... I'm investigating.

Cheers,

Christoph


Hi Dimitris

I can't tell if our customer ran out of disk space.

I'll double-check what our software does in this situation. Potentially the HDF library signals out-of-disk-space while a file is being written!? Our software would then need to inform the user, and probably delete the file (if it was a new file). Our software also uses HDF functions to append data to an existing file - handling this outside the HDF library could be a challenge.

Could it make sense for H5Gopen to check eoa == eof? Or rather: I assume that if our software implemented that test, it would prevent the crash. Are there valid HDF files where eoa and eof differ? (Sorry for my stupid questions.)

Cheers,
Christoph
