HDF5 1.8.11-pre2 dynamically loaded filter issues

Hi,

I am trying to use the new dynamically loaded filter plugin feature in 1.8.11-pre2 by adding the filter information to an existing library:

                 U H5Oexists_by_name
                 U H5Oget_info_by_name
                 U H5Oopen
000000000003ec86 T H5PLget_plugin_info
000000000003ec80 T H5PLget_plugin_type
                 U H5P_CLS_DATASET_CREATE_g
                 U H5P_CLS_FILE_ACCESS_g
                 U H5Pclose
[partial output from using 'nm' on my shared object which should be used as a filter]

I have found that I need to register the plugin manually with 'H5Zregister' in order to write data with it. This appears to work - it produces files of the expected size in a believable amount of time - but the files then cannot be read by any program that doesn't explicitly load the filter, such as 'h5dump'. I have not written any code to read the file back through this filter, so I have not verified that the data is actually correct. Without manually registering the filter, I get the following errors when trying to set a custom filter using 'H5Pset_filter':

HDF5-DIAG: Error detected in HDF5 (1.8.11-pre2) thread 0:
  #000: H5Pocpl.c line 753 in H5Pset_filter(): failed to call private function
    major: Property lists
    minor: Can't set value
  #001: H5Pocpl.c line 814 in H5P__set_filter(): failed to load dynamically loaded plugin
    major: Data filters
    minor: Unable to load metadata into cache
[errors trying to dynamically load a filter plugin before writing data to a file]

After reading through 'HDF5DynamicallyLoadedFilters.pdf' I set the environment variable 'HDF5_PLUGIN_PATH' to the appropriate value for my plugin library (the directory containing the shared object), but this only changes the error messages in the output of 'h5dump':

HDF5-DIAG: Error detected in HDF5 (1.8.11) thread 0:
  #000: H5Dio.c line 182 in H5Dread(): can't read data
    major: Dataset
    minor: Read failed
  #001: H5Dio.c line 550 in H5D__read(): can't read data
    major: Dataset
    minor: Read failed
  #002: H5Dchunk.c line 1837 in H5D__chunk_read(): unable to read raw data chunk
    major: Low-level I/O
    minor: Read failed
  #003: H5Dchunk.c line 2868 in H5D__chunk_lock(): data pipeline read failed
    major: Data filters
    minor: Filter operation failed
  #004: H5Z.c line 1150 in H5Z_pipeline(): required filter 'HDF5 CBF compression filters' is not registered
    major: Data filters
    minor: Read failed
h5dump error: unable to print data
[errors from 'h5dump' with or without the environment variable]

I have not been able to try putting the plugin library in '/usr/local/hdf5/lib/plugin', as I do not have write permission there.

I also get the following errors from the plugin test, when executed as './plugin' from within the 'test' directory:

Testing DYNLIB1 filter
*FAILED*
HDF5-DIAG: Error detected in HDF5 (1.8.11) thread 0:
  #000: H5Pocpl.c line 753 in H5Pset_filter(): failed to call private function
    major: Property lists
    minor: Can't set value
  #001: H5Pocpl.c line 814 in H5P__set_filter(): failed to load dynamically loaded plugin
    major: Data filters
    minor: Unable to load metadata into cache
Testing Testing DYNLIB3 filter for group *FAILED*
HDF5-DIAG: Error detected in HDF5 (1.8.11) thread 0:
  #000: H5Pocpl.c line 753 in H5Pset_filter(): failed to call private function
    major: Property lists
    minor: Can't set value
  #001: H5Pocpl.c line 814 in H5P__set_filter(): failed to load dynamically loaded plugin
    major: Data filters
    minor: Unable to load metadata into cache
  #002: H5PL.c line 293 in H5PL_load(): search in paths failed
    major: Plugin for dynamically loaded library
    minor: Can't get value
  #003: H5PL.c line 440 in H5PL__find(): can't close directory: Bad file descriptor
    major: File accessibilty
    minor: Close failed
[errors from first failing plugin test]

These are the first errors to occur, and they happen with both sets of file format tests ('new' and 'old'). There are further errors afterwards which appear to be caused by problems closing files.

I would like to know whether this is a minor configuration error on my part (and if so, how to fix it) or a genuine bug in the library.

I am using Red Hat Enterprise Linux 5.9 with Linux kernel 2.6.18-348.4.1.el5.

Thanks.

···

--
This e-mail and any attachments may contain confidential, copyright and or privileged material, and are for the use of the intended addressee only. If you are not the intended addressee or an authorised recipient of the addressee please notify us of receipt by returning the e-mail and do not use, copy, retain, distribute or disclose the information in or attached to the e-mail.
Any opinions expressed within this e-mail are those of the individual and not necessarily of Diamond Light Source Ltd.
Diamond Light Source Ltd. cannot guarantee that this e-mail or any attachments are free from viruses and we cannot accept liability for any damage which you may sustain as a result of software viruses which may be transmitted in or with the message.
Diamond Light Source Limited (company no. 4375679). Registered in England and Wales with its registered office at Diamond House, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom

This also occurred with version 1.8.11.

I eventually solved it by adding the directory containing libhdf5.so to the rpath of my filter library and setting 'HDF5_PLUGIN_PATH' to the directory containing that filter library. The telltale symptom was that 'ldd /path/to/plugin.so' reported it could not find the HDF5 library.

The plugin tests (cd /path/to/hdf5/test && ./plugin) continue to fail with the same errors as above, but I still don't know whether that is the correct way to run them.

···

________________________________________
From: jonathan.sloan@diamond.ac.uk [jonathan.sloan@diamond.ac.uk]
Sent: 16 July 2013 11:35
To: hdf-forum@lists.hdfgroup.org
Subject: [Hdf-forum] HDF5 1.8.11-pre2 dynamically loaded filter issues


Jonathan,

Let's focus on the tests in the HDF5 library first. Did you use "make check" to run the test suite after building the library? If you want to run the plugin test by hand, you should run the script "test_plugin.sh", not "plugin" directly.

If you run "make check" or "test_plugin.sh" and still get failures, I would like to see the output of your configure step - in particular, the table called "Summary of the HDF5 Configuration" at the end of that output.

Thanks.

Ray

···

On Jul 18, 2013, at 9:24 AM, <jonathan.sloan@diamond.ac.uk> wrote:


_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org