Incorrect linking when using the "configure build" method

Dear All,

I encountered this issue just recently. We build HDF5 for our data analysis software by hand, on Scientific Linux 6. For a “standalone build” we would configure HDF5 like:

../hdf5-1.10.1/configure --prefix=/home/krasznaa/projects/hdf5/hdf5-1.10.1-install --with-zlib=/usr/include,/usr/lib64 --with-pthread=/usr/include,/usr/lib --enable-cxx --enable-hl --enable-threadsafe --enable-unsupported --enable-build-mode=production

By “standalone”, in this context, I mean that we only pick up dependencies from the system (or from other components that we build ourselves). This is in contrast to the build we do against LCG releases, in which case the configuration command looks something like:

../hdf5-1.10.1/configure --prefix=/home/krasznaa/projects/hdf5/hdf5-1.10.1-install --with-zlib=/cvmfs/sft.cern.ch/lcg/releases/LCG_93/zlib/1.2.11/x86_64-slc6-gcc62-opt/include,/cvmfs/sft.cern.ch/lcg/releases/LCG_93/zlib/1.2.11/x86_64-slc6-gcc62-opt/lib --with-pthread=/usr/include,/usr/lib --enable-cxx --enable-hl --enable-threadsafe --enable-unsupported --enable-build-mode=production

I.e. in that case we take ZLIB from a central installation. (It’s a long story…)

Now, when I tell configure to take ZLIB explicitly from /usr/lib64, something very unfortunate happens. If we already have a version of HDF5 installed on the system (which would be under /usr/lib64 as well), then the libraries in our build that need libhdf5.so themselves end up linked against /usr/lib64/libhdf5.so.6. For example:

[bash][pcadp02]:build > ldd -r ../hdf5-1.10.1-install/lib/libhdf5_cpp.so
	linux-vdso.so.1 =>  (0x00007ffe9f11f000)
	libhdf5.so.6 => /usr/lib64/libhdf5.so.6 (0x00007fa1da2eb000)
	librt.so.1 => /lib64/librt.so.1 (0x00007fa1da0e2000)
	libpthread.so.0 => /lib64/libpthread.so.0 (0x00007fa1d9ec5000)
	libz.so.1 => /lib64/libz.so.1 (0x00007fa1d9caf000)
...

This leads to warnings like the following when we later try to use this libhdf5_cpp.so library:

/cvmfs/sft.cern.ch/lcg/contrib/binutils/2.28/x86_64-slc6/bin/ld: warning: libhdf5.so.6, needed by /build1/atnight/localbuilds/nightlies/21.2/build/install/AnalysisBaseExternals/21.2.25/InstallArea/x86_64-slc6-gcc62-opt/lib/libhdf5_cpp.so, may conflict with libhdf5.so.101

So… My questions are:

  • Is the “configure” build still maintained, or is the development only focusing on the CMake style build by now?
  • Was this issue known already?
  • Could this behaviour be fixed, so that the HDF5 build always picks up the libraries it has just built for its own linking, and only takes external libraries (pthreads and ZLIB) from elsewhere?

As far as I can see, this issue only affects the configure based build; the CMake build knows how to link against the correct libraries. But for some technical reasons we’d like to rely on the configure based build for now.
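For the record, the one workaround I can think of on our side (an untested sketch, not a proper fix) would be to bake our own install prefix into the RPATH at configure time, so that the freshly built libhdf5.so shadows the copy in /usr/lib64:

```shell
# Untested sketch: the same configure call as above, plus an explicit
# RPATH pointing at our own install prefix, so the run-time linker
# prefers the freshly built libhdf5.so over /usr/lib64/libhdf5.so.6.
PREFIX=/home/krasznaa/projects/hdf5/hdf5-1.10.1-install
../hdf5-1.10.1/configure --prefix="$PREFIX" \
    LDFLAGS="-Wl,-rpath,$PREFIX/lib" \
    --with-zlib=/usr/include,/usr/lib64 \
    --with-pthread=/usr/include,/usr/lib \
    --enable-cxx --enable-hl --enable-threadsafe \
    --enable-unsupported --enable-build-mode=production
```

But we’d of course prefer the build system to get the link order right on its own.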

Cheers,
Attila

Hi Attila,

The Autotools are definitely still supported. Probably more so than CMake.

I’ll take a look at how we are using the load paths in the Autotools and update this reply when I know more. It sounds like we might need to change how we are using the paths.

btw, you probably don’t need the --with-pthread configure option unless you are using a special version of pthreads. configure usually picks up the system pthreads just fine.

Dear Dana,

I have to admit that I didn’t spend too much time with finding the optimal settings for the configure command. I took the configuration that LCG uses, and only modified that slightly for our use case.

Thanks for the tip, I may very well remove that option from the configuration in the future.

Cheers,
Attila

P.S. Our own build configuration is here if you’re interested:

btw, you also might not need the --with-zlib option if zlib is in the normal search path. On POSIX-y systems built with configure, zlib compression is enabled by default and we’ll usually find the system zlib without specifying the include and lib directories.

Using the --with-zlib option was just the simplest thing in our case. (https://gitlab.cern.ch/atlas/atlasexternals/blob/1.0/External/HDF5/CMakeLists.txt) Depending on the setup of our project, the

find_package( ZLIB REQUIRED )

line may find the zlib library in different locations. We need to force HDF5 to use the same version of zlib that the rest of our project uses. So even in the case where we end up picking up zlib from /usr, where we could in principle leave it to HDF5 to find zlib on its own, I don’t like that approach programmatically: it exposes our configuration to implementation details of the HDF5 build system, which may very well change in the future without explicit notice.
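In other words, the superproject already knows where zlib lives, and we just forward that location to HDF5. Roughly like this (a sketch only; ZLIB_ROOT is a placeholder name for whatever directory find_package( ZLIB ) resolved to, and the LCG path is just the example from earlier in the thread):

```shell
# Sketch: forward the zlib that the rest of the project uses into the
# HDF5 configure step, instead of letting HDF5 search on its own.
# ZLIB_ROOT stands in for the location find_package( ZLIB ) returned.
ZLIB_ROOT=/cvmfs/sft.cern.ch/lcg/releases/LCG_93/zlib/1.2.11/x86_64-slc6-gcc62-opt
../hdf5-1.10.1/configure --prefix="$PWD/hdf5-install" \
    --with-zlib="$ZLIB_ROOT/include,$ZLIB_ROOT/lib"
```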

If you are using CMake, you could build HDF5 with CMake instead of the Autotools; that gives you more control over the options. See our H4H5 tool source for how we use ExternalProject.

Also, if you build HDF5 from source with CMake, you can install it and have CMake use the installed config files for imported-target support.

In this case, see our HDF5Examples source included with the CMake source bundle.

Allen

Hi Allen,

Unfortunately the CMake based build of HDF5 is not appropriate for us. :frowning_face: You see, that sort of build (at least with version 1.10.2) doesn’t produce the h5cc and h5c++ scripts. While our own code doesn’t care about those scripts, CMake’s own FindHDF5.cmake module relies heavily on them.

What I found was that even if we provided our own version of HDF5, and put it in a directory that was “further ahead” in CMAKE_PREFIX_PATH than another version, that other version would still win out if it comes with the h5cc and h5c++ scripts, and our build doesn’t.

While this can be seen as a bug in FindHDF5.cmake, that doesn’t help us much… We just have to use configure for the moment to be able to pick up our own build instead of any other available build.
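For reference, the closest thing to a workaround that I’m aware of (untested in our setup, so take it with a grain of salt) is to point FindHDF5 at one specific install explicitly via HDF5_ROOT, bypassing the CMAKE_PREFIX_PATH search; whether that wins over a build that ships h5cc I haven’t verified:

```shell
# Sketch: steer CMake's FindHDF5 module to one specific HDF5 install.
# "../source" is a placeholder for the project's source directory.
cmake -DHDF5_ROOT=/home/krasznaa/projects/hdf5/hdf5-1.10.1-install ../source
```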

Finally: we unfortunately also can’t use the CMake code generated by the CMake based build of HDF5. That’s not the fault of HDF5, it’s the fault of CMake. You see, CMake by construction assumes that if ExternalA is in location /somewhere/ExternalA at build time, it will be in exactly the same location at runtime as well. But in our builds that’s not the case: we build relocatable RPMs from our code, which we install in vastly different locations on different platforms/clusters/etc. And this sort of relocatability can only be achieved by using CMake’s own FindHDF5.cmake module and “massaging” its output a bit. See:

I know that I didn’t describe the problem in full detail, but trust me, we just can’t rely on the code generated by CMake’s export mechanism…

Cheers,
Attila

Hi @Dana_Robinson,

I don’t mean to nag, but do you expect to fix the Autotools build in this situation in the near future? Specifically, in the case where the system already has libhdf5.so.6 installed, we’re seeing multiple versions of HDF5 linked in after we build HDF5 1.10.1. Our deployments still work (most of the time) because most of the systems they run on also have this library, but we’d like to fix this properly.

Dan