I had a hard time building hdfview. I encountered numerous sets of instructions between the old and new support sites, including one aimed at the CMake build, plus variations due to 32- and 64-bit identifiers. I eventually wound up downloading and attempting to build all these pieces…
Yes, there has been a lot of development and not much cleanup.
The history of HDFView 2 (and the references to hdf-java) is tied to the HDF5 1.8 versions, which represented IDs as ints.
The JNI code was pulled out of HDFView 2, along with its CMake build system, into its own hdf-java project, and HDFView 2.14 was converted to the ANT build system.
The introduction of HDF5 1.10 required that IDs be longs. We decided to include the Java/JNI code inside the HDF4 and HDF5 source code (a new --enable-java option for autotools and HDF5_BUILD_JAVA in CMake). These library builds feed into HDFView 3, which uses the ANT build system.
Soon to be released is HDFView 3.0, based on HDF 4.2.14 and HDF5 1.10.3. It has been refactored to use the SWT UI toolkit instead of Swing, and the class structures have been overhauled. The HDF4 and HDF5 Java libraries will need to be installed to build from source.
Now, we have not been able to address the loading of really large files, although files now open read-only by default. There are some Array of Strings and Array of Compounds issues, but most other datatype combinations should display.
Ok, great to hear about read-only default change. I neglected to ask about one other thing…any chance you can make HDF(4) optional rather than required? And, probably the HDF(4) folks would like HDF5 optional too, I dunno? But, I’d rather not have to build HDF(4) when I don’t need it just to get this tool.
We hope to provide clear build instructions for this next release.
Building from source will require hdf 4.2.14 (optional) and hdf5 1.10.3 binaries/installs.
There is a "build.properties" file that has environment variables for the user-specific requirements. Either set the environment variables or replace them with hard-coded paths.
Then execute ant. Some knowledge of ant targets may be needed; however, executing "ant run" within the source folder will compile, create an executable jar, and run HDFView 3.
“ant package” will create an install file.
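A minimal sketch of the sequence described above. The HDFLIBS/HDF5LIBS variable names and the install prefixes are assumptions, not confirmed in this thread; check your checkout's build.properties for the names it actually reads.

```shell
# Placeholder paths and variable names -- adjust to match build.properties
export HDFLIBS=/opt/hdf4      # prefix where HDF4 was installed (assumed name)
export HDF5LIBS=/opt/hdf5     # prefix where HDF5 was installed (assumed name)

ant run        # compile, create the executable jar, and launch HDFView 3
# ant package  # alternatively, create an install file
```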
BTW…the verbiage “…will require hdf 4.2.14 (optional)” is a tad confusing but I think you mean if hdf is used, version 4.2.14 is required. It won’t work with earlier versions of hdf, right?
Regarding the basics…I ran into problems with ant and the antcontrib jar file and went googling. I think the problem I ran into is pretty common with ant in general, and maybe even with ant+hdfview. At one point, I wound up having to hard-code the path to the antcontrib-1b03.jar file from one of the Apache binary downloads into hdfview's build.xml file to get it to build. So, something to watch out for and simplify, if possible, going forward.
First, use the pre-release candidate source.
Second, if you do not want hdf4 support, set "hdf.lib.dir=" (empty) in the build.properties file.
Third, building from source expects that the proper settings are in the build.properties file and that the hdf/hdf5 libraries have been prebuilt/installed.
The build.properties file has default environment variables that can be overwritten or set in the environment.
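For example, a minimal build.properties might look like the sketch below. The "hdf.lib.dir" property name comes from this thread; "hdf5.lib.dir" and the paths are assumptions for illustration.

```shell
# Sketch: write a minimal build.properties.
# 'hdf.lib.dir' is mentioned in this thread; 'hdf5.lib.dir' and the
# paths are assumed placeholders.
cat > build.properties <<'EOF'
# Leave empty to build without HDF4 support
hdf.lib.dir=
# Location of a prebuilt HDF5 install (placeholder path)
hdf5.lib.dir=/opt/hdf5/lib
EOF
```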
Just going to chip in my experience, hoping it will help others.
What did I try to do:
Compile HDFView 3.1.0 on Ubuntu 18.04 LTS, due to the lack of official distribution packages and the one in the Ubuntu repository being outdated and unable to properly open my HDF file.
How did I do it:
Download sources for HDF4 and HDF5, as - again - the packages from the official Ubuntu repository seem outdated or unsuitable for compiling HDFView.
Compile HDF4 by running (in a build folder):
configure --enable-java --disable-fortran; make -j; make check; make install
(I don't have a Fortran compiler, nor do I see the need for one.)
Compile HDF5 by running (in a build folder):
configure --enable-java; make -j; make check; make install
Set environment variables to point to the build/hdfX folders and JAVA_HOME to the openjdk-12 folder. I only realized later (after also trying with 1.8) that Java 11 was apparently the correct and supported version.
Run ant on the HDFView sources. This fails because the output of jdeps is not correctly parsed: jdeps also produces lines for missing dependencies (such as twelvemonkeys and what not) as well as warnings for split packages.
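One way around this (a sketch of the idea, not the actual build.xml fix) is to filter the jdeps output before using the module list, dropping "not found" entries and warning lines. The sample input below is simulated; real jdeps wording may differ.

```shell
# Simulated jdeps output (the real tool's wording may differ)
jdeps_out='java.base
com.twelvemonkeys.imageio -> not found
Warning: split package: javax.annotation
java.desktop'

# Keep only clean module names: drop "not found" entries and warning lines
modules=$(printf '%s\n' "$jdeps_out" | grep -v 'not found' | grep -v '^Warning:' | sort -u)
printf '%s\n' "$modules"
```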
Making these modifications to build.xml
<!-- Generate list of needed Java modules by using `jdeps` on all JAR files in the release directory -->