hdf java linux64 binaries

Hi,

I am having a problem with the HDF Java Linux 64-bit binaries. We've been using HDF Java for quite some time now on Windows, but I'm unable to get it running on our Linux 64-bit systems using the 2.7 Linux 64-bit binaries downloaded from the production site.

First, here is some information on the Linux host and Java version I am using:

[chrisbr@pdelogin08 ~/temp]$ uname -a
Linux pdelogin08.amd.com 2.6.9-55.ELsmp #1 SMP Fri Apr 20 16:36:54 EDT 2007 x86_64 x86_64 x86_64 GNU/Linux
[chrisbr@pdelogin08 ~/temp]$ java -version
java version "1.6.0_04"
Java(TM) SE Runtime Environment (build 1.6.0_04-b12)
Java HotSpot(TM) 64-Bit Server VM (build 10.0-b19, mixed mode)

I have a very simple Java test program that just calls H5.H5Fopen() and H5.H5Fclose(), but when I run it the JVM crashes with a native dump (shown below).
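
The program is essentially the following minimal sketch (a simplified stand-in for com.amd.hdf.test.TestHdf; the file name is just the test file from my directory):

import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;

public class TestHdf {
    public static void main(String[] args) throws Exception {
        // Open the HDF5 file read-only, report the id, and close it again.
        int fileId = H5.H5Fopen("hydra_sort.h5",
                HDF5Constants.H5F_ACC_RDONLY,
                HDF5Constants.H5P_DEFAULT);
        System.out.println("Opened file, id = " + fileId);
        H5.H5Fclose(fileId);
    }
}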

[chrisbr@pdelogin08 ~/temp]$ ls
hdftest.jar hydra_sort.h5 jhdf5.jar jhdf5obj.jar jhdf.jar jhdfobj.jar libjhdf5.so libjhdf.so
[chrisbr@pdelogin08 ~/temp]$ java -cp "hdftest.jar:jhdf5.jar:jhdf5obj.jar:jhdf.jar:jhdfobj.jar" -Djava.library.path="/home/chrisbr/temp" com.amd.hdf.test.TestHdf

#
# An unexpected error has been detected by Java Runtime Environment:
#
# SIGFPE (0x8) at pc=0x000000313f807827, pid=424, tid=1076017504
#
# Java VM: Java HotSpot(TM) 64-Bit Server VM (10.0-b19 mixed mode linux-amd64)
# Problematic frame:
# C [ld-linux-x86-64.so.2+0x7827]
#
# An error report file with more information is saved as:
# /home/chrisbr/temp/hs_err_pid424.log
#
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
Abort

I looked through the documentation but didn't see any notes about which Linux versions are supported or unsupported. Did I miss something? I was very much hoping to get this running with the provided binaries, as I'm not much of a C expert and would like to avoid building from source myself.

Any help or tips would be greatly appreciated.

Regards,
Chris

Chris,
The first question I need to ask: can you run the hdfview.sh script from the bin folder?

Allen

Hello Allen,

Thanks for your response. I am unable to run hdfview.sh from the bin folder either. I encounter a similar JVM dump when I attempt this:

[chrisbr@pdelogin08 bin]$ ./hdfview.sh

#
# An unexpected error has been detected by Java Runtime Environment:
#
# SIGFPE (0x8) at pc=0x000000313f807827, pid=18472, tid=1092860256
#
# Java VM: Java HotSpot(TM) 64-Bit Server VM (10.0-b19 mixed mode linux-amd64)
# Problematic frame:
# C [ld-linux-x86-64.so.2+0x7827]
#
# An error report file with more information is saved as:
# /home/chrisbr/temp2/hdf-java/bin/hs_err_pid18472.log
#
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
./hdfview.sh: line 92: 18472 Aborted $JAVAPATH/java -Xmx1000m -Djava.library.path=$LD_LIBRARY_PATH ncsa.hdf.view.HDFView -root $HDFVIEW_HOME $*

Thanks,
Chris

Hi Barbara,

Thanks for the tip; it looks like you are exactly right. I confirmed that this works on a more recent Linux kernel, although I still need to get it working on this older version. I compiled the C code with no problem and then moved on to building the Java code on top of it. However, the configure script is giving me some kind of write error, and I'm not sure what to make of it. Based on the script output below, can you tell me what needs to be writable?

Thanks,
Chris

[chrisbr@pdelogin08 hdf-java]$ ./runconfig-example.sh
checking if tr works... yes
checking for gawk... gawk
checking if expr works... yes
checking for gcc... gcc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ANSI C... none needed
checking whether make sets $(MAKE)... yes
checking for a BSD-compatible install... /usr/bin/install -c
checking how to run the C preprocessor... gcc -E
checking for egrep... grep -E
checking for ANSI C header files... yes
checking for ceil in -lm... yes
checking for rm... /bin/rm
checking for find... /usr/bin/find
checking build system type... x86_64-pc-linux-gnuoldld
checking host system type... x86_64-pc-linux-gnuoldld
checking target system type... x86_64-pc-linux-gnuoldld
FYI the target here is: x86_64-pc-linux-gnuoldld
checking gzip compression... yes
zlib found: /home/chrisbr/temp/zlib-1.2.3-MacOSX-intel/lib/libz.a
checking jpeg compression... no
checking szip compression... suppressed
checking HDF4 library... suppressed
checking HDF5 library... yes
HDF5 found: /home/chrisbr/hdf-1.8.7/lib
checking dependencies for HDF5 library... OK
checking HDF4to5 library... no
checking jni.h usability... yes
checking jni.h presence... yes
checking for jni.h... yes
checking for java... /nfs/local/.package/jdk-1.6.0_04/bin/java
checking for javac... /nfs/local/.package/jdk-1.6.0_04/bin/javac
checking for javadoc... /nfs/local/.package/jdk-1.6.0_04/bin/javadoc
checking for jar... /nfs/local/.package/jdk-1.6.0_04/bin/jar
configure: error: : not writable

Hi Chris,

I noticed that you are using Linux 2.6.9. We built the HDF-Java 2.7 software on Linux 2.6.18, and the C layer was built with gcc 4.1.2.

A few years ago there was an issue where HDF-Java (2.5) built with gcc 4.*
failed on machines that had an older C library (gcc 3.4).

I'm wondering if this could be the same issue?
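
If you want to compare, a couple of ordinary commands will show what the
machine has (just an illustration; the exact output varies by distribution):

  gcc --version    # compiler installed on the host
  ldd --version    # the first line reports the glibc (C library) version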

-Barbara

Hi Chris,

My guess is that no directory was specified in which to install the
software. In that case configure tests whether the software can be installed
in the default /usr/local/ directory, but you don't have write access to it.

Could that be the issue?

The --prefix option specifies the directory in which you wish to
install the software.

You can type ./configure --help to see the options that can be used
with configure.
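
For example, something along these lines (the install path here is only an
illustration; point --prefix at any directory you can write to and keep the
other options your runconfig-example.sh already passes):

  ./configure --prefix=/home/chrisbr/hdf-java-install <other options>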

-Barbara

Hi Barbara,

Yes, that ended up being the issue (I figured it out shortly after sending my email). I was able to get past the configure script, but I hit a number of additional errors when I tried to build afterwards. I'm giving up on that route for now and am instead going to see if I can get the kernel upgraded on the host I'm using. That sounds like the easier approach in this case, as I'm not much of a C expert. But again, thanks for all of the help you have provided.

-Chris
