HDFView generates a memory error


HDFView generates an error when we open the “DRM Forces” table.
We installed “HDFView-3.2.0-ubuntu2004_64.tar.gz” on a vSphere VM; the VM has 36 CPUs and 96 GB of memory.
When we open certain tables, we get a memory error:
“An error occurred while loading data for the table: failed to read scalar dataset Out of Memory”
Do we need to change the syntax of our scripts for this release?


3.2.0 was released to take advantage of the HDF5 1.12 release. Unfortunately, both the 3.2.0 and 3.1.4 (1.10-based) releases have problems with references inside a container object (arrays, compounds, etc.).

As has been the case, HDFView was never designed to handle really large datasets; that redesign has been waiting for resources (developers and other kinds) to become available.


You might try raising the JVM memory limit in hdfview.sh:

export JAVAOPTS=-Xmx1024M
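If 1024 MB is still too small, a larger heap can be set the same way; on a 96 GB VM there is room for a much higher limit. A sketch, assuming your hdfview.sh reads `$JAVAOPTS` (the exact variable name can differ between HDFView releases, so check the launch script):

```shell
# Give the HDFView JVM a larger maximum heap before launching.
# 8g is an example value; pick one that fits the host (the VM here has 96 GB).
export JAVAOPTS=-Xmx8g
./hdfview.sh
```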



Thanks for the tips :slight_smile:
I tested the suggestion `export JAVAOPTS=-Xmx1024M` with several values, but it’s not enough. Now I limit the columns and rows in HDFView with the “Show Data with Options” dialog. This lets me perform small calculations on subsets of the data, and it confirms the problem is not an installation error. Thanks
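For very large tables it can also help to avoid the GUI entirely and read only a hyperslab from the command line. A sketch using h5dump’s subsetting options; the file name and dataset path below are placeholders for your own:

```shell
# Print only a 10x5 block of the dataset, starting at the origin,
# instead of loading the whole table into memory.
# "forces.h5" and "/DRM Forces" are example names -- substitute yours.
h5dump --dataset="/DRM Forces" --start="0,0" --count="10,5" forces.h5
```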