I opened an HDF5 file containing 6 datasets, each about 0.5 GB in size (3,338,258 rows × 39 columns, float).
When I tried to open the datasets, HDFView popped up a warning dialog saying "Exception.OutOfMemory". To my surprise, I could still open 2 of the 6 datasets before the error appeared (each dataset is the same size).
I found some explanation in "<A href="http://ftp.hdfgroup.org/hdf5-quest.html#jbigd">http://ftp.hdfgroup.org/hdf5-quest.html#jbigd</A>", but I am not sure whether this is a Java virtual machine memory exception or a dataset corrupted by a bug in my own code.
Normally, when my program errors out while writing, I cannot open the HDF5 file or its datasets with h5dump or HDFView at all. So I just want to confirm: is this caused by a bug in my code, or is it simply a Java virtual machine memory limitation?