Compiler error with HDF5 in Eclipse C++: H5pubconf.h: No such file or directory

I have downloaded the HDF5 library from https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.21/src/ and included it in my C++ project.

I’m trying to run the create.cpp example, so I copied it into my project. The problem is that some files or directories can’t be found. The name of the project is PruebaHDF5.

#ifdef OLD_HEADER_FILENAME
#include <iostream.h>
#else
#include <iostream>
#endif
#include <string>

#include "c++/src/H5Cpp.h"

#ifndef H5_NO_NAMESPACE
    using namespace H5;
#endif

const H5std_string      FILE_NAME( "SDS.h5" );
const H5std_string      DATASET_NAME( "IntArray" );
const int       NX = 5;                    // dataset dimensions
const int       NY = 6;
const int       RANK = 2;

int main (void)
{
   /*
    * Data initialization.
    */
   int i, j;
   int data[NX][NY];          // buffer for data to write
   for (j = 0; j < NX; j++)
   {
      for (i = 0; i < NY; i++)
         data[j][i] = i + j;
   }
   /*
    * 0 1 2 3 4 5
    * 1 2 3 4 5 6
    * 2 3 4 5 6 7
    * 3 4 5 6 7 8
    * 4 5 6 7 8 9
    */

   // Try block to detect exceptions raised by any of the calls inside it
   try
   {
      /*
       * Turn off the auto-printing when failure occurs so that we can
       * handle the errors appropriately
       */
      Exception::dontPrint();

      /*
       * Create a new file using H5F_ACC_TRUNC access,
       * default file creation properties, and default file
       * access properties.
       */
      H5File file( FILE_NAME, H5F_ACC_TRUNC );

      /*
       * Define the size of the array and create the data space for fixed
       * size dataset.
       */
      hsize_t     dimsf[2];              // dataset dimensions
      dimsf[0] = NX;
      dimsf[1] = NY;
      DataSpace dataspace( RANK, dimsf );

      /*
       * Define datatype for the data in the file.
       * We will store little endian INT numbers.
       */
      IntType datatype( PredType::NATIVE_INT );
      datatype.setOrder( H5T_ORDER_LE );

      /*
       * Create a new dataset within the file using defined dataspace and
       * datatype and default dataset creation properties.
       */
      DataSet dataset = file.createDataSet( DATASET_NAME, datatype, dataspace );

      /*
       * Write the data to the dataset using default memory space, file
       * space, and transfer properties.
       */
      dataset.write( data, PredType::NATIVE_INT );
   }  // end of try block

   // catch failure caused by the H5File operations
   catch( FileIException error )
   {
      error.printError();
      return -1;
   }

   // catch failure caused by the DataSet operations
   catch( DataSetIException error )
   {
      error.printError();
      return -1;
   }

   // catch failure caused by the DataSpace operations
   catch( DataSpaceIException error )
   {
      error.printError();
      return -1;
   }

   // catch failure caused by the DataType operations
   catch( DataTypeIException error )
   {
      error.printError();
      return -1;
   }

   return 0;  // successfully terminated
}

As you can see, I also had to add the correct path to H5Cpp.h.

First, the error message is: fatal error: hdf5.h: No such file or directory PruebaHDF5 line 15, external location: C:…\H5Include.h C/C++ Problem. In H5Include.h I changed the include of hdf5.h to src/hdf5.h and that error went away.
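To illustrate, the change I made inside H5Include.h looks roughly like this (I am omitting the rest of the header, and the src/ prefix only works because of where the unpacked sources sit relative to my project):

// H5Include.h — HDF5 C++ wrapper header
// original directive:
// #include "hdf5.h"
// changed so the compiler finds the header in the unpacked source tree:
#include "src/hdf5.h"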

But now another problem appears: fatal error: H5pubconf.h: No such file or directory PruebaHDF5 line 15, external location: C:…\H5public.h C/C++ Problem, and I can’t find H5pubconf.h in any directory, so I can’t use the library. Does anyone have any idea how to solve this? I haven’t found any answer for this. Thank you very much for your help.

While I am unable to help with the current official HDF5 C++ distribution, I recommend giving H5CPP a try. It is a header-only persistence library for modern C++, furnished with compiler-assisted reflection, and the number of examples is gradually growing. The project supports both the serial and MPI versions of HDF5, as well as the major linear algebra libraries and std::vector.
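As a rough sketch of what the calling convention looks like when persisting a std::vector (the header name and call signatures are taken from the H5CPP examples, so treat the details as approximate):

#include <vector>
#include <h5cpp/all>   // umbrella header shipped with H5CPP

int main()
{
   std::vector<double> v(100, 1.0);                         // data to persist
   h5::fd_t fd = h5::create("example.h5", H5F_ACC_TRUNC);   // RAII file handle
   h5::write(fd, "my/dataset", v);                          // dataset is created on the fly
   // fd closes automatically when it goes out of scope
}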

The library comes with a custom implementation of a packet table, h5::append, that is binary compatible with regular datasets and has measured throughput in the ballpark of the underlying filesystem.
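A minimal sketch of the append pattern, again following the shipped examples (property names such as h5::max_dims and h5::chunk are my recollection of the API and may need adjusting):

#include <h5cpp/all>

int main()
{
   h5::fd_t fd = h5::create("stream.h5", H5F_ACC_TRUNC);
   // create an extendable, chunked dataset and wrap it in a packet-table handle
   h5::pt_t pt = h5::create<double>(fd, "measurements",
         h5::max_dims{H5S_UNLIMITED}, h5::chunk{1024} | h5::gzip{9});
   for (int i = 0; i < 1000; ++i)
      h5::append(pt, static_cast<double>(i));   // buffered, high-throughput appends
   // the packet table flushes and closes when pt goes out of scope
}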

All H5CPP handles are binary compatible with the underlying hid_t and by default may be passed to and from C API calls.
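For instance, mixing a handle into a plain C API call should look roughly like this (a sketch; it relies on the implicit hid_t conversion described above):

#include <h5cpp/all>
#include <hdf5.h>

int main()
{
   h5::fd_t fd = h5::create("interop.h5", H5F_ACC_TRUNC);
   hsize_t size = 0;
   H5Fget_filesize(fd, &size);   // the H5CPP handle is passed straight to the C API
   return size > 0 ? 0 : 1;
}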

Here are the ISC’19 presentation slides and my GitHub page to download it from. Binaries are provided for Debian and RPM based distros; the Windows version is known to work but is not officially supported.

best wishes:
steven

Thanks for answering. I wrote to HDF support, and they said the main problem is that HDF5 doesn’t support Eclipse with MinGW, and that I should use another compiler with Eclipse.