Hello Elena,
I have attached the file.
I log live data from a socket to an .h5 file (catdata.h5).
In parallel, I read data from the same file using the code below.
I call this program once a minute to read the data from the log.
It works smoothly for some time, then throws the error below and breaks.
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <hdf5.h>
#include <hdf5_hl.h>
#include <time.h>

#define LOG_FILE "/tmp/catdata.h5"
#define TABLE "deltalog"

/* In-memory layout of one table record: offsets/sizes must match the struct. */
const size_t field_offset[] = {0, 8, 16};
const size_t field_size[]   = {8, 8, 8};

typedef struct rec_s
{
    double t;
    unsigned long long u64;
    double data;
} rec_t;

int main(int argc, char **argv)
{
    hsize_t nfields, nrec;
    if (argc < 2) return 0;
    double delta = atof(argv[1]);   /* atoi() would truncate fractional seconds */

    struct timespec now;
    clock_gettime(CLOCK_REALTIME, &now);
    double start_time = now.tv_sec + now.tv_nsec * 1e-9;
    start_time -= delta;

    hid_t file_id = H5Fopen(LOG_FILE, H5F_ACC_RDONLY, H5P_DEFAULT);
    if (file_id < 0) return 1;

    if (H5TBget_table_info(file_id, TABLE, &nfields, &nrec) < 0)
    {
        H5Fclose(file_id);
        return 1;
    }

    rec_t *p_rec = new rec_t[nrec];
    herr_t status = H5TBread_table(file_id, TABLE, sizeof(rec_t),
                                   field_offset, field_size, p_rec);
    H5Fclose(file_id);

    if (status >= 0)
    {
        FILE *f = fopen("/tmp/catdata.xml", "w");
        if (f)
        {
            fprintf(f, "<records>\n");
            for (hsize_t i = 0; i < nrec; i++)
            {
                if (p_rec[i].t > start_time)
                {
                    fprintf(f, "<r t=\"%lf\" p=\"%llu\" v=\"%lf\"/>\n",
                            p_rec[i].t, p_rec[i].u64, p_rec[i].data);
                }
            }
            fprintf(f, "</records>\n");
            fclose(f);
        }
    }
    delete[] p_rec;
    return 0;
}
catdata.h5 (17.9 KB)
On Mon, Apr 20, 2015 at 4:17 AM, Elena Pourmal <epourmal@hdfgroup.org> wrote:
Hi Vishnu,
The information you provided is not enough to troubleshoot the problem.
Could you dump the file with h5dump? If so, could you please post the
code that shows how you are reading data?
Thank you!
Elena
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Elena Pourmal The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
On Apr 15, 2015, at 3:52 AM, vishnu suganth <vishnusuganth@gmail.com> wrote:
Hello,
I am new to HDF File Operations:
I get the following error while trying to read H5TB data from an .h5 file.
Any idea why this occurred ?
HDF5-DIAG: Error detected in HDF5 (1.8.14) thread 0:
#000: H5Dio.c line 173 in H5Dread(): can't read data
major: Dataset
minor: Read failed
#001: H5Dio.c line 550 in H5D__read(): can't read data
major: Dataset
minor: Read failed
#002: H5Dchunk.c line 1872 in H5D__chunk_read(): unable to read raw data chunk
major: Low-level I/O
minor: Read failed
#003: H5Dchunk.c line 2897 in H5D__chunk_lock(): unable to read raw data chunk
major: Low-level I/O
minor: Read failed
#004: H5Fio.c line 120 in H5F_block_read(): read through metadata accumulator failed
major: Low-level I/O
minor: Read failed
#005: H5Faccum.c line 263 in H5F__accum_read(): driver read request failed
major: Low-level I/O
minor: Read failed
#006: H5FDint.c line 204 in H5FD_read(): driver read request failed
major: Virtual File Layer
minor: Read failed
#007: H5FDsec2.c line 692 in H5FD_sec2_read(): addr overflow, addr = 37288, size=576, eoa=37288
major: Invalid arguments to routine
minor: Address overflowed
Thanks & Regards,
Vishnu Suganth
_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5