Hi,

I am having an issue when reading a dataset if I use `malloc` to allocate the memory. Apologies in advance: I am not very proficient in C and am just starting to learn HDF5. Unfortunately, I cannot use Python because of legacy software. Let me explain.

This works with a small dataset:
```c
double dset_data[num_timesteps][num_cells];

file_id     = H5Fopen(filename, H5F_ACC_RDWR, H5P_DEFAULT);
group_id    = H5Gopen2(file_id, discipline, H5P_DEFAULT);
dataset_id  = H5Dopen2(group_id, variable, H5P_DEFAULT);
status_read = H5Dread(dataset_id, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
                      H5P_DEFAULT, dset_data);
```

and I can read the values from `dset_data` without any problem.
However, my actual dataset is quite large (8760 x 25000), so `double dset_data[num_timesteps][num_cells]` fails (I assume the array is too large for the stack).
To avoid this issue I use `malloc` to allocate memory for the `dset_data` array, using the following code:
```c
/* One contiguous block holding all the elements */
int total_elements = num_timesteps * num_cells;
double *array = (double *)malloc(total_elements * sizeof(double));
for (int i = 0; i < total_elements; i++)
{
    array[i] = 0.0;
}

/* Row pointers into the block, so dset_data[i][j] can be used */
double **dset_data = (double **)malloc(num_timesteps * sizeof(double *));
for (int i = 0; i < num_timesteps; i++)
{
    dset_data[i] = &array[i * num_cells];
}
```
(The array is created correctly.)
In this situation `H5Dread` fails: I get `status_read == 0` (is there a way to see the actual error?), and when I try to read the values from `dset_data` I get a segmentation fault.
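On the side question about seeing the actual error: from the reference manual I believe the library keeps an error stack that can be printed with `H5Eprint2`, so I was planning to try something like the fragment below right after the failing call. I have not verified this, so apologies if I am misreading the docs:

```c
status_read = H5Dread(dataset_id, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
                      H5P_DEFAULT, dset_data);
if (status_read < 0) {
    /* Dump the HDF5 error stack for the current thread to stderr */
    H5Eprint2(H5E_DEFAULT, stderr);
}
```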
It might be a very silly mistake, but hopefully you can help to shed some light on what I am doing wrong.
Thanks