Problem extending a dataset


I have code that is meant to extend a dataset. The dataset is a 2-d matrix
of short ints. It has a fixed number of columns, and I am appending rows. I
create the dataset like this:

    // Create the "data" data set
    hsize_t ddim[2] = {0, row_size}, dext[2] = {H5S_UNLIMITED, row_size},
            chunk[2] = {1028 * row_size, row_size};
    dspc = H5Screate_simple(2, ddim, dext);
    dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);
    H5Pset_fill_time(dcpl, H5D_FILL_TIME_ALLOC);
    dset = H5Dcreate2(hdf, "data", H5T_STD_I16LE, dspc, H5P_DEFAULT, dcpl,
                      H5P_DEFAULT);

where row_size is the constant number of columns. I then extend it like this:

    // Extend the "data" data set
    dset = H5Dopen2(hdf, "data", H5P_DEFAULT);
    dspc = H5Dget_space(dset);
    hsize_t ddim[2];
    int status = H5Sget_simple_extent_dims(dspc, ddim, NULL);
    ddim[0] += keys.size();
    status = H5Dset_extent(dset, ddim);

    // Write the new rows
    dspc = H5Dget_space(dset);
    hsize_t dbeg[2] = {ddim[0] - keys.size(), 0},
            dcnt[2] = {keys.size(), row_size};
    status = H5Sselect_hyperslab(dspc, H5S_SELECT_SET, dbeg, NULL, dcnt, NULL);
    hsize_t select_size = H5Sget_select_npoints(dspc);
    H5Dwrite(dset, H5T_STD_I16LE, H5S_ALL, dspc, H5P_DEFAULT, &rows[0]);

Here keys is a std::vector whose size is the number of new rows, and rows is
a std::vector holding the data to be added, whose size is
keys.size() * row_size. When I run the code, the first pass writes its batch
of data, but the second pass seg-faults. I can inspect the variables in gdb
and everything looks as it should (I think). For example, here are the
values at the H5Dwrite call during the first write:

    (gdb) p ddim
    $4 = {2786272, 68}
    (gdb) p keys.size()
    $7 = 2786272
    (gdb) p row_size
    $8 = 68
    (gdb) p keys.size() * row_size
    $5 = 189466496
    (gdb) p rows.size()
    $6 = 189466496

and here they are at the H5Dwrite call during the second pass:

    (gdb) p ddim
    $15 = {4722675, 68}
    (gdb) p keys.size()
    $12 = 1936403
    (gdb) p row_size
    $11 = 68
    (gdb) p keys.size() * row_size
    $13 = 131675404
    (gdb) p rows.size()
    $14 = 131675404

I can see that the selection size is right:

    (gdb) p select_size
    $10 = 131675404

but I still get a segmentation fault when H5Dwrite is called. Here is a
backtrace:

    Program received signal SIGSEGV, Segmentation fault.
    0x00007ffff5b28e8f in memcpy () from /lib/
    (gdb) bt
    #0 0x00007ffff5b28e8f in memcpy () from /lib/
    #1 0x00007ffff6cff38d in H5V_memcpyvv () from /usr/lib/
    #2 0x00007ffff6a6bb1f in ?? () from /usr/lib/
    #3 0x00007ffff6a858ca in ?? () from /usr/lib/
    #4 0x00007ffff6a85c8e in H5D_select_write () from /usr/lib/
    #5 0x00007ffff6a62944 in ?? () from /usr/lib/
    #6 0x00007ffff6a7f60e in ?? () from /usr/lib/
    #7 0x00007ffff6a7e3aa in H5Dwrite () from /usr/lib/

I am using HDF5 1.8.4, and:

    $ uname -a
    Linux pharos 2.6.35-31-generic #62-Ubuntu SMP Tue Nov 8 14:20:11 UTC 2011
    x86_64 GNU/Linux

I must be misunderstanding something. I admit the User Guide entry for
H5Sselect_hyperslab had me scratching my head about the distinction between
a memory dataspace and a file dataspace, but I think I am imitating the
examples. Any help getting to the bottom of this would be greatly
appreciated.