That resolved my issue. I was not inserting the member into both the memory type and the file type (mtype.insertMember and ftype.insertMember); until now I was only inserting it into either mtype or ftype (per my examples below). Come to think of it, it now makes sense why this works!
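For the archive, here is a minimal sketch of the working pattern (MemType/FileType and NATIVE_INT stand in for my actual class and element type; solution, dataspace, ds_creatplist and MemTypeVar are the same handles and buffer as in my examples below):

struct MemType  { int a; int b; };    // layout of the array in memory
struct FileType { int b; };           // layout I want in the file

H5::CompType mtype(sizeof(MemType));  // memory type: full struct stride
H5::CompType ftype(sizeof(FileType)); // file type: only the subset

// register the member in BOTH types, under the same name, each with its own offset
mtype.insertMember("b", HOFFSET(MemType, b), H5::PredType::NATIVE_INT);
ftype.insertMember("b", HOFFSET(FileType, b), H5::PredType::NATIVE_INT);

// create the dataset with the file type, write with the memory type
H5::DataSet dataset = solution.createDataSet("Well", ftype, dataspace, ds_creatplist);
dataset.write(MemTypeVar, mtype);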
···
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Werner Benger
Sent: Tuesday, July 26, 2016 2:30 AM
To: hdf-forum@lists.hdfgroup.org
Subject: Re: [Hdf-forum] [Ext] Re: Writing subset of compound datatype
Hi Aman,
it seems to me the problem is that you still use the same type for the dataset creation and the dataset writing. Also, for the file type the offset of member b should be zero, since the file structure consists only of that element. So I would think it should be like this (theoretically, as I'm not familiar with the C++ API and use the C API myself):
struct MemType { int a; int b; };
struct FileType { int b; };
mtype.insertMember("b", HOFFSET(MemType, b), datatype);
ftype.insertMember("b", HOFFSET(FileType, b), datatype);
H5::DataSet dataset = solution.createDataSet( "Well", ftype, dataspace, ds_creatplist );
dataset.write( MemTypeVar, mtype );
Note that HOFFSET(FileType, b) would be just zero. But in any case the main point is that for creating a dataset you use the file type (which specifies what you want in the file), and for writing you use the memory type (which specifies what you have in memory). Both need to be valid types, of course.
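In C++ terms a quick check would look roughly like this (untested on my side, since I only use the C API; dataset, mtype and MemTypeVar are the names from your snippets):

// the datatype stored in the file should now be the small compound
H5::CompType onDiskType = dataset.getCompType();
size_t onDiskSize = onDiskType.getSize();     // expect sizeof(FileType), not sizeof(MemType)
int    nMembers   = onDiskType.getNmembers(); // expect 1, the member named "b"

// reading is symmetric: pass the memory type that says where "b" lives inside MemType,
// and HDF5 fills the "b" member of each element from the file
dataset.read(MemTypeVar, mtype);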
Werner
On 25.07.2016 23:18, Aman Verma wrote:
Werner,
I tried your suggestion (and various permutations along the same lines).
For the example that you mention below, when I do:
H5::CompType mtype(sizeof(MemType));
mtype.insertMember("b", HOFFSET(MemType, b), datatype);
H5::DataSet dataset = solution.createDataSet( "Well", mtype, dataspace, ds_creatplist );
dataset.write( MemTypeVar, mtype );
This is the only variation that runs without errors. But it gives me the same output as before (large file size, sizeof(MemType)).
When I do what you suggested:
H5::CompType mtype(sizeof(MemType));
H5::CompType ftype(sizeof(FileType)); // addition to the above code
mtype.insertMember("b", HOFFSET(MemType, b), datatype);
H5::DataSet dataset = solution.createDataSet( "Well", ftype, dataspace, ds_creatplist ); // changed from the above code
dataset.write( MemTypeVar, ftype ); // changed from the above code
I get a runtime error 'unable to create a dataset' (obviously, since I am not inserting any member into ftype).
Now when I insert into ftype:
H5::CompType mtype(sizeof(MemType));
H5::CompType ftype(sizeof(FileType));
ftype.insertMember("b", HOFFSET(FileType, b), datatype); // changed from the above code
H5::DataSet dataset = solution.createDataSet( "Well", ftype, dataspace, ds_creatplist );
dataset.write( MemTypeVar, ftype );
I get a smaller file (as expected, sizeof(FileType)). However, the data being written is then controlled only by the offset (here, 0), and hence it writes the raw data from MemType starting at offset 0 (both a and b).
Hope this explains my problem. What statement(s) am I missing here to (a) provide the stride in MemType or to (b) write out the 'b' component specifically? Could you give a small example of how to achieve what you laid out in the last email?
Thanks,
Aman
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Werner Benger
Sent: Friday, July 22, 2016 2:47 PM
To: hdf-forum@lists.hdfgroup.org
Subject: Re: [Hdf-forum] [Ext] Re: Writing subset of compound datatype
Hi Aman,
the file type would need to be a compound type as well, with a member whose name matches the member in the memory type. Then HDF5 can convert between them. An element type of float would not work, but something like
struct MemType { int a, b; };
struct FileType { int b; };
Then if you have an in-memory array of type MemType and dataset of type FileType, it should just write the "b" component of the in-memory array to the file.
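In the C++ API that would be roughly the following (a theoretical sketch on my side; NATIVE_INT stands in for the actual element type of b, memArray is a hypothetical in-memory array of MemType, and the other handles are the ones from your code):

struct MemType  { int a, b; };  // what you have in memory
struct FileType { int b; };     // what you want in the file

// memory type: member "b" at its offset within the full struct
H5::CompType mtype(sizeof(MemType));
mtype.insertMember("b", HOFFSET(MemType, b), H5::PredType::NATIVE_INT);

// file type: a compound as well, with a member of the same name, so HDF5
// can match the two sides by name (a bare element type would not convert)
H5::CompType ftype(sizeof(FileType));
ftype.insertMember("b", HOFFSET(FileType, b), H5::PredType::NATIVE_INT);

// create the dataset with ftype, write with mtype; only "b" goes to the file
H5::DataSet dataset = solution.createDataSet("Well", ftype, dataspace, ds_creatplist);
dataset.write(memArray, mtype);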
Werner
On 22.07.2016 21:29, Aman Verma wrote:
Werner,
Thanks for responding. I understand what you're saying and had tried it before. During dataset creation, I replaced mtype (Class' type) with mElementType (let's say float). This leads to runtime errors.
H5::CompType mtype(sizeof(Class));
mtype.insertMember("MD", HOFFSET(Class, dMD), datatype);
H5::DataSet dataset = solution.createDataSet( "/Well", mElementType, dataspace, ds_creatplist );
dataset.write( ObjectArray, mtype );
Further on, during the dataset write, I now substitute:
dataset.write( ObjectArray, mElementType );
This too leads to some complaining; the file is written out, but with junk/wrong values. I am not surprised at this behavior given what I'm doing, but I wonder if there is a way to release the unused space.
Again, for simplicity, let's say I only want to write out the element b from an array of the struct
typedef struct s1_t {
    int    a;
    float  b;
    double c;
} s1_t;

s1_t s1[LENGTH];
How can I achieve this without doing:
s1_tid = H5Tcreate (H5T_COMPOUND, sizeof(s1_t));
which leads to the file size being on the order of sizeof(s1_t) instead of sizeof(float)?
Appreciate your time.
Aman
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Werner Benger
Sent: Friday, July 22, 2016 9:58 AM
To: hdf-forum@lists.hdfgroup.org
Subject: Re: [Hdf-forum] [Ext] Re: Writing subset of compound datatype
Hi Aman,
no, you don't copy anything. You reference your full data structure and describe it with its complete memory layout, but in the file you use a different datatype. Currently you are using "mtype" both when creating the dataset and when writing the data, so in the file you get the same as in memory. But you don't want that; you want to create a dataset that contains only a part of your data structure, so at dataset creation time you would use a different type, not "mtype".
Werner
On 22.07.2016 16:07, Aman Verma wrote:
Hi Werner,
I guess what you're implying is that I copy the components being written to file (Element belonging to ObjectArray[i]) into a different, more appropriate structure such as an array ElementTemp[i]. However, I want to avoid this copying at every time step. There also doesn't seem to be a way to make a reference to this data; otherwise I could just use that for writing. What am I missing here?
Aman
From: Hdf-forum [mailto:hdf-forum-bounces@lists.hdfgroup.org] On Behalf Of Werner Benger
Sent: Friday, July 22, 2016 7:58 AM
To: hdf-forum@lists.hdfgroup.org
Subject: [Ext] Re: [Hdf-forum] Writing subset of compound datatype
Hi,
you probably want a different datatype in the file than in memory, i.e. in memory you want a datatype that describes the entire class, but for the file a type that refers only to the components being written.
Werner
On 22.07.2016 00:22, Aman Verma wrote:
I want to write the elements ObjectArray[i].Element (0 <= i < n) in a single call, without having to copy them into a separate array.
Using CompType and insertMember, I am writing just a subset (including Element) of an array of a compound datatype, ObjectArray (a class/object in my case).
H5::CompType mtype(sizeof(Class));
mtype.insertMember("MD", HOFFSET(Class, dMD), datatype);
H5::DataSet dataset = solution.createDataSet( "/Well", mtype, dataspace, ds_creatplist );
dataset.write( ObjectArray, mtype );
This works fine. However, and not unexpectedly, since I define the CompType with the size of the whole class (Class) while writing out only a subset (Element), my .h5 file is bigger (on the order of sizeof(Class)) than it should be (sizeof(Element)). I am already using compression. Is there a way to 'delete' this unused space? Or is there a better way to do what I'm trying to achieve? Is using a hyperslab the right answer?
Thanks
_______________________________________________
Hdf-forum is for HDF software users discussion.
Hdf-forum@lists.hdfgroup.org
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
--
___________________________________________________________________________
Dr. Werner Benger                                    Visualization Research
Center for Computation & Technology at Louisiana State University (CCT/LSU)
2019 Digital Media Center, Baton Rouge, Louisiana 70803
Tel.: +1 225 578 4809                                Fax.: +1 225 578 5362