Allocating chunks in a specific order?


Is there a way to allocate (not necessarily write) unfiltered chunks in a specific order?

For example, I would be interested in writing chunks along the path of a Morton Z-curve.

The best idea I have so far is to use H5Dwrite_chunk and write the chunks in that order.
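For what it's worth, the ordering itself is easy to compute outside of HDF5. Here is a minimal pure-Python sketch (the `morton3` helper is my own, not part of any HDF5 API) that sorts chunk coordinates into Z-order; the resulting sequence could then drive the `H5Dwrite_chunk` calls (or h5py's `write_direct_chunk` wrapper):

```python
def morton3(x: int, y: int, z: int) -> int:
    """Interleave the bits of three chunk indices into a Morton (Z-order) code."""
    code = 0
    for i in range(21):  # 21 bits per axis fits in a 63-bit code
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

# Order the chunks of a 2x2x2 chunk grid along the Z-curve.
grid = [(x, y, z) for z in range(2) for y in range(2) for x in range(2)]
z_order = sorted(grid, key=lambda c: morton3(*c))
print(z_order)
```

Writing the chunks in `z_order` with `H5Dwrite_chunk` should allocate them in that order in the file, at least with the default early-allocation behavior of that call; I haven't verified what the chunk index does to the on-disk placement.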


Just to mimic this for experimental purposes, you could use external layout (H5Pset_external) and order the external references accordingly in one or more external files. Such a dataset wouldn't be extendible (external storage implies contiguous layout), but it would be sufficient for testing. Then what? What's the use case?



There are several sharded formats that we work with:

What I am exploring is whether an HDF5 chunked file could be used as a Zarr shard, and what limitations exist. For volumetric data, it is often useful to read nearby chunks via a contiguous read operation or range query. If the chunks could be arranged according to some kind of space-filling curve, then an HDF5 file could be used in certain optimized read access patterns.
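To illustrate the read side: if the chunks really were laid out on disk in Morton order and are unfiltered (so all the same size), a set of nearby chunks collapses into a few byte ranges. A hedged sketch, assuming each chunk lives at `base + morton_index * chunk_bytes` (the helper is hypothetical, not an HDF5 or Zarr API):

```python
def coalesce_ranges(morton_indices, chunk_bytes, base=0):
    """Merge consecutive Morton indices into (offset, length) byte ranges,
    each suitable for a single contiguous read or HTTP range request."""
    ranges = []
    for m in sorted(morton_indices):
        start = base + m * chunk_bytes
        if ranges and ranges[-1][0] + ranges[-1][1] == start:
            # Extend the previous range instead of issuing a separate read.
            ranges[-1] = (ranges[-1][0], ranges[-1][1] + chunk_bytes)
        else:
            ranges.append((start, chunk_bytes))
    return ranges

# Chunks 0, 1, 2 are adjacent on the Z-curve -> one 12 KiB read; chunk 5 stands alone.
print(coalesce_ranges({0, 1, 2, 5}, chunk_bytes=4096))
```

In a real file the per-chunk offsets would come from the chunk index (e.g. `H5Dget_chunk_info`) rather than from arithmetic, but the coalescing logic is the same.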


Here’s the related issue I created: