Split the particle file if it is bigger than 500 GB
It's important to keep files between 1 GB and 500 GB for tape backup/restore purposes.
At the moment we use checkpoint files where the number of checkpoints per file is decided beforehand. This works reasonably well: since we know how big each checkpoint is, we can compute exactly how big the checkpoint file will be.
For the particle sample file this isn't implemented yet, and I would like a way to do it such that the data can afterwards be read transparently. One option is to create a "simname_particles.h5" master file that contains only links to external files, while the actual data is written to those external files with the same checkpoint-like approach.
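The master-file idea above can be sketched with h5py's `ExternalLink` objects, which HDF5 resolves transparently on read. This is only an illustration under assumed names: "simname", the part-file naming scheme, and the `checkpoint_0000/pos` layout are all placeholders, not the actual file schema.

```python
import os
import tempfile

import h5py  # assumed available; the files in question are HDF5

tmp = tempfile.mkdtemp()
cwd = os.getcwd()
os.chdir(tmp)  # keep the demo files together so relative links resolve
try:
    # External file written with the existing checkpoint-style layout
    # (group and dataset names here are hypothetical).
    with h5py.File("simname_particles_0000.h5", "w") as part:
        part.create_dataset("checkpoint_0000/pos", data=[1.0, 2.0])

    # The master file contains only links, no data.
    with h5py.File("simname_particles.h5", "w") as master:
        master["checkpoint_0000"] = h5py.ExternalLink(
            "simname_particles_0000.h5", "/checkpoint_0000")

    # Readers open the master file and see the linked data transparently.
    with h5py.File("simname_particles.h5", "r") as master:
        pos = list(master["checkpoint_0000/pos"][:])
finally:
    os.chdir(cwd)
```

A nice property of this layout is that old single-file data sets could be supported by a master file containing a single link, so reader code would not need two code paths.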
Can the decision to split the particle file be automated in a clean way, i.e. so that the data remains easily accessible afterwards?
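One way the split decision could be automated is a writer that tracks the bytes written to the current part file and rolls over to a new part once the next record would push it past the size limit, recording every part in an index (which would become the external links in the master file). A minimal stdlib-only sketch, with a tiny hypothetical limit instead of the real 500 GB and `SplitWriter` as an invented name:

```python
import os
import tempfile

MAX_PART_BYTES = 100  # illustration only; the real limit would be ~500 GB


class SplitWriter:
    """Appends records to numbered part files, starting a new part when
    the current one would exceed MAX_PART_BYTES, and keeps an index of
    (part filename, record count) so a master file can link to the parts."""

    def __init__(self, basename):
        self.basename = basename
        self.bytes_in_part = MAX_PART_BYTES  # forces a new part on first write
        self.index = []  # list of [part filename, records in that part]

    def write(self, record: bytes):
        if self.bytes_in_part + len(record) > MAX_PART_BYTES:
            part_name = f"{self.basename}_{len(self.index):04d}.bin"
            self.index.append([part_name, 0])
            self.bytes_in_part = 0
        with open(self.index[-1][0], "ab") as f:
            f.write(record)
        self.bytes_in_part += len(record)
        self.index[-1][1] += 1


# Demo: three 40-byte records against a 100-byte limit -> two part files.
with tempfile.TemporaryDirectory() as tmp:
    writer = SplitWriter(os.path.join(tmp, "simname_particles"))
    for _ in range(3):
        writer.write(b"x" * 40)
    part_counts = [count for _, count in writer.index]
```

Because the record size is known up front (as with checkpoints), the split point is deterministic, which should make the scheme easy to reason about and to reproduce when restoring from tape.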
How general does the splitting mechanism need to be?
Implement the file-splitting mechanism.
Ensure compatibility with old data sets.