The first version of EncyclopediaNormalizer is now ready to be merged into a production branch. This version still lacks tools for handling phonon/elastic/thermal properties, which will be added in a later phase.
The EncyclopediaNormalizer has been extensively tested locally and on a separate deployment machine. In particular:
- The execution time of the normalizer is comparable to that of the existing normalizers (~100 ms).
- The normalizer has been tested on a separate deployment with ~8000 entries: no unexpected exceptions are raised. In cases where the encyclopedia data cannot be created, the normalizer simply does not create the encyclopedia section and instead logs the reason.
- This addition does not affect other parts of the infrastructure: the changes are almost completely isolated to EncyclopediaNormalizer.
- The normalizer comes with basic regression tests that check for metainfo correctness, the correct detection of identical materials, and the creation of the hashes used for grouping calculations.
- The produced data has also been checked through the encyclopedia GUI, both locally and on the staging machine.
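To illustrate the skip-and-log behaviour and the hash-based grouping mentioned above, here is a minimal sketch in Python. All names (`normalize`, `material_hash`, the entry keys) are hypothetical placeholders, not the actual EncyclopediaNormalizer API; only the overall pattern is taken from this description:

```python
import hashlib
import json
import logging

logger = logging.getLogger(__name__)


def material_hash(formula, space_group, wyckoff_sets):
    # Hypothetical grouping hash: identical structural descriptors
    # always produce the same digest, so calculations of the same
    # material can be grouped together.
    key = json.dumps([formula, space_group, sorted(wyckoff_sets)])
    return hashlib.sha512(key.encode("utf-8")).hexdigest()


def normalize(entry):
    # Sketch of the skip-and-log behaviour: if the encyclopedia data
    # cannot be created, no encyclopedia section is written and the
    # reason is logged instead of raising an exception.
    if entry.get("system_type") not in ("bulk", "2D", "1D"):
        logger.info(
            "encyclopedia section not created: unsupported system type"
        )
        return None
    return {
        "material_hash": material_hash(
            entry["formula"], entry["space_group"], entry["wyckoff_sets"]
        )
    }
```

The key design point is that normalization failures are expected and non-fatal: unsupported entries are reported through the log rather than through exceptions, so one problematic entry cannot abort a large processing run.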
Although the normalizer is not yet fully finished, I think merging it into a production branch makes sense. This way we can incrementally test its behaviour on production data before moving on to the secondary normalizers for phonon/elastic properties and before building the API for using it. I expect the metainfo created for the Encyclopedia to be quite unstable for a while, so it will still undergo a fair amount of restructuring.