Lock mechanism for concurrent editing
A simple mechanism to start with could be to detect concurrent edits. The UI reads the data and creates a hash/checksum of that initial data. When the user has made changes and saves them, the UI posts the updated data together with the checksum of the original data. The backend rejects the update if the checksum of the current server data does not match the supplied checksum, because a mismatch means the data has been changed by someone else in the meantime.
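A minimal sketch of that check, assuming SHA-256 as the checksum and hypothetical `try_update`/`ConflictError` names; the real hashing scheme and error handling would live in the raw file API:

```python
import hashlib


class ConflictError(Exception):
    """Raised when the data changed on the server since the client read it."""


def checksum(data: bytes) -> str:
    # The exact hash function is an implementation detail; SHA-256 is assumed here.
    return hashlib.sha256(data).hexdigest()


def try_update(stored_data: bytes, original_checksum: str, new_data: bytes) -> bytes:
    """Accept the update only if the stored data still matches the checksum
    the client computed when it originally read the data."""
    if checksum(stored_data) != original_checksum:
        raise ConflictError('The data was changed by someone else.')
    return new_data


# Read: the client remembers the checksum of what it saw.
original = b'some raw file content'
seen = checksum(original)
# Save: the update goes through only if the server data is unchanged.
try_update(stored_data=original, original_checksum=seen, new_data=b'edited content')
```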
It is a simple and reasonably effective mechanism, at least on a human activity timescale. Of course, if updates really happen at the exact same time, we would need to verify the checksum and update the file within a semaphore of some kind, as sketched below.
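A sketch of that atomic check-and-write, using a process-local lock for illustration; with several API workers this would have to be something cross-process, e.g. a file lock or a lock in the database:

```python
import hashlib
import threading

# Process-local lock standing in for whatever cross-process semaphore is used.
_write_lock = threading.Lock()


class ConflictError(Exception):
    pass


def update_if_unchanged(path: str, original_checksum: str, new_data: bytes) -> None:
    # Verify and write under one lock, so two simultaneous saves cannot both
    # pass the checksum comparison before either of them has written.
    with _write_lock:
        with open(path, 'rb') as f:
            current = f.read()
        if hashlib.sha256(current).hexdigest() != original_checksum:
            raise ConflictError('The file was changed by someone else.')
        with open(path, 'wb') as f:
            f.write(new_data)
```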
API-wise, this could be implemented by adding an optional query parameter for the checksum to the "put raw file" API. The API should limit this to a certain file size, as computing hashes for large files can be expensive. The API would also need to return a suitable HTTP status code if the hashes do not match. When the UI reads an archive based on a raw file (like the ELN functionality), the archive already contains a hash over the raw file as part of its metadata (nomad/datamodel/datamodel.py::EntryMetadata.entry_hash).
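A sketch of what that could look like, not the actual NOMAD endpoint: the route, the `if_match_hash` parameter name, the size limit, and the in-memory store are all assumptions for illustration.

```python
import hashlib
from typing import Optional

from fastapi import FastAPI, HTTPException, Request

app = FastAPI()

# Hypothetical in-memory store standing in for the upload's raw files.
_files: dict[str, bytes] = {}

# Assumed size limit above which the checksum comparison is skipped, since
# hashing very large files on every save can be expensive.
MAX_CHECKED_SIZE = 16 * 1024 * 1024


@app.put('/uploads/{upload_id}/raw/{path:path}')
async def put_raw_file(upload_id: str, path: str, request: Request,
                       if_match_hash: Optional[str] = None):
    key = f'{upload_id}/{path}'
    new_data = await request.body()
    stored_data = _files.get(key, b'')

    if if_match_hash is not None and len(stored_data) <= MAX_CHECKED_SIZE:
        if hashlib.sha256(stored_data).hexdigest() != if_match_hash:
            # 409 Conflict tells the client the file changed since it was read.
            raise HTTPException(
                status_code=409, detail='The file was modified by another user.')

    _files[key] = new_data
    return {'path': path, 'size': len(new_data)}
```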
The UI needs to inform the user if an ELN save was rejected. A simple message with the option to reload the archive and overwrite the current changes should be enough in the beginning.
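The real UI is JavaScript; the following Python sketch only illustrates the client-side flow, reusing the hypothetical endpoint and `if_match_hash` parameter from the API sketch above:

```python
import hashlib
import requests

BASE = 'https://example.org/api/v1'  # placeholder URL


def save_with_conflict_check(upload_id: str, path: str,
                             original: bytes, edited: bytes) -> None:
    response = requests.put(
        f'{BASE}/uploads/{upload_id}/raw/{path}',
        params={'if_match_hash': hashlib.sha256(original).hexdigest()},
        data=edited)
    if response.status_code == 409:
        # Someone else saved in the meantime: inform the user and let them
        # choose between reloading the archive or overwriting the other changes.
        print('The file was changed by someone else.')
        print('Reload the archive, or save again without the checksum to overwrite.')
    else:
        response.raise_for_status()
```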