Issue #403 · created Aug 24, 2020 by Markus Scheidgen (@mscheidg) · 6 of 9 checklist items completed

Clean up the experimental data

For show-casing purposes, the experimental section of NOMAD needs to be "cleaned" up:

  • rename CMS/EMS -> computational/experimental
  • fix the uploader and co-author names
  • fix other metadata like locations and dates
  • better metadata and experiment names
  • more data (e.g. automated EELS indexing)
  • EELS preview?
  • maybe Markus Kühbach has real preview figs for his set
  • a disclaimer (in the search) about the "show-case" nature of the experimental section
  • more databases to "index"

show-case the indexing of external databases

We could use EELS to show that NOMAD can crawl web-based databases for indexing. Simple Python web-scraping techniques should suffice to create an upload consisting of the respective web pages, which "parsers" can then convert into the respective NOMAD metainfo data.
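A minimal sketch of such a crawl, assuming the spectra of the public EELS database are reachable from a listing page; the base URL and the CSS selector are placeholders for the real page layout, and the output is just one raw HTML file per entry for a parser to pick up later:

```python
# Sketch: crawl spectrum pages from a web-based EELS database and store each
# page as a raw file in an upload directory, so a NOMAD "parser" can later
# extract metainfo from it. BASE_URL and the selector are assumptions.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://eelsdb.eu/spectra/"  # assumed entry point
UPLOAD_DIR = "eels_upload"

os.makedirs(UPLOAD_DIR, exist_ok=True)

index_page = requests.get(BASE_URL, timeout=30)
index_page.raise_for_status()
soup = BeautifulSoup(index_page.text, "html.parser")

# Collect links to individual spectrum pages (selector is a guess).
links = [a["href"] for a in soup.select("a.spectrum-link") if a.get("href")]

for i, link in enumerate(links):
    page = requests.get(urljoin(BASE_URL, link), timeout=30)
    if page.status_code != 200:
        continue
    # One HTML file per entry; the parser turns these into metainfo data.
    path = os.path.join(UPLOAD_DIR, f"entry_{i}.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write(page.text)
```

The resulting directory could then be packaged as a regular NOMAD upload, keeping the crawling step completely separate from the parsing step.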

show-case the indexing of external repositories

We could use Zenodo and its API to improve the metadata in NOMAD/experimental by downloading titles, descriptions, authors, etc.
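A small sketch of what that lookup could look like against the public Zenodo REST API; the record id is an arbitrary example and the returned field names reflect Zenodo's JSON, not NOMAD's metainfo:

```python
# Sketch: fetch title, description, and authors for a Zenodo record via the
# public Zenodo REST API, to be mapped onto NOMAD/experimental metadata.
import requests


def fetch_zenodo_metadata(record_id: str) -> dict:
    resp = requests.get(f"https://zenodo.org/api/records/{record_id}", timeout=30)
    resp.raise_for_status()
    metadata = resp.json().get("metadata", {})
    return {
        "title": metadata.get("title"),
        "description": metadata.get("description"),
        "authors": [creator.get("name") for creator in metadata.get("creators", [])],
    }


print(fetch_zenodo_metadata("3407711"))  # example record id, purely illustrative
```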

authors

There is a difference between the person uploading the metadata, i.e. the person providing the reference to the data, and the authors of the data. The latter are usually given by an external database or repository. We need to reflect this in the NOMAD user data model and add support for non-NOMAD-user authors: #404 (closed)
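A rough sketch of how that distinction could look in the entry metadata; class and field names are illustrative only, not the actual NOMAD metainfo definitions (those belong to #404):

```python
# Sketch: separate the uploading NOMAD user from the (possibly external)
# authors of the data. Names are illustrative, not real NOMAD definitions.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class NomadUser:
    user_id: str  # account in the NOMAD user management
    name: str


@dataclass
class ExternalAuthor:
    name: str  # taken from the external database or repository
    affiliation: Optional[str] = None


@dataclass
class EntryMetadata:
    uploader: NomadUser  # who provided the reference to the data
    authors: List[ExternalAuthor] = field(default_factory=list)  # who created the data
```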

Edited Nov 10, 2020 by Markus Scheidgen