Compare revisions — target project: nomad-lab/analytics
Changes are shown as if the source revision were being merged into the target revision.
Commits on Source (435)
Showing with 463 additions and 19180 deletions
@@ -3,4 +3,4 @@
**/.gitmodules
**/.dockerignore
.gitlab-ci
\ No newline at end of file
.gitlab-ci.yml
\ No newline at end of file
.vscode/
.idea/
.ipython/
.keras/
.local/
.DS_Store
# https://github.com/github/gitignore/blob/main/Python.gitignore
# Byte-compiled / optimized / DLL files
__pycache__/
@@ -29,6 +23,7 @@ parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
@@ -47,14 +42,17 @@ pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
@@ -64,6 +62,7 @@ coverage.xml
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
@@ -76,16 +75,49 @@ instance/
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
.python-version
# celery beat schedule file
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
@@ -111,3 +143,252 @@ venv.bak/
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/
# https://github.com/github/gitignore/blob/main/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Snowpack dependency directory (https://snowpack.dev/)
web_modules/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional stylelint cache
.stylelintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local
# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache
# Next.js build output
.next
out
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# vuepress v2.x temp and cache directory
.temp
.cache
# Docusaurus cache and generated files
.docusaurus
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# Stores VSCode versions used for testing VSCode extensions
.vscode-test
# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*
# https://github.com/github/gitignore/blob/main/Global/VisualStudioCode.gitignore/
.vscode/*
!.vscode/settings.json
# !.vscode/tasks.json
# !.vscode/launch.json
# !.vscode/extensions.json
!.vscode/*.code-snippets
# Local History for Visual Studio Code
.history/
# Built Visual Studio Code Extensions
*.vsix
# https://github.com/github/gitignore/blob/main/Global/macOS.gitignore
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
# https://github.com/github/gitignore/blob/main/Global/Windows.gitignore
# Windows thumbnail cache files
Thumbs.db
Thumbs.db:encryptable
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
[Dd]esktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msix
*.msm
*.msp
# Windows shortcuts
*.lnk
# https://github.com/github/gitignore/blob/main/Global/Linux.gitignore
*~
# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*
# KDE directory preferences
.directory
# Linux trash folder which might appear on any partition or disk
.Trash-*
# .nfs files are created when an open file is removed but is still being accessed
.nfs*
# Analytics
deployments
.keras/
# default installed image for docker executor is: python:3.6
# using an image that can do git, docker, docker-compose
# https://docs.gitlab.com/ee/ci/docker/using_docker_build.html
image: docker:dind
image: gitlab-registry.mpcdf.mpg.de/nomad-lab/nomad-fair/ci-runner
variables:
GIT_SUBMODULE_STRATEGY: recursive
# variables:
# IMAGE_NAME: ${CI_REGISTRY_IMAGE}:${CI_COMMIT_REF_SLUG}
stages:
- build
- deploy
- release
build to staging:
build to develop:
stage: build
variables:
GIT_SUBMODULE_STRATEGY: recursive
GIT_SUBMODULE_UPDATE_FLAGS: --jobs 4
before_script:
- echo "Building the single user notebook image"
- echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER $CI_REGISTRY --password-stdin
- docker info
script:
# Using cache to speed up the build process --cache-from ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}
- docker pull ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME} || true
- docker build --tag ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}${CI_COMMIT_SHORT_SHA} --tag ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME} .
- docker push ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}${CI_COMMIT_SHORT_SHA}
- docker push ${CI_REGISTRY_IMAGE}:${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}
- docker build -t ${CI_REGISTRY_IMAGE}:${CI_COMMIT_REF_SLUG} .
- docker push ${CI_REGISTRY_IMAGE}:${CI_COMMIT_REF_SLUG}
rules:
# Execute jobs when a new commit is pushed to master branch
- if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
- if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "develop"
build to production:
stage: build
before_script:
- echo "Building the single user notebook image"
- echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER $CI_REGISTRY --password-stdin
- docker info
script:
# Using cache to speed up the build process
- docker pull ${CI_REGISTRY_IMAGE}:latest || true
- docker build --cache-from ${CI_REGISTRY_IMAGE}:latest --tag ${CI_REGISTRY_IMAGE}:production${CI_COMMIT_SHORT_SHA} --tag ${CI_REGISTRY_IMAGE}:latest .
- docker push ${CI_REGISTRY_IMAGE}:production${CI_COMMIT_SHORT_SHA}
- docker push ${CI_REGISTRY_IMAGE}:latest
rules:
# Execute jobs when a new commit is pushed to master branch
- if: $CI_COMMIT_BRANCH == 'master'
# build to staging:
# stage: build
# variables:
# GIT_SUBMODULE_STRATEGY: recursive
# GIT_SUBMODULE_UPDATE_FLAGS: --jobs 4
# before_script:
# - echo "Building the single user notebook image"
# - echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER $CI_REGISTRY --password-stdin
# script:
# - docker build -t ${STAGING_IMAGE} .
# - docker push ${STAGING_IMAGE}
# rules:
# # Execute jobs when a new commit is pushed to develop branch
# - if: $CI_COMMIT_BRANCH == "develop"
# build to production:
# stage: build
# variables:
# GIT_SUBMODULE_STRATEGY: recursive
# GIT_SUBMODULE_UPDATE_FLAGS: --jobs 4
# before_script:
# - echo "Building the single user notebook image"
# - echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER $CI_REGISTRY --password-stdin
# script:
# - docker build -t ${STAGING_IMAGE} .
# - docker push ${STAGING_IMAGE}
# rules:
# # Execute jobs when a new commit is pushed to master branch
# - if: $CI_COMMIT_BRANCH == 'master'
deploy to staging:
image: python:3.6
deploy to develop:
stage: deploy
environment:
name: dev/$CI_COMMIT_REF_NAME
deployment_tier: development
url: https://analytics-toolkit.nomad-coe.eu/dev/${CI_ENVIRONMENT_SLUG}
auto_stop_in: 7 days
on_stop: stop deploy dev
variables:
GIT_SUBMODULE_STRATEGY: none
NAMESPACE: analytics-develop
before_script:
- mkdir ~/.kube/
- echo ${CI_KUBE_CONFIG} | base64 -d > ~/.kube/config
- helm repo add jupyterhub https://jupyterhub.github.io/helm-chart
- helm repo update
- helm version
script:
- ./.gitlab-ci/update_tag_staging.sh
environment:
name: staging
url: https://nomad-lab.eu/dev/analytics/staging
- helm upgrade ${CI_ENVIRONMENT_SLUG} jupyterhub/jupyterhub
--install
--namespace ${NAMESPACE}
--version=1.2.0
--timeout=40m0s
--cleanup-on-fail
--values deployments/dev-values.yaml
--set hub.baseUrl=/dev/${CI_ENVIRONMENT_SLUG}
--set fullnameOverride=${CI_ENVIRONMENT_SLUG}
--set singleuser.podNameTemplate="${CI_ENVIRONMENT_SLUG}-{username}"
--set hub.config.GenericOAuthenticator.oauth_callback_url=https://analytics-toolkit.nomad-coe.eu/dev/${CI_ENVIRONMENT_SLUG}/hub/oauth_callback
--set singleuser.image.name=${CI_REGISTRY_IMAGE}
--set singleuser.image.tag=${CI_COMMIT_REF_SLUG}
--set roll=true
--wait
rules:
# Execute jobs when a new commit is pushed to master branch
- if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
- if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "develop"
deploy to production:
image: python:3.6
stop deploy dev:
stage: deploy
script:
- ./.gitlab-ci/update_tag_production.sh
environment:
name: production
url: https://nomad-lab.eu/prod/analytics/hub
rules:
# Execute jobs when a new commit is pushed to master branch
- if: $CI_COMMIT_BRANCH == 'master'
name: dev/$CI_COMMIT_REF_NAME
action: stop
before_script:
- mkdir ~/.kube/
- echo ${CI_K8S_CONFIG} | base64 -d > ~/.kube/config
script:
- helm uninstall ${CI_ENVIRONMENT_SLUG} --namespace analytics
when: manual
needs: ["build to develop"]
# deploy to staging:
# stage: deploy
# environment:
# name: staging
# url: https://nomad-lab.eu/dev/analytics/staging
# script:
# - ./.gitlab-ci/update_tag_staging.sh
# rules:
# # Execute jobs when a new commit is pushed to master branch
# - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "develop"
# deploy to production:
# stage: deploy
# environment:
# name: production
# url: https://nomad-lab.eu/prod/analytics/hub
# script:
# - ./.gitlab-ci/update_tag_production.sh
# rules:
# # Execute jobs when a new commit is pushed to master branch
# - if: $CI_COMMIT_BRANCH == 'develop'
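The `before_script` blocks in the deploy jobs above materialize a kubeconfig by base64-decoding a CI variable (`CI_KUBE_CONFIG` / `CI_K8S_CONFIG`). A minimal round-trip sketch of that pattern, using a dummy config and a throwaway path (both invented for illustration, not the real cluster credentials):

```shell
# Encode a dummy kubeconfig the way it would be stored in a masked CI variable,
# then decode it back to a file, mirroring the before_script steps above.
KUBE_CONFIG_B64=$(printf 'apiVersion: v1\nkind: Config\n' | base64 | tr -d '\n')
mkdir -p /tmp/demo-kube
echo "$KUBE_CONFIG_B64" | base64 -d > /tmp/demo-kube/config
cat /tmp/demo-kube/config
```

Storing the variable base64-encoded keeps multi-line YAML intact in the CI settings UI, where raw newlines are easy to mangle.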
#!/bin/bash
# Based on: https://docs.gitlab.com/ee/ci/ssh_keys/README.html
# Install ssh-agent if not already installed, it is required by Docker.
# (change apt-get to yum if you use an RPM-based image)
# - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
# Run ssh-agent (inside the build environment)
eval $(ssh-agent -s)
# Add the SSH key stored in SSH_PRIVATE_KEY variable to the agent store
# We're using tr to fix line endings which makes ed25519 keys work
# without extra base64 encoding.
# https://gitlab.com/gitlab-examples/ssh-private-key/issues/1#note_48526556
echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
# Create the SSH directory and give it the right permissions
mkdir -p ~/.ssh
chmod 700 ~/.ssh
# Use ssh-keyscan to scan the keys of your private server.
ssh-keyscan gitlab.mpcdf.mpg.de >> ~/.ssh/known_hosts
chmod 644 ~/.ssh/known_hosts
# Set the user name and email.
git config --global user.name $GITLAB_USER_NAME
git config --global user.email $GITLAB_USER_EMAIL
# Clone the private repository
git clone git@gitlab.mpcdf.mpg.de:nomad-lab/analytics-deployment.git /tmp/analytics-deployment
cd /tmp/analytics-deployment
# Update the tag of the docker image
sed -i "s/^ tag\:.*/ tag\: production$CI_COMMIT_SHORT_SHA/g" deployments/hub/config.yaml
sed -i "s/^ tag\:.*/ tag\: production$CI_COMMIT_SHORT_SHA/g" deployments/hub/public.yaml
# Finally, commit and push the changes
git add deployments/hub/config.yaml
git add deployments/hub/public.yaml
git commit -m "CI: Update the hub image for production ($CI_PIPELINE_URL)"
git push
rm -rf /tmp/analytics-deployment
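The `sed` calls above rewrite the indented `tag:` line in the deployment values files. A self-contained sketch of the same substitution against a throwaway file (the file contents and path here are invented for illustration; the real files are `deployments/hub/config.yaml` and `public.yaml`):

```shell
# Create a sample values file with a stale image tag.
mkdir -p /tmp/tag-demo
cat > /tmp/tag-demo/config.yaml <<'EOF'
image:
  name: registry.example.com/analytics
  tag: production0abc123
EOF
CI_COMMIT_SHORT_SHA=1a2b3c4
# Same pattern as the script: replace the indented "tag:" line wholesale.
sed -i "s/^  tag:.*/  tag: production$CI_COMMIT_SHORT_SHA/g" /tmp/tag-demo/config.yaml
grep "tag:" /tmp/tag-demo/config.yaml
```

Note the anchor `^` plus the leading indentation: it keeps the substitution from touching a `tag:` key at a different nesting level.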
#!/bin/bash
# Based on: https://docs.gitlab.com/ee/ci/ssh_keys/README.html
# Install ssh-agent if not already installed, it is required by Docker.
# (change apt-get to yum if you use an RPM-based image)
# - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
# Run ssh-agent (inside the build environment)
eval $(ssh-agent -s)
# Add the SSH key stored in SSH_PRIVATE_KEY variable to the agent store
# We're using tr to fix line endings which makes ed25519 keys work
# without extra base64 encoding.
# https://gitlab.com/gitlab-examples/ssh-private-key/issues/1#note_48526556
echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
# Create the SSH directory and give it the right permissions
mkdir -p ~/.ssh
chmod 700 ~/.ssh
# Use ssh-keyscan to scan the keys of your private server.
ssh-keyscan gitlab.mpcdf.mpg.de >> ~/.ssh/known_hosts
chmod 644 ~/.ssh/known_hosts
# Set the user name and email.
git config --global user.name $GITLAB_USER_NAME
git config --global user.email $GITLAB_USER_EMAIL
# Clone the private repository
git clone git@gitlab.mpcdf.mpg.de:nomad-lab/analytics-deployment.git /tmp/analytics-deployment
cd /tmp/analytics-deployment
# Update the tag of the docker image
sed -i "s/^ tag\:.*/ tag\: $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME$CI_COMMIT_SHORT_SHA/g" deployments/hub/staging.yaml
# Finally, commit and push the changes
git add deployments/hub/staging.yaml
git commit -m "CI: Update the hub image for staging ($CI_PIPELINE_URL)"
git push
rm -rf /tmp/analytics-deployment
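Both scripts pipe the private key through `tr -d '\r'` before `ssh-add`, so a key pasted into the CI settings with Windows (CRLF) line endings still parses. The effect of that filter in isolation, on dummy data:

```shell
# CRLF input in, LF-only output out: tr deletes every carriage return.
printf 'line1\r\nline2\r\n' | tr -d '\r'
```

Without this, `ssh-add` rejects the key as malformed, which is the failure mode described in the linked gitlab-examples issue.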
[submodule "3rdparty/atomic-features-package"]
path = 3rdparty/atomic-features-package
url = https://gitlab.mpcdf.mpg.de/nomad-lab/atomic-features-package.git
[submodule "3rdparty/cmlkit"]
path = 3rdparty/cmlkit
url = https://gitlab.mpcdf.mpg.de/nomad-lab/cmlkit.git
[submodule "3rdparty/keras-vis"]
path = 3rdparty/keras-vis
url = https://github.com/raghakot/keras-vis.git
[submodule "3rdparty/qmmlpack"]
path = 3rdparty/qmmlpack
url = https://gitlab.com/qmml/qmmlpack.git
[submodule "3rdparty/quip"]
path = 3rdparty/quip
url = https://github.com/libAtoms/QUIP.git
url = git@github.com:libAtoms/QUIP.git
[submodule "3rdparty/sissopp"]
path = 3rdparty/sissopp
url = https://gitlab.com/sissopp_developers/sissopp.git
[submodule "tutorials/analytics-compressed-sensing"]
path = tutorials/analytics-compressed-sensing
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-compressed-sensing.git
@@ -22,57 +37,78 @@
[submodule "tutorials/analytics-tcmi"]
path = tutorials/analytics-tcmi
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-tcmi.git
[submodule "tutorials/analytics-descriptor-role"]
path = tutorials/analytics-descriptor-role
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-descriptor-role.git
[submodule "tutorials/analytics-error-estimates"]
path = tutorials/analytics-error-estimates
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-error-estimates.git
[submodule "3rdparty/cpp_sisso"]
path = 3rdparty/cpp_sisso
url = https://gitlab.mpcdf.mpg.de/nomad-lab/cpp_sisso.git
[submodule "tutorials/analytics-query-nomad-archive"]
path = tutorials/analytics-query-nomad-archive
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-query-nomad-archive.git
[submodule "tutorials/analytics-cmlkit"]
path = tutorials/analytics-cmlkit
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-cmlkit.git
[submodule "3rdparty/qmmlpack"]
path = 3rdparty/qmmlpack
url = https://gitlab.com/qmml/qmmlpack.git
[submodule "tutorials/analytics-descriptor-role"]
path = tutorials/analytics-descriptor-role
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-descriptor-role.git
[submodule "tutorials/analytics-decision-tree"]
path = tutorials/analytics-decision-tree
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-decision-tree.git
[submodule "tutorials/analytics-clustering-tutorial"]
path = tutorials/analytics-clustering-tutorial
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-clustering-tutorial.git
[submodule "tutorials/analytics-nn-regression"]
path = tutorials/analytics-nn-regression
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-nn-regression.git
[submodule "tutorials/analytics-exploratory-analysis"]
path = tutorials/analytics-exploratory-analysis
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-exploratory-analysis.git
[submodule "tutorials/analytics-perovskites-tolerance-factor"]
path = tutorials/analytics-perovskites-tolerance-factor
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-perovskites-tolerance-factor.git
[submodule "tutorials/analytics-krr4mat"]
path = tutorials/analytics-krr4mat
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-krr4mat.git
[submodule "tutorials/analytics-error-estimates"]
path = tutorials/analytics-error-estimates
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-error-estimates.git
[submodule "tutorials/analytics-query-nomad-archive"]
path = tutorials/analytics-query-nomad-archive
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-query-nomad-archive
[submodule "tutorials/analytics-domain-of-applicability"]
path = tutorials/analytics-domain-of-applicability
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-domain-of-applicability.git
[submodule "tutorials/analytics-nn-regression"]
path = tutorials/analytics-nn-regression
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-nn-regression.git
[submodule "tutorials/analytics-tetradymite-PRM2020"]
path = tutorials/analytics-tetradymite-PRM2020
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-tetradymite-PRM2020.ipynb.git
[submodule "tutorials/analytics-clustering-tutorial"]
path = tutorials/analytics-clustering-tutorial
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-clustering-tutorial.git
[submodule "tutorials/analytics-arise"]
path = tutorials/analytics-arise
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-arise.git
[submodule "tutorials/analytics-krr4mat"]
path = tutorials/analytics-krr4mat
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-krr4mat.git
[submodule "3rdparty/keras-vis"]
path = 3rdparty/keras-vis
url = https://github.com/raghakot/keras-vis.git
[submodule "3rdparty/atomic-features-package"]
path = 3rdparty/atomic-features-package
url = https://gitlab.mpcdf.mpg.de/nomad-lab/atomic-features-package.git
[submodule "tutorials/analytics-co2-sgd-tutorial"]
path = tutorials/analytics-co2-sgd-tutorial
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-co2-sgd-tutorial.git
[submodule "tutorials/analytics-catalysis-MRS2021"]
path = tutorials/analytics-catalysis-MRS2021
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-catalysis-MRS2021.git
[submodule "tutorials/analytics-atomic-features"]
path = tutorials/analytics-atomic-features
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-atomic-features.git
[submodule "tutorials/analytics-sgd-alloys-oxygen-reduction-evolution"]
path = tutorials/analytics-sgd-alloys-oxygen-reduction-evolution
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-sgd-alloys-oxygen-reduction-evolution.git
[submodule "tutorials/analytics-co2-sgd-tutorial"]
path = tutorials/analytics-co2-sgd-tutorial
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-co2-sgd-tutorial.git
[submodule "tutorials/analytics-metalInsulator-PRM2018"]
path = tutorials/analytics-metalInsulator-PRM2018
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-metalinsulator-prm2018.git
[submodule "tutorials/analytics-proto-archetype-clustering-sisso"]
path = tutorials/analytics-proto-archetype-clustering-sisso
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-proto-archetype-clustering-sisso.git
[submodule "tutorials/analytics-svm-classification"]
path = tutorials/analytics-svm-classification
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-svm-classification.git
[submodule "tutorials/analytics-sgd-propylene-oxidation-hte"]
path = tutorials/analytics-sgd-propylene-oxidation-hte
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-sgd-propylene-oxidation-hte.git
[submodule "tutorials/analytics-dos-similarity-search"]
path = tutorials/analytics-dos-similarity-search
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-dos-similarity-search.git
[submodule "tutorials/analytics-dimensionality-reduction"]
path = tutorials/analytics-dimensionality-reduction
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-dimensionality-reduction.git
[submodule "tutorials/analytics-kappa-screening-sisso"]
path = tutorials/analytics-kappa-screening-sisso
url = https://gitlab.mpcdf.mpg.de/nomad-lab/analytics-kappa-screening-sisso.git
[submodule "3rdparty/nomad-FAIR"]
path = 3rdparty/nomad-FAIR
url = git@gitlab.mpcdf.mpg.de:nomad-lab/nomad-FAIR.git
Subproject commit a0c34e8bf4f6cc05bcfa6f5524adbc1806a44b9c
Subproject commit 7dda7a911e99b87c4470f01d619664e39f56dd9c
Subproject commit 3ce44fd3933c17d567fbf783fa784b2b6c98ba7c
Subproject commit 0a99e15f54a53f829d1fc7f2cb2d0372319eebbd
*~
### GAP suite — Non-commercial License Agreement
*Last updated: 19th April 2017*
The GAP suite of programs, comprising the GAP-filler and GAP packages
(together “the Software”), written by Dr Gabor Csanyi and Dr Albert
Bartok-Partay of the University of Cambridge (the “Authors”) for
describing the chemical environments of atoms and molecules and
predicting potential energy surfaces, is copyrighted by the University
of Cambridge and the Authors. Users at non-commercial institutions may
obtain a copy for Non-commercial Purposes. "Non-commercial Purposes"
means use of the Software and/or any **Software-derived data (including
interatomic potentials)** for academic research or other not-for-profit
or scholarly purposes which are undertaken at an educational,
non-profit, charitable or governmental institution that does not
involve the production or manufacture of products for sale or the
performance of services for a fee or other non-monetary value.
For commercial use of either or both the Software or any
Software-derived data, please contact Cambridge Enterprise Limited at
<enquiries@enterprise.cam.ac.uk>.
1. This is a legal agreement between you (“USER”), and the Authors. By
accepting, receiving, and using this Software, you are agreeing to
be bound by the terms of this Agreement.
2. The Authors hereby grant to the USER a non-exclusive,
non-transferable personal licence to use the Software.
3. The licence granted under this Agreement shall only entitle the USER
to use the Software for Non-commercial Purposes. The use of the
Software, or any code which is a modification of, enhancement to,
derived from or based upon the Software, or Software-derived data,
either directly or indirectly for any commercial purpose whatsoever
is expressly forbidden. The USER may not sublicense, distribute or
copy (except for archival purposes) the Software or enhancements
thereto.
4. If any academic paper or publication by the USER includes any set of
interatomic potentials or other Software-derived data then the USER
must timeously publish their full GAP model and source data (i.e.
the underlying quantum mechanical data which were used as input to
the GAP-filler). Any publication of any GAP models, interatomic
potentials or other Software-derived data must specifically state
that such data is provided for non-commercial use only.
5. If any academic paper or publication by the USER is based wholly or
partially, directly or indirectly on the Software or any results
derived from the Software then that paper or publication must cite:
- A. P. Bartok et al. *Physical Review Letters* **104**
136403 (2010)
- A. P. Bartok et al. *Physical Review B* **87** 184115 (2013)
- That this Software is available for non-commercial use from
`www.libatoms.org`
6. The Software is the subject of copyright. Unauthorised copying of
the Software is expressly forbidden. The University of Cambridge and
the Authors retain all the rights in and title to the Software.
7. The Software is provided "AS IS" and except as expressly provided in
this Agreement no warranty, condition, undertaking or term, express
or implied, statutory or otherwise, as to the condition,
performance, satisfactory quality or fitness for purpose of the
Software is given or assumed by the Authors or the University of
Cambridge, and all such warranties, conditions, undertakings and
terms are hereby excluded.
8. The limitations and exclusions in this Agreement shall not apply in
respect of claims for personal injury or death caused by negligence
or in respect of fraud or fraudulent misrepresentation.
9. EXCEPT AS PROVIDED BY CLAUSE 8, neither the AUTHORS nor the
University OF CAMBRIDGE or its employees or students shall be liable
for any damages or expenses of whatsoever nature and howsoever
arising (including without limitation in contract, tort, negligence
or for breach of statutory duty or misrepresentation) in connection
with any right or licence granted or use of the Software or
otherwise in connection with this Licence or any relationships
established by it. Without prejudice to the generality of the
foregoing, in the event that the AUTHORS, the University OF
CAMBRIDGE, its employees or students should be found liable, then
their aggregate liability for direct damages shall be limited to
£100; and none of them shall be liable for any indirect, incidental,
consequential or special damages including without limitation loss
of profits, revenue, or business opportunity, loss of goodwill, data
loss, business disruption or computer failure.
10. The USER shall indemnify the Authors, the University of Cambridge,
its employees and students in full against each and every claim made
against any of them by any third party (including without limitation
in contract, tort, negligence or for breach of statutory duty or
misrepresentation) arising out of or in connection with the USER's
use of the Software.
11. The Authors accept no obligation to provide maintenance nor do they
guarantee the expected functionality of the Software or of any part
of the Software.
12. This Agreement is effective until terminated. This Agreement will
terminate automatically on notice from the Authors if the USER fails
to comply with any provision of this Agreement. Upon termination the
USER shall immediately destroy all copies of the Software.
13. This Agreement and any matters relating to it shall be governed and
construed in accordance with the laws of England and Wales and
Authors and the USER hereby irrevocably submit to the exclusive
jurisdiction of the English Courts.
# HND XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# HND X
# HND X libAtoms+QUIP: atomistic simulation library
# HND X
# HND X Portions of this code were written by
# HND X Albert Bartok-Partay, Silvia Cereda, Gabor Csanyi, James Kermode,
# HND X Ivan Solt, Wojciech Szlachta, Csilla Varnai, Steven Winfield.
# HND X
# HND X Copyright 2006-2010.
# HND X
# HND X Not for distribution
# HND X
# HND X Portions of this code were written by Noam Bernstein as part of
# HND X his employment for the U.S. Government, and are not subject
# HND X to copyright in the USA.
# HND X
# HND X When using this software, please cite the following reference:
# HND X
# HND X http://www.libatoms.org
# HND X
# HND X Additional contributions by
# HND X Alessio Comisso, Chiara Gattinoni, and Gianpietro Moras
# HND X
# HND XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
ifeq (${QUIP_ARCH},)
include Makefile.arch
else
include Makefile.${QUIP_ARCH}
endif
include Makefile.inc
include Makefile.rules
GAP1_F95_FILES = make_permutations_v2 descriptors gp_predict descriptors_wrapper clustering
GAP1_F95_SOURCES = ${addsuffix .f95, ${GAP1_F95_FILES}}
GAP1_F95_OBJS = ${addsuffix .o, ${GAP1_F95_FILES}}
GAP2_F95_FILES = gp_fit gap_fit_module
GAP2_F95_SOURCES = ${addsuffix .f95, ${GAP2_F95_FILES}}
GAP2_F95_OBJS = ${addsuffix .o, ${GAP2_F95_FILES}}
default: ${GAP_LIBFILE}
ifeq (${USE_MAKEDEP},1)
GAP1_F95_FPP_FILES = ${addsuffix .fpp, ${GAP1_F95_FILES}}
GAP2_F95_FPP_FILES = ${addsuffix .fpp, ${GAP2_F95_FILES}}
GAP1.depend: ${GAP1_F95_FPP_FILES}
${SCRIPT_PATH}/${MAKEDEP} ${MAKEDEP_ARGS} -- ${addprefix ../../src/GAP/,${GAP1_F95_SOURCES}} > GAP1.depend
GAP2.depend: ${GAP2_F95_FPP_FILES} ${GAP1_F95_FPP_FILES}
${SCRIPT_PATH}/${MAKEDEP} ${MAKEDEP_ARGS} -- ${addprefix ../../src/GAP/,${GAP2_F95_SOURCES}} > GAP2.depend
-include GAP1.depend
-include GAP2.depend
endif
PROGRAMS = gap_fit
LIBS = -L. -lquiputils -lquip_core -lgap -latoms
ifeq (${HAVE_THIRDPARTY},1)
LIBS += -lthirdparty
endif
LIBFILES = libatoms.a ${GAP_LIBFILE} libquip_core.a libquiputils.a
.PHONY : clean allclean depend doc install
Programs: ${PROGRAMS}
cp ${QUIP_ROOT}/src/GAP/teach_sparse .
${PROGRAMS}: % : ${LIBFILES} ${GAP2_F95_OBJS} ${GAPFIT_LIBFILE} %.o
$(LINKER) $(LINKFLAGS) -o $@ ${F90OPTS} $@.o ${GAPFIT_LIBFILE} ${LIBS} ${LINKOPTS}
${GAP_LIBFILE}: ${GAP1_F95_OBJS}
ifneq (${LIBTOOL},)
${LIBTOOL} -o ${GAP_LIBFILE} ${GAP1_F95_OBJS}
else
${AR} ${AR_ADD} ${GAP_LIBFILE} $?
endif
${GAPFIT_LIBFILE}: ${GAP2_F95_OBJS}
ifneq (${LIBTOOL},)
${LIBTOOL} -o ${GAPFIT_LIBFILE} ${GAP2_F95_OBJS}
else
${AR} ${AR_ADD} ${GAPFIT_LIBFILE} $?
endif
install:
@if [ ! -d ${QUIP_INSTALLDIR} ]; then \
echo "make install: QUIP_INSTALLDIR '${QUIP_INSTALLDIR}' doesn't exist or isn't a directory"; \
exit 1; \
else \
for f in ${PROGRAMS} ; do \
echo "Copying $$f to ${QUIP_INSTALLDIR}/$${f}${QUIP_MPI_SUFFIX}" ; \
cp $$f ${QUIP_INSTALLDIR}/$${f}${QUIP_MPI_SUFFIX} ; \
done ;\
cp ${QUIP_ROOT}/src/GAP/teach_sparse ${QUIP_INSTALLDIR}; \
fi
clean:
rm -f *.o *.mod *.mod.save ${GAP_LIBFILE} ${GAPFIT_LIBFILE} ${PROGRAMS} GAP1.depend GAP2.depend
This diff is collapsed.
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = gap
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
# -*- coding: utf-8 -*-
# HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# HQ X
# HQ X quippy: Python interface to QUIP atomistic simulation library
# HQ X
# HQ X Copyright James Kermode 2010
# HQ X
# HQ X These portions of the source code are released under the GNU General
# HQ X Public License, version 2, http://www.gnu.org/copyleft/gpl.html
# HQ X
# HQ X If you would like to license the source code under different terms,
# HQ X please contact James Kermode, james.kermode@gmail.com
# HQ X
# HQ X When using this software, please cite the following reference:
# HQ X
# HQ X http://www.jrkermode.co.uk/quippy
# HQ X
# HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
#
# quippy documentation build configuration file, created by
# sphinx-quickstart on Wed Sep 2 14:17:01 2009.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import glob
import os
import sys
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(1, os.path.abspath('..'))
sys.path.insert(1, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#
needs_sphinx = '1.4'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.doctest',
'sphinx.ext.mathjax',
'sphinx.ext.autosummary',
'sphinx.ext.intersphinx',
'sphinx.ext.viewcode',
'sphinx.ext.githubpages',
'nbsphinx',
'modcontents',
'numpydoc']
# This list was generated by grep as a workaround to
# https://github.com/sphinx-doc/sphinx/issues/2485
autosummary_generate = [
'gap_fit.rst',
]
nbsphinx_allow_errors = True
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = ['.rst', '.ipynb']
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'gap'
copyright = u'2010-2019, Gabor Csanyi'
author = u'Gabor Csanyi'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = os.popen('cat ../GIT_VERSION 2> /dev/null || '
                   'git describe --always --tags --dirty=+ 2> /dev/null || '
                   'echo ').read().strip()
# The full version, including alpha/beta/rc tags.
release = version
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '**.ipynb_checkpoints']
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
modindex_common_prefix = ['quippy.']
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
html_theme_options = {
}
html_logo = 'hybrid.png'
html_favicon = 'favicon.ico'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Fix for RTD tables
html_context = {
'css_files': [
'_static/theme_overrides.css',
],
}
# -- Options for HTMLHelp output ------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'gapdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'gap.tex', u'GAP Documentation',
u'Gabor Csanyi', 'manual'),
]
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'gap', u'GAP Documentation',
[author], 1)
]
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'gap', u'GAP Documentation',
     author, 'gap', 'Gaussian Approximation Potential (GAP) documentation.',
     'Miscellaneous'),
]
intersphinx_mapping = {'python': ('https://docs.python.org/2.7', None),
'ase': ('https://wiki.fysik.dtu.dk/ase/', None),
'numpy': ('https://docs.scipy.org/doc/numpy/', None),
'matplotlib': ('https://matplotlib.org/', None)}
###
### sphinx-quickstart output ends here
###
from quippy.oo_fortran import FortranDerivedType, FortranDerivedTypes
import re
# classnames = [v.__name__ for v in FortranDerivedTypes.values()]
# classname_re = re.compile('(^|\s+)((``)?)('+'|'.join([v.__name__ for v in FortranDerivedTypes.values()])+r')\2(?=[\s\W]+)')
# method_or_attr_re = re.compile('(^|\s+)((``)?)('+'|'.join([v.__name__+'\.([a-zA-Z][a-zA-Z0-9_]*)' for v in FortranDerivedTypes.values()])+r')\2(?=[\s\W]+)')
# doc_subs = [(classname_re, r'\1:class:`~.\4`'),
# (method_or_attr_re, r'\1:meth:`.\4`')]
global_options = None
def process_docstring(app, what, name, obj, options, lines):
global global_options
global_options = options
def process_signature(app, what, name, obj, options, signature, return_annotation):
return (signature, return_annotation)
def maybe_skip_member(app, what, name, obj, skip, options):
if hasattr(FortranDerivedType, name) and options.inherited_members:
return True
return skip
from docutils import nodes, utils
from docutils.parsers.rst.roles import set_classes
def get_github_url(view, path):
return 'https://github.com/{project}/{view}/{branch}/{path}'.format(
project='libAtoms/GAP',
view=view,
branch='public',
path=path)
def github_role(role, rawtext, text, lineno, inliner, options={}, content=[]):
    # Accept both ':git:`path`' and ':git:`link text <path>`' forms
    if text[-1] == '>':
        i = text.index('<')
        name = text[:i - 1]
        text = text[i + 1:-1]
    else:
        name = text
ref = get_github_url('blob', name)
set_classes(options)
node = nodes.reference(rawtext, name, refuri=ref, **options)
return [node], []
def mol_role(role, rawtext, text, lineno, inliner, options={}, content=[]):
    # Render a chemical formula: each '_c' pair becomes a subscript node,
    # e.g. :mol:`H_2O` renders with the 2 subscripted.
    n = []
    t = ''
while text:
if text[0] == '_':
n.append(nodes.Text(t))
t = ''
n.append(nodes.subscript(text=text[1]))
text = text[2:]
else:
t += text[0]
text = text[1:]
n.append(nodes.Text(t))
return n, []
def setup(app):
app.connect('autodoc-process-docstring', process_docstring)
app.connect('autodoc-process-signature', process_signature)
app.connect('autodoc-skip-member', maybe_skip_member)
app.add_role('git', github_role)
app.add_role('mol', mol_role)
autodoc_member_order = 'groupwise'
#autoclass_content = 'both'
def add_line(self, line, source, *lineno):
"""Append one line of generated reST to the output."""
sys.stdout.write(self.indent + line + '\n')
self.directive.result.append(self.indent + line, source, *lineno)
# Uncomment two lines below to debug autodoc rst output
#import sphinx.ext.autodoc
#sphinx.ext.autodoc.Documenter.add_line = add_line
# Monkey patch numpydoc to exclude methods and attributes inherited
# from base classes, and to include attributes wrapped by Python
# properties
numpydoc_show_class_members = True
import inspect
import pydoc
import numpydoc
import numpydoc.docscrape
@property
def methods(self):
if self._cls is None:
return []
do_inherited = False
if global_options is not None and hasattr(global_options, 'inherited_members'):
do_inherited = global_options.inherited_members
return [name for name,func in inspect.getmembers(self._cls)
if not name.startswith('_') and callable(func) and
(do_inherited or not any(hasattr(base, name) for base in self._cls.__bases__)) and
pydoc.getdoc(getattr(self._cls, name))]
@property
def properties(self):
if self._cls is None:
return []
do_inherited = False
if global_options is not None and hasattr(global_options, 'inherited_members'):
do_inherited = global_options.inherited_members
return [name for name,func in inspect.getmembers(self._cls) if not name.startswith('_') and
(do_inherited or not any(hasattr(base, name) for base in self._cls.__bases__)) and
(func is None or isinstance(getattr(self._cls, name), property)) and
pydoc.getdoc(getattr(self._cls, name))]
numpydoc.docscrape.ClassDoc.methods = methods
numpydoc.docscrape.ClassDoc.properties = properties
# generate .rst versions of any .ipynb notebooks under Examples/
# Superseded by nbsphinx extension
# for notebook in glob.glob(os.path.join('Examples/*.ipynb')):
# cmd = 'jupyter nbconvert --to rst {0}'.format(notebook)
# print cmd
# os.system(cmd)
.. HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
.. HQ X
.. HQ X quippy: Python interface to QUIP atomistic simulation library
.. HQ X
.. HQ X Copyright James Kermode 2010
.. HQ X
.. HQ X These portions of the source code are released under the GNU General
.. HQ X Public License, version 2, http://www.gnu.org/copyleft/gpl.html
.. HQ X
.. HQ X If you would like to license the source code under different terms,
.. HQ X please contact James Kermode, james.kermode@gmail.com
.. HQ X
.. HQ X When using this software, please cite the following reference:
.. HQ X
.. HQ X http://www.jrkermode.co.uk/quippy
.. HQ X
.. HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
The gap_fit program
=====================================================
Main options
------------
.. autofunction:: quippy.gap_fit_parse_command_line
GAP options
-----------
.. autofunction:: quippy.gap_fit_parse_gap_str
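The GAP options are passed as a single ``gap={...}`` string on the ``gap_fit``
command line. A minimal SOAP fit might look like the following fragment
(illustrative only: the file names and hyperparameter values are placeholders,
not recommendations, and exact option names should be checked against your
``gap_fit`` build)::

    gap_fit at_file=train.xyz \
            gap={soap cutoff=5.0 l_max=6 n_max=12 atom_sigma=0.5 \
                 delta=0.2 zeta=4 covariance_type=dot_product \
                 n_sparse=100 sparse_method=cur_points} \
            default_sigma={0.002 0.2 0.2 0.0} \
            gp_file=gap.xml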
`sparse_method` options are:
- RANDOM: default, chooses n_sparse random datapoints
- PIVOT: based on the full covariance matrix finds the n_sparse "pivoting" points
- CLUSTER: based on the full covariance matrix, performs k-medoids clustering into n_sparse clusters and returns the medoids
- UNIFORM: makes a histogram of the data based on n_sparse and returns a data point from each bin
- KMEANS: k-means clustering based on the data points
- COVARIANCE: greedy data point selection based on the sparse covariance matrix, to minimise the GP variance of all datapoints
- UNIQ: selects unique datapoints from the dataset
- FUZZY: fuzzy k-means clustering
- FILE: reads sparse points from a file
- INDEX_FILE: reads indices of sparse points from a file
- CUR_COVARIANCE: CUR, based on the full covariance matrix
- CUR_POINTS: CUR, based on the datapoints
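To illustrate the idea behind the CUR-based options, the following is a minimal
sketch of CUR-style row selection via leverage scores. This is not the GAP
implementation (``cur_select`` and its parameters are invented for this
example); it only shows how a CUR decomposition can rank datapoints for
sparsification:

```python
import numpy as np

def cur_select(X, n_sparse, k=None, rng=None):
    """Pick n_sparse row indices of X by CUR-style leverage scores.

    Illustrative sketch only -- not the GAP implementation. Leverage
    scores come from the top-k left singular vectors of the data
    matrix and are used as sampling probabilities.
    """
    rng = np.random.default_rng(rng)
    if k is None:
        k = min(n_sparse, min(X.shape) - 1)
    # Left singular vectors give a per-row (per-datapoint) leverage score
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    lev = np.sum(U[:, :k] ** 2, axis=1)
    p = lev / lev.sum()
    return rng.choice(X.shape[0], size=n_sparse, replace=False, p=p)

# Example: 200 descriptor vectors of dimension 10, keep 20 sparse points
X = np.random.default_rng(0).normal(size=(200, 10))
idx = cur_select(X, n_sparse=20, rng=0)
print(len(idx), len(set(idx.tolist())))  # -> 20 20
```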
.. HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
.. HQ X
.. HQ X GAP: Gaussian Approximation Potential
.. HQ X
.. HQ X Copyright Albert Bartok-Partay, Gabor Csanyi 2010-2019
.. HQ X
.. HQ X gc121@cam.ac.uk
.. HQ X
.. HQ X
.. HQ XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
.. gap documentation master file
GAP and SOAP documentation
=============================
.. module:: gap
These are the documentation pages for Gaussian Approximation Potential
(GAP) code. GAP is a plugin for QUIP, which itself can be used as a
plugin to LAMMPS or called from ASE. For installation, see the QUIP
documentation pages. The information below is on the usage of GAP.
The purpose of the GAP code is to fit interatomic potentials and then
use them for molecular simulation.
Contents
========
.. toctree::
:maxdepth: 1
gap_fit.rst
tutorials.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
* `Module Source Code <_modules/index.html>`_