Commit 74b8a018 authored by Adam Fekete

setting up mongodb

parent 494d32d6
# mongodb instance folder
db/
# results of the notebooks
tutorials/flow_sic_relax
tutorials/launcher_*/
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
@@ -23,7 +23,6 @@ RUN apt-get update \
liblapack-dev libblas-dev \
libhdf5-dev libnetcdf-dev libnetcdff-dev libpnetcdf-dev libxc-dev \
libfftw3-dev libxml2-dev \
-    mongodb \
slurmd slurm-client slurmctld \
&& rm -rf /var/lib/apt/lists/*
@@ -55,6 +54,7 @@ RUN conda install --quiet --yes \
# 'matminer' \
# 'jupyterlab-git' \
'abipy' \
+    'jupyter-server-proxy' \
'jupyter_contrib_nbextensions' \
'jupyter_nbextensions_configurator' \
&& pip install --no-cache-dir jupyter-jsmol fireworks \
@@ -74,6 +74,6 @@ USER $NB_UID
RUN pip install -e pseudo_dojo
USER $NB_UID
WORKDIR $HOME
@@ -78,6 +78,14 @@ More info: https://jupyter-docker-stacks.readthedocs.io/en/latest/using/common.h
More information about the command line options: https://jupyter-docker-stacks.readthedocs.io/en/latest/using/common.html#notebook-options
## Running the notebook with a MongoDB server
If the `workflows-workshop:latest` image is available, you can run the following command to start the tutorial:
```bash
docker-compose up
```
## Continuous integration (for future use)
Each commit triggers a build process on a GitLab Runner. Besides the `latest` tag, a unique tag (equal to the git commit hash) will be available for explicitly tracking the version of the notebook for cluster deployment.
version: '3.1'

services:

  workflows-workshop:
    # build: .
    image: workflows-workshop:latest
    restart: always
    user: root
    environment:
      # - NB_UID=1001
      - GRANT_SUDO=yes
    ports:
      - 8888:8888
    depends_on:
      - mongo
    volumes:
      - ./tutorials:/home/jovyan/tutorials

  mongo:
    image: mongo
    restart: always
    # environment:
    #   - MONGO_INITDB_ROOT_USERNAME=admin
    #   - MONGO_INITDB_ROOT_PASSWORD=secret
    # ports:
    #   - 27017:27017  # opening the port is for development only!
    volumes:
      - ./db:/data/db

  # mongo-express:  # for development only!
  #   image: mongo-express
  #   restart: always
  #   ports:
  #     - 8081:8081
  #   environment:
  #     - ME_CONFIG_MONGODB_ADMINUSERNAME=admin
  #     - ME_CONFIG_MONGODB_ADMINPASSWORD=secret
  #   depends_on:
  #     - mongo
%% Cell type:markdown id:exempt-cloud tags:
# Abinit - quickstart
https://abinit.github.io/abipy/flow_gallery/run_sic_relax.html#sphx-glr-flow-gallery-run-sic-relax-py
%% Cell type:code id:necessary-budapest tags:
``` python
!abicheck.py
```
%% Cell type:markdown id:chronic-invention tags:
## Relaxation Flow
This example shows how to build a very simple Flow for the structural relaxation of SiC. One could use a similar logic to perform multiple relaxations with different input parameters…
%% Cell type:code id:handmade-dividend tags:
``` python
import abipy.abilab as abilab
import abipy.data as data
import abipy.flowtk as flowtk


def build_flow(workdir):
    pseudos = data.pseudos("14si.pspnc", "6c.pspnc")
    structure = data.structure_from_ucell("SiC")

    # Initialize the input
    relax_inp = abilab.AbinitInput(structure, pseudos=pseudos)

    # Set variables
    relax_inp.set_vars(
        ecut=20,
        paral_kgb=1,
        iomode=3,
        # Relaxation part
        ionmov=2,
        optcell=1,
        strfact=100,
        ecutsm=0.5,    # Important!
        dilatmx=1.15,  # Important!
        toldff=1e-6,
        tolmxf=1e-5,
        ntime=100,
    )

    # K-points sampling
    shiftk = [
        [0.5, 0.5, 0.5],
        [0.5, 0.0, 0.0],
        [0.0, 0.5, 0.0],
        [0.0, 0.0, 0.5],
    ]
    relax_inp.set_kmesh(ngkpt=[4, 4, 4], shiftk=shiftk)

    # Initialize the flow
    flow = flowtk.Flow(workdir)

    # Register the task.
    flow.register_relax_task(relax_inp)

    return flow
```
%% Cell type:markdown id:related-music tags:
Build and run the flow:
%% Cell type:code id:steady-fellowship tags:
``` python
flow = build_flow('flow_sic_relax')
scheduler = flow.make_scheduler()
scheduler.start()
```
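%% Cell type:markdown id:flow-status-note tags:
Once the scheduler finishes, you can check the status of the tasks from the command line. This is a minimal sketch, assuming the flow was built in the `flow_sic_relax` directory used above:
%% Cell type:code id:flow-status-check tags:
``` python
# Summarize the status of all tasks in the flow with abipy's abirun.py script.
!abirun.py flow_sic_relax status
```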
%% Cell type:markdown id:confirmed-space tags:
To visualize the evolution of the lattice parameters during the structural relaxation, use:
%% Cell type:code id:raising-valuable tags:
``` python
abifile = abilab.abiopen('flow_sic_relax/w0/t0/outdata/out_HIST.nc')
abifile.plot();
```
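%% Cell type:markdown id:final-structure-note tags:
Beyond the plot, the HIST file also gives access to the relaxed geometry. A small sketch, assuming the `HistFile` object exposes the relaxed structure as `final_structure` (as in recent AbiPy versions):
%% Cell type:code id:final-structure-check tags:
``` python
# Print the last (relaxed) structure stored in the HIST file.
print(abifile.final_structure)
```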
%% Cell type:code id:changing-bennett tags:
``` python
```
%% Cell type:code id:passing-mineral tags:
``` python
```
%% Cell type:code id:prescribed-mexico tags:
``` python
```
%% Cell type:markdown id:metallic-supervisor tags:
# Fireworks (Five-minute quickstart)
https://materialsproject.github.io/fireworks/quickstart.html
%% Cell type:markdown id:gorgeous-curve tags:
## Start FireWorks
If not already running, start MongoDB. In this tutorial it already runs as the `mongo` service from `docker-compose.yml`; if your MongoDB is hosted and maintained externally, use `lpad init` to configure the connection:
%% Cell type:code id:electronic-essex tags:
``` python
# !lpad init
```
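%% Cell type:markdown id:mongo-ping-note tags:
A quick way to verify that the database is reachable from the notebook container is to ping it with `pymongo` (a FireWorks dependency), assuming the default port 27017:
%% Cell type:code id:mongo-ping-check tags:
``` python
from pymongo import MongoClient

# Connect to the MongoDB container by its compose service name and send a ping.
client = MongoClient(host='mongo', port=27017, serverSelectionTimeoutMS=5000)
print(client.admin.command('ping'))
```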
%% Cell type:markdown id:white-owner tags:
Reset/Initialize the FireWorks database (the LaunchPad):
%% Cell type:code id:entertaining-scotland tags:
``` python
# !lpad reset
```
%% Cell type:code id:increasing-context tags:
``` python
!cat my_launchpad.yaml
```
%% Cell type:markdown id:numerous-regard tags:
## Add a Workflow
There are many ways to add Workflows to the database, including a Python API. Let’s start with an extremely simple example that can be added via the command line:
%% Cell type:code id:square-primary tags:
``` python
!lpad add_scripts 'echo "hello"' 'echo "goodbye"' -n hello goodbye -w test_workflow
```
%% Cell type:markdown id:manufactured-bernard tags:
This added a two-job linear workflow. The first job prints hello to the command line, and the second job prints goodbye. We gave names (optional) to each step as “hello” and “goodbye”. We named the workflow overall (optional) as “test_workflow”.
%% Cell type:markdown id:mounted-audio tags:
Let’s look at our test workflow:
%% Cell type:code id:identical-international tags:
``` python
!lpad get_wflows -n test_workflow -d more
```
%% Cell type:markdown id:fuzzy-intellectual tags:
We get back basic information on our workflows. The second step “goodbye” is waiting for the first one to complete; it is not ready to run because it depends on the first job.
%% Cell type:markdown id:useful-reviewer tags:
## Run all Workflows
You can run jobs one at a time (“singleshot”) or all at once (“rapidfire”). Let’s run all jobs:
%% Cell type:code id:excess-chosen tags:
``` python
!rlaunch --silencer rapidfire
```
%% Cell type:markdown id:dedicated-commerce tags:
Clearly, both steps of our workflow ran in the correct order.
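%% Cell type:markdown id:singleshot-note tags:
For comparison, the one-at-a-time mode mentioned above would look like this (commented out here, since rapidfire has already completed every job):
%% Cell type:code id:singleshot-example tags:
``` python
# Pull and run only the next ready FireWork from the LaunchPad.
# !rlaunch --silencer singleshot
```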
%% Cell type:markdown id:joined-antique tags:
Let’s again look at our workflows:
%% Cell type:code id:conditional-henry tags:
``` python
!lpad get_wflows -n test_workflow -d more
```
%% Cell type:markdown id:lucky-failing tags:
FireWorks automatically created launcher_ directories for each step in the Workflow and ran them. We see that both steps are complete. Note that there exist options to choose where to run jobs, as well as to tear down empty directories after running jobs.
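%% Cell type:markdown id:launcher-dirs-note tags:
You can list these directories directly from the notebook (assuming the cells above were run in the current working directory):
%% Cell type:code id:launcher-dirs-check tags:
``` python
# Show the launcher_* directories created by rapidfire.
!ls -d launcher_*/
```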
%% Cell type:markdown id:forward-client tags:
## Launch the web GUI
If you have a web browser, you can launch the web GUI to see your results:
%% Cell type:code id:anticipated-patrol tags:
``` python
# !lpad webgui
```
%% Cell type:markdown id:civic-sauce tags:
Note that there are options to run the web site in a server mode; try `lpad webgui -h` to see all the options.
Open: http://localhost:8888/proxy/5000 (the GUI is exposed through jupyter-server-proxy).
%% Cell type:markdown id:explicit-denial tags:
## Python code
The following Python code achieves the same behavior:
%% Cell type:code id:cutting-modification tags:
``` python
from fireworks import Firework, Workflow, LaunchPad, ScriptTask
from fireworks.core.rocket_launcher import rapidfire
# set up the LaunchPad and reset it
launchpad = LaunchPad(host='mongo')
launchpad.reset('', require_password=False)
# create the individual FireWorks and Workflow
fw1 = Firework(ScriptTask.from_str('echo "hello"'), name="hello")
fw2 = Firework(ScriptTask.from_str('echo "goodbye"'), name="goodbye")
wf = Workflow([fw1, fw2], {fw1:fw2}, name="test workflow")
# store workflow and launch it locally
launchpad.add_wf(wf)
rapidfire(launchpad)
```
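%% Cell type:markdown id:query-states-note tags:
After `rapidfire` returns, the same `launchpad` object can be used to query the database. A small sketch using the LaunchPad query API:
%% Cell type:code id:query-states-check tags:
``` python
# List every FireWork in the database with its name and current state.
for fw_id in launchpad.get_fw_ids():
    fw = launchpad.get_fw_by_id(fw_id)
    print(fw_id, fw.name, fw.state)
```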
%% Cell type:markdown id:allied-lesson tags:
In the code above, the `{fw1:fw2}` argument to Workflow is adding a dependency of fw2 to fw1. You could instead define this dependency when defining your FireWorks:
%% Cell type:code id:interstate-intervention tags:
``` python
fw1 = Firework(ScriptTask.from_str('echo "hello"'), name="hello")
fw2 = Firework(ScriptTask.from_str('echo "goodbye"'), name="goodbye", parents=[fw1])
wf = Workflow([fw1, fw2], name="test workflow")
```
%% Cell type:code id:speaking-institution tags:
``` python
```
%% Cell type:code id:devoted-treat tags:
``` python
```
@@ -16,14 +16,14 @@ qadapters: # List of `qadapters` objects (just one in this simplified example)
     # pre_run: "export PATH=$HOME/git_repos/abinit/build_gcc/src/98_main:$PATH"
     limits:
-        timelimit: 24:00:00   # Time-limit for each task.
-        max_cores: 64         # Max number of cores that can be used by a single task.
-        hint_cores: 8
+        timelimit: 01:00:00   # Time-limit for each task.
+        max_cores: 4          # Max number of cores that can be used by a single task.
+        hint_cores: 2
     hardware:
         num_nodes: 1
-        sockets_per_node: 2
-        cores_per_socket: 64
-        mem_per_node: 256 Gb
+        sockets_per_node: 1
+        cores_per_socket: 4
+        mem_per_node: 16 Gb
##############################
authsource: fireworks
host: mongo
logdir: null
mongoclient_kwargs: {}
name: fireworks
password: null
port: 27017
ssl: false
ssl_ca_certs: null
ssl_certfile: null
ssl_keyfile: null
ssl_pem_passphrase: null
strm_lvl: INFO
uri_mode: false
user_indices: []
username: null
wf_user_indices: []
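The `my_launchpad.yaml` above is what the `lpad` command and the notebook cells pick up by default from the working directory. A minimal sketch of loading it explicitly from Python (assuming the file sits in the current directory):
``` python
from fireworks import LaunchPad

# Build a LaunchPad from the committed configuration file instead of relying on defaults.
launchpad = LaunchPad.from_file('my_launchpad.yaml')
print(launchpad.host, launchpad.port, launchpad.name)
```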