Commit b4b9b12d authored by Martin Reinecke's avatar Martin Reinecke
Browse files

Merge branch 'nifty7_changes' into 'NIFTy_8'

Nifty7 changes

See merge request !659
parents e3a348b7 9ed964a5
Pipeline #105563 passed with stages
in 30 minutes and 38 seconds
......@@ -111,6 +111,13 @@ likelihood becomes the identity matrix. This is needed for the `GeoMetricKL`
algorithm.
Remove gitversion interface
---------------------------
Since proper NIFTy releases are now provided on PyPI, the gitversion interface
is no longer supported.
Changes since NIFTy 5
=====================
......
......@@ -6,13 +6,13 @@ RUN apt-get update && apt-get install -y \
# Packages needed for NIFTy
python3-scipy \
# Documentation build dependencies
python3-sphinx-rtd-theme dvipng texlive-latex-base texlive-latex-extra \
dvipng texlive-latex-base texlive-latex-extra \
# Testing dependencies
python3-pytest-cov jupyter \
# Optional NIFTy dependencies
python3-mpi4py python3-matplotlib \
# more optional NIFTy dependencies
&& pip3 install ducc0 finufft jupyter jax jaxlib \
&& pip3 install ducc0 finufft jupyter jax jaxlib sphinx pydata-sphinx-theme \
&& rm -rf /var/lib/apt/lists/*
# Set matplotlib backend
......
include ChangeLog.md
include demos/*.py
graft tests
global-exclude *.py[cod]
......@@ -4,7 +4,7 @@ NIFTy - Numerical Information Field Theory
[![coverage report](https://gitlab.mpcdf.mpg.de/ift/nifty/badges/NIFTy_8/coverage.svg)](https://gitlab.mpcdf.mpg.de/ift/nifty/-/commits/NIFTy_8)
**NIFTy** project homepage:
[http://ift.pages.mpcdf.de/nifty](http://ift.pages.mpcdf.de/nifty)
[https://ift.pages.mpcdf.de/nifty](https://ift.pages.mpcdf.de/nifty)
Summary
-------
......@@ -49,7 +49,7 @@ Installation
- [SciPy](https://www.scipy.org/)
Optional dependencies:
- [DUCC0](https://gitlab.mpcdf.mpg.de/mtr/ducc) for faster FFTs, spherical
- [ducc0](https://gitlab.mpcdf.mpg.de/mtr/ducc) for faster FFTs, spherical
harmonic transforms, and radio interferometry gridding support
- [mpi4py](https://mpi4py.scipy.org) (for MPI-parallel execution)
- [matplotlib](https://matplotlib.org/) (for field plotting)
......@@ -91,7 +91,7 @@ MPI support is added via:
sudo apt-get install python3-mpi4py
### Running the tests
### Run the tests
To run the tests, additional packages are required:
......@@ -105,29 +105,20 @@ following command in the repository root:
### First Steps
For a quick start, you can browse through the [informal
introduction](http://ift.pages.mpcdf.de/nifty/code.html) or
introduction](https://ift.pages.mpcdf.de/nifty/code.html) or
dive into NIFTy by running one of the demonstrations, e.g.:
python3 demos/getting_started_1.py
### Building the documentation from source
To build the documentation from source, install
[sphinx](https://www.sphinx-doc.org/en/stable/index.html) and the
[Read The Docs Sphinx Theme](https://github.com/readthedocs/sphinx_rtd_theme)
on your system and run
sh docs/generate.sh
### Acknowledgements
Please acknowledge the use of NIFTy in your publication(s) by using a
phrase such as the following:
Please consider acknowledging NIFTy in your publication(s) by using a phrase
such as the following:
> "Some of the results in this publication have been derived using the
> NIFTy package [(https://gitlab.mpcdf.mpg.de/ift/NIFTy)](https://gitlab.mpcdf.mpg.de/ift/NIFTy)"
and a citation to one of the [publications](http://ift.pages.mpcdf.de/nifty/citations.html).
and a citation to one of the [publications](https://ift.pages.mpcdf.de/nifty/citations.html).
### Licensing terms
......@@ -137,15 +128,6 @@ The NIFTy package is licensed under the terms of the
*without any warranty*.
Contributing
------------
Please note our convention not to use pure Python `assert` statements in
production code. They are not guaranteed to be executed by Python and can be
turned off by the user (`python -O` in CPython). As an alternative, use
`ift.myassert`.
Contributors
------------
......
import nifty8
needs_sphinx = '3.2.0'
extensions = [
'sphinx.ext.napoleon', # Support for NumPy and Google style docstrings
'sphinx.ext.imgmath', # Render math as images
......@@ -26,8 +28,19 @@ language = None
exclude_patterns = []
add_module_names = False
html_theme = "sphinx_rtd_theme"
html_theme = "pydata_sphinx_theme"
html_logo = 'nifty_logo_black.png'
html_theme_options = {
"icon_links": [
{
"name": "PyPI",
"url": "https://pypi.org/project/nifty8",
"icon": "fas fa-box",
}
],
"gitlab_url": "https://gitlab.mpcdf.mpg.de/ift/nifty",
}
html_last_updated_fmt = '%b %d, %Y'
exclude_patterns = [
'mod/modules.rst', 'mod/nifty8.git_version.rst', 'mod/nifty8.logger.rst'
......
Contributing to NIFTy
=====================
Coding conventions
------------------
We do not use pure Python `assert` statements in production code. They are not
guaranteed to be executed by Python and can be turned off by the user
(`python -O` in CPython). As an alternative, use `ift.myassert`.
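The convention can be illustrated with a minimal sketch of such a helper (the actual `ift.myassert` implementation in NIFTy may differ in detail):

```python
def myassert(condition, msg="assertion failed"):
    # Unlike a plain `assert` statement, this check is an ordinary function
    # call, so it still runs when Python is started with -O (which strips
    # `assert` statements from the bytecode entirely).
    if not condition:
        raise AssertionError(msg)

myassert(2 + 2 == 4)   # passes silently, even under -O
try:
    myassert(1 > 2, "1 is not greater than 2")
except AssertionError as e:
    print(e)           # the check fires regardless of optimization flags
```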
Build the documentation
-----------------------
To build the documentation from source, install `sphinx
<https://www.sphinx-doc.org/en/stable/index.html>`_ and the `pydata sphinx theme
<https://github.com/pydata/pydata-sphinx-theme>`_ on your system and run
.. code-block:: sh
sh docs/generate.sh
NIFTy -- Numerical Information Field Theory
===========================================
NIFTy Manual
============
**NIFTy** [1]_ [2]_ [3]_, "\ **N**\umerical **I**\nformation **F**\ield **T**\heor\ **y**\ ", is a versatile library designed to enable the development of signal inference algorithms that are independent of the underlying grids (spatial, spectral, temporal, …) and their resolutions.
Its object-oriented framework is written in Python, although it accesses libraries written in C++ and C for efficiency.
NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on these fields into classes.
This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory.
NIFTy's interface is designed to resemble IFT formulae in the sense that the user implements algorithms in NIFTy independent of the topology of the underlying spaces and the discretization scheme.
Thus, the user can develop algorithms on subsets of problems and on spaces where the detailed performance of the algorithm can be properly evaluated and then easily generalize them to other, more complex spaces and the full problem, respectively.
The set of spaces on which NIFTy operates comprises point sets, *n*-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those.
NIFTy takes care of numerical subtleties like the normalization of operations on fields and the numerical representation of model components, allowing the user to focus on formulating the abstract inference procedures and process-specific model properties.
References
----------
.. [1] Selig et al., "NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference", 2013, Astronomy and Astrophysics 554, 26; `[DOI] <https://ui.adsabs.harvard.edu/link_gateway/2013A&A...554A..26S/doi:10.1051/0004-6361/201321236>`_, `[arXiv:1301.4499] <https://arxiv.org/abs/1301.4499>`_
.. [2] Steininger et al., "NIFTy 3 - Numerical Information Field Theory - A Python framework for multicomponent signal inference on HPC clusters", 2017, accepted by Annalen der Physik; `[arXiv:1708.01073] <https://arxiv.org/abs/1708.01073>`_
.. [3] Arras et al., "NIFTy5: Numerical Information Field Theory v5", 2019, Astrophysics Source Code Library; `[ascl:1903.008] <http://ascl.net/1903.008>`_
Contents
........
Welcome to the nifty8 documentation!
.. toctree::
:maxdepth: 2
:maxdepth: 1
ift
volume
Gallery <https://wwwmpa.mpa-garching.mpg.de/~ensslin/nifty-gallery/index.html>
installation
code
citations
Package Documentation <mod/nifty8>
User Guide <user/index>
API reference <mod/nifty8>
Development <dev/index>
Approximate Inference
=====================
In Variational Inference (VI), the posterior :math:`\mathcal{P}(\xi|d)` is approximated by a simpler, parametrized distribution, often a Gaussian :math:`\mathcal{Q}(\xi)=\mathcal{G}(\xi-m,D)`.
The parameters of :math:`\mathcal{Q}`, the mean :math:`m` and its covariance :math:`D` are obtained by minimization of an appropriate information distance measure between :math:`\mathcal{Q}` and :math:`\mathcal{P}`.
As a compromise between being optimal and being computationally affordable, the variational Kullback-Leibler (KL) divergence is used:
.. math::
\mathrm{KL}(m,D|d)= \mathcal{D}_\mathrm{KL}(\mathcal{Q}||\mathcal{P})=
\int \mathcal{D}\xi \,\mathcal{Q}(\xi) \log \left( \frac{\mathcal{Q}(\xi)}{\mathcal{P}(\xi)} \right)
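As a concrete illustration of this divergence, the following self-contained sketch (toy one-dimensional Gaussians, not a NIFTy model) compares a Monte Carlo estimate of :math:`\mathcal{D}_\mathrm{KL}(\mathcal{Q}||\mathcal{P})` with the closed form for two Gaussians:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the approximation Q and the posterior P (all numbers
# hypothetical): Q = N(0, 1), P = N(1, 2^2)
m_q, s_q, m_p, s_p = 0.0, 1.0, 1.0, 2.0

def log_gauss(x, m, s):
    return -0.5 * ((x - m) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))

# Monte Carlo estimate of KL(Q||P) = E_Q[log Q - log P], i.e. the integral
# above with samples xi drawn from Q
xi = rng.normal(m_q, s_q, size=200_000)
kl_mc = np.mean(log_gauss(xi, m_q, s_q) - log_gauss(xi, m_p, s_p))

# Closed form for two 1-D Gaussians, for comparison
kl_exact = np.log(s_p / s_q) + (s_q**2 + (m_q - m_p)**2) / (2 * s_p**2) - 0.5
```

The two numbers agree up to Monte Carlo noise, which is exactly how NIFTy's KL energies treat the divergence: as a sample average rather than an exact integral.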
NIFTy features two main alternatives for variational inference: Metric Gaussian Variational Inference (MGVI) and geometric Variational Inference (geoVI).
A visual comparison of the MGVI and GeoVI algorithm can be found in `variational_inference_visualized.py <https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/demos/variational_inference_visualized.py>`_.
Metric Gaussian Variational Inference (MGVI)
--------------------------------------------
Minimizing the KL divergence with respect to all entries of the covariance :math:`D` is infeasible for fields.
Therefore, Metric Gaussian Variational Inference (MGVI, [1]_) approximates the posterior precision matrix :math:`D^{-1}` at the location of the current mean :math:`m` by the Bayesian Fisher information metric,
.. math::
M \approx \left\langle \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi} \, \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi}^\dagger \right\rangle_{(d,\xi)}.
In practice, the average is performed over :math:`\mathcal{P}(d,\xi)\approx \mathcal{P}(d|\xi)\,\delta(\xi-m)` by evaluating the expression at the current mean :math:`m`.
This results in the Fisher information metric of the likelihood evaluated at the mean plus the prior information metric.
Therefore, only the mean of the approximate distribution remains to be inferred.
The only term within the KL-divergence that explicitly depends on it is the Hamiltonian of the true problem averaged over the approximation:
.. math::
\mathrm{KL}(m|d) \;\widehat{=}\;
\left\langle \mathcal{H}(\xi,d) \right\rangle_{\mathcal{Q}(\xi)},
where :math:`\widehat{=}` expresses equality up to irrelevant (here not :math:`m`-dependent) terms.
Thus, only the gradient of the KL with respect to the mean :math:`m` is needed, which can be expressed as
.. math::
\frac{\partial \mathrm{KL}(m|d)}{\partial m} = \left\langle \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi} \right\rangle_{\mathcal{G}(\xi-m,D)}.
We estimate the KL-divergence and its gradient stochastically with a set of samples drawn from the approximate posterior distribution.
The particular structure of the covariance allows us to draw independent samples by solving a certain system of equations.
This KL-divergence for MGVI is implemented by
:func:`~nifty8.minimization.kl_energies.MetricGaussianKL` within NIFTy8.
Note that MGVI typically provides only a lower bound on the variance.
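The sampled-gradient idea can be sketched in one dimension with a toy Hamiltonian (all numbers and names here are hypothetical illustrations, not the actual NIFTy implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D problem: Gaussian likelihood with noise variance 0.5, unit
# Gaussian prior, so H(d, xi) = (xi - d)^2 / (2 * 0.5) + xi^2 / 2
d, noise_var = 1.5, 0.5

def grad_H(xi):
    # dH/dxi: likelihood gradient plus prior gradient
    return (xi - d) / noise_var + xi

# Fisher metric of this model = likelihood precision + prior precision;
# MGVI approximates the posterior covariance as D ~ M^{-1}
M = 1.0 / noise_var + 1.0
m = 0.8                      # current mean estimate

# Stochastic KL gradient: average dH/dxi over samples xi ~ G(xi - m, D)
samples = m + rng.normal(0.0, np.sqrt(1.0 / M), size=100_000)
grad_mc = grad_H(samples).mean()

# Since grad_H is linear in xi here, the expectation is exact at the mean
grad_exact = grad_H(m)
```

For this linear-gradient toy case the sample average converges to the gradient evaluated at the mean; for genuinely non-linear models the sampling is what makes the estimate tractable.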
Geometric Variational Inference (geoVI)
---------------------------------------
For non-linear posterior distributions :math:`\mathcal{P}(\xi|d)` an approximation with a Gaussian :math:`\mathcal{Q}(\xi)` in the coordinates :math:`\xi` is sub-optimal, as higher order interactions are ignored.
A better approximation can be achieved by constructing a coordinate system :math:`y = g\left(\xi\right)` in which the posterior is close to a Gaussian, and performing VI with a Gaussian :math:`\mathcal{Q}(y)` in these coordinates.
This approach is called Geometric Variational Inference (geoVI).
It is discussed in detail in [2]_.
One useful coordinate system is obtained when the metric :math:`M` of the posterior can be expressed as the pullback of the Euclidean metric by :math:`g`:
.. math::
M = \left(\frac{\partial g}{\partial \xi}\right)^T \frac{\partial g}{\partial \xi} \ .
In general, such a transformation exists only locally, i.e. in a neighbourhood of some expansion point :math:`\bar{\xi}`, denoted as :math:`g_{\bar{\xi}}\left(\xi\right)`.
Using :math:`g_{\bar{\xi}}`, the geoVI scheme employs a zero-mean, unit-covariance Gaussian approximation :math:`\mathcal{Q}(y) = \mathcal{G}(y, 1)`.
It can be expressed in :math:`\xi` coordinates via the pushforward by the inverse transformation :math:`\xi = g_{\bar{\xi}}^{-1}(y)`:
.. math::
\mathcal{Q}_{\bar{\xi}}(\xi) = \left(g_{\bar{\xi}}^{-1} * \mathcal{Q}\right)(\xi) = \int \delta\left(\xi - g_{\bar{\xi}}^{-1}(y)\right) \ \mathcal{G}(y, 1) \ \mathcal{D}y \ ,
where :math:`\delta` denotes the Dirac delta distribution.
GeoVI obtains the optimal expansion point :math:`\bar{\xi}` such that :math:`\mathcal{Q}_{\bar{\xi}}` matches the posterior as well as possible.
Analogous to the MGVI algorithm, :math:`\bar{\xi}` is obtained by minimization of the KL-divergence between :math:`\mathcal{P}` and :math:`\mathcal{Q}_{\bar{\xi}}` w.r.t. :math:`\bar{\xi}`.
Furthermore, the KL is represented as a stochastic estimate using a set of samples drawn from :math:`\mathcal{Q}_{\bar{\xi}}`, which is implemented in NIFTy8 via :func:`~nifty8.minimization.kl_energies.GeoMetricKL`.
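The pullback construction above can be checked numerically with a toy transformation (the function `g` and the expansion point below are hypothetical illustrations, not part of NIFTy):

```python
import numpy as np

# Hypothetical non-linear coordinate transform g: R^2 -> R^2
def g(xi):
    return np.array([xi[0] + 0.3 * xi[1] ** 2, np.sinh(xi[1])])

def jacobian(f, xi, eps=1e-6):
    # Numerical Jacobian dg/dxi via central differences
    n = xi.size
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(xi + e) - f(xi - e)) / (2 * eps)
    return J

xi_bar = np.array([0.5, -1.0])   # expansion point
J = jacobian(g, xi_bar)
M = J.T @ J                      # pullback of the Euclidean metric by g

# M is a valid metric: symmetric with strictly positive eigenvalues
assert np.allclose(M, M.T)
assert np.all(np.linalg.eigvalsh(M) > 0)
```

The resulting :math:`M` is symmetric positive definite by construction, which is exactly the property the local transformation :math:`g_{\bar{\xi}}` exploits.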
Publications
------------
If you use MGVI or geoVI, the authors of the respective papers [1]_ [2]_ would greatly appreciate a citation.
.. [1] J. Knollmüller, T.A. Enßlin, "Metric Gaussian Variational Inference"; `[arXiv:1901.11033] <https://arxiv.org/abs/1901.11033>`_
.. [2] P. Frank, R. Leike, and T.A. Enßlin (2021), "Geometric Variational Inference"; `[arXiv:2105.10470] <https://arxiv.org/abs/2105.10470>`_ `[doi] <https://doi.org/10.3390/e23070853>`_
NIFTy-related publications
==========================
::
.. parsed-literal::
@article{asclnifty5,
title={NIFTy5: Numerical Information Field Theory v5},
author={Arras, Philipp and Baltac, Mihai and Ensslin, Torsten A and Frank, Philipp and Hutschenreuter, Sebastian and Knollmueller, Jakob and Leike, Reimar and Newrzella, Max-Niklas and Platz, Lukas and Reinecke, Martin and others},
......@@ -9,6 +9,7 @@ NIFTy-related publications
year={2019}
}
.. parsed-literal::
@software{nifty,
author = {{Martin Reinecke, Theo Steininger, Marco Selig}},
title = {NIFTy -- Numerical Information Field TheorY},
......@@ -17,7 +18,8 @@ NIFTy-related publications
date = {2018-04-05},
}
@article{2013A&A...554A..26S,
.. parsed-literal::
@article{nifty1,
author = {{Selig}, M. and {Bell}, M.~R. and {Junklewitz}, H. and {Oppermann}, N. and {Reinecke}, M. and {Greiner}, M. and {Pachajoa}, C. and {En{\ss}lin}, T.~A.},
title = "{NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference}",
journal = {\aap},
......@@ -35,7 +37,8 @@ NIFTy-related publications
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@article{2017arXiv170801073S,
.. parsed-literal::
@article{nifty3,
author = {{Steininger}, T. and {Dixit}, J. and {Frank}, P. and {Greiner}, M. and {Hutschenreuter}, S. and {Knollm{\"u}ller}, J. and {Leike}, R. and {Porqueres}, N. and {Pumpe}, D. and {Reinecke}, M. and {{\v S}raml}, M. and {Varady}, C. and {En{\ss}lin}, T.},
title = "{NIFTy 3 - Numerical Information Field Theory - A Python framework for multicomponent signal inference on HPC clusters}",
journal = {ArXiv e-prints},
......@@ -48,3 +51,18 @@ NIFTy-related publications
adsurl = {http://cdsads.u-strasbg.fr/abs/2017arXiv170801073S},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
.. parsed-literal::
@article{geovi,
author = {Frank, Philipp and Leike, Reimar and Enßlin, Torsten A.},
title = {Geometric Variational Inference},
journal = {Entropy},
volume = {23},
year = {2021},
number = {7},
article-number = {853},
url = {https://www.mdpi.com/1099-4300/23/7/853},
issn = {1099-4300},
doi = {10.3390/e23070853}
}
IFT -- Information Field Theory
===============================
Information Field Theory
========================
Theoretical Background
----------------------
......@@ -145,14 +145,14 @@ Here, the uncertainty of the field and the power spectrum of its generating proc
+----------------------------------------------------+
| **Output of tomography demo getting_started_3.py** |
+----------------------------------------------------+
| .. image:: images/getting_started_3_setup.png |
| .. image:: ../images/getting_started_3_setup.png |
| |
+----------------------------------------------------+
| Non-Gaussian signal field, |
| data backprojected into the image domain, power |
| spectrum of underlying Gaussian process. |
+----------------------------------------------------+
| .. image:: images/getting_started_3_results.png |
| .. image:: ../images/getting_started_3_results.png |
| |
+----------------------------------------------------+
| Posterior mean field signal |
......@@ -181,81 +181,3 @@ In the high dimensional setting of field inference these volume factors can diff
A MAP estimate, which is only representative for a tiny fraction of the parameter space, might be a poorer choice (with respect to an error norm) compared to a slightly worse location with slightly lower posterior probability, which, however, is associated with a much larger volume (of nearby locations with similar probability).
This causes MAP signal estimates to be more prone to overfitting the noise as well as to perception thresholds than methods that take volume effects into account.
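The mode-versus-mass effect is already visible in one dimension with a skewed toy posterior (a Gamma distribution, chosen purely for illustration):

```python
from scipy import stats

# Toy skewed "posterior": a Gamma distribution with shape parameter k
k = 2.0
post = stats.gamma(a=k)

map_estimate = k - 1.0        # mode of Gamma(k) with unit scale
posterior_mean = post.mean()  # equals k

# The mode sits where the density peaks, but most of the probability
# mass (volume) lies to its right, pulling the mean away from the MAP
# point; in high dimensions this discrepancy can become drastic.
print(map_estimate, posterior_mean)   # 1.0 vs 2.0
```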
Metric Gaussian Variational Inference
-------------------------------------
One method that takes volume effects into account is Variational Inference (VI).
In VI, the posterior :math:`\mathcal{P}(\xi|d)` is approximated by a simpler, parametrized distribution, often a Gaussian :math:`\mathcal{Q}(\xi)=\mathcal{G}(\xi-m,D)`.
The parameters of :math:`\mathcal{Q}`, the mean :math:`m` and its covariance :math:`D` are obtained by minimization of an appropriate information distance measure between :math:`\mathcal{Q}` and :math:`\mathcal{P}`.
As a compromise between being optimal and being computationally affordable, the variational Kullback-Leibler (KL) divergence is used:
.. math::
\mathrm{KL}(m,D|d)= \mathcal{D}_\mathrm{KL}(\mathcal{Q}||\mathcal{P})=
\int \mathcal{D}\xi \,\mathcal{Q}(\xi) \log \left( \frac{\mathcal{Q}(\xi)}{\mathcal{P}(\xi)} \right)
Minimizing this with respect to all entries of the covariance :math:`D` is infeasible for fields.
Therefore, Metric Gaussian Variational Inference (MGVI) approximates the posterior precision matrix :math:`D^{-1}` at the location of the current mean :math:`m` by the Bayesian Fisher information metric,
.. math::
M \approx \left\langle \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi} \, \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi}^\dagger \right\rangle_{(d,\xi)}.
In practice, the average is performed over :math:`\mathcal{P}(d,\xi)\approx \mathcal{P}(d|\xi)\,\delta(\xi-m)` by evaluating the expression at the current mean :math:`m`.
This results in the Fisher information metric of the likelihood evaluated at the mean plus the prior information metric.
Therefore, only the mean of the approximate distribution remains to be inferred.
The only term within the KL-divergence that explicitly depends on it is the Hamiltonian of the true problem averaged over the approximation:
.. math::
\mathrm{KL}(m|d) \;\widehat{=}\;
\left\langle \mathcal{H}(\xi,d) \right\rangle_{\mathcal{Q}(\xi)},
where :math:`\widehat{=}` expresses equality up to irrelevant (here not :math:`m`-dependent) terms.
Thus, only the gradient of the KL with respect to the mean :math:`m` is needed, which can be expressed as
.. math::
\frac{\partial \mathrm{KL}(m|d)}{\partial m} = \left\langle \frac{\partial \mathcal{H}(d,\xi)}{\partial \xi} \right\rangle_{\mathcal{G}(\xi-m,D)}.
We estimate the KL-divergence and its gradient stochastically with a set of samples drawn from the approximate posterior distribution.
The particular structure of the covariance allows us to draw independent samples by solving a certain system of equations.
This KL-divergence for MGVI is implemented by
:func:`~nifty8.minimization.kl_energies.MetricGaussianKL` within NIFTy8.
Note that MGVI typically provides only a lower bound on the variance.
Geometric Variational Inference
-------------------------------
For non-linear posterior distributions :math:`\mathcal{P}(\xi|d)` an approximation with a Gaussian :math:`\mathcal{Q}(\xi)` in the coordinates :math:`\xi` is sub-optimal, as higher order interactions are ignored.
A better approximation can be achieved by constructing a coordinate system :math:`y = g\left(\xi\right)` in which the posterior is close to a Gaussian, and performing VI with a Gaussian :math:`\mathcal{Q}(y)` in these coordinates.
This approach is called Geometric Variational Inference (geoVI).
It is discussed in detail in [6]_.
One useful coordinate system is obtained when the metric :math:`M` of the posterior can be expressed as the pullback of the Euclidean metric by :math:`g`:
.. math::
M = \left(\frac{\partial g}{\partial \xi}\right)^T \frac{\partial g}{\partial \xi} \ .
In general, such a transformation exists only locally, i.e. in a neighbourhood of some expansion point :math:`\bar{\xi}`, denoted as :math:`g_{\bar{\xi}}\left(\xi\right)`.
Using :math:`g_{\bar{\xi}}`, the geoVI scheme employs a zero-mean, unit-covariance Gaussian approximation :math:`\mathcal{Q}(y) = \mathcal{G}(y, 1)`.
It can be expressed in :math:`\xi` coordinates via the pushforward by the inverse transformation :math:`\xi = g_{\bar{\xi}}^{-1}(y)`:
.. math::
\mathcal{Q}_{\bar{\xi}}(\xi) = \left(g_{\bar{\xi}}^{-1} * \mathcal{Q}\right)(\xi) = \int \delta\left(\xi - g_{\bar{\xi}}^{-1}(y)\right) \ \mathcal{G}(y, 1) \ \mathcal{D}y \ ,
where :math:`\delta` denotes the Dirac delta distribution.
GeoVI obtains the optimal expansion point :math:`\bar{\xi}` such that :math:`\mathcal{Q}_{\bar{\xi}}` matches the posterior as well as possible.
Analogous to the MGVI algorithm, :math:`\bar{\xi}` is obtained by minimization of the KL-divergence between :math:`\mathcal{P}` and :math:`\mathcal{Q}_{\bar{\xi}}` w.r.t. :math:`\bar{\xi}`.
Furthermore, the KL is represented as a stochastic estimate using a set of samples drawn from :math:`\mathcal{Q}_{\bar{\xi}}`, which is implemented in NIFTy8 via :func:`~nifty8.minimization.kl_energies.GeoMetricKL`.
A visual comparison of the MGVI and GeoVI algorithm can be found in `variational_inference_visualized.py <https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/demos/variational_inference_visualized.py>`_.
.. [6] P. Frank, R. Leike, and T.A. Enßlin (2021), "Geometric Variational Inference"; `[arXiv:2105.10470] <https://arxiv.org/abs/2105.10470>`_
NIFTy user guide
================
This guide gives an overview and explains the main idea behind NIFTy. More details
can be found in the `API reference <../mod/nifty8.html>`_.
.. toctree::
:maxdepth: 1
whatisnifty
installation
ift
approximate_inference
volume
code
citations
......@@ -29,7 +29,8 @@ MPI support is added via::
NIFTy documentation is provided by Sphinx. To build the documentation::
sudo apt-get install python3-sphinx-rtd-theme dvipng
sudo apt-get install dvipng texlive-latex-base texlive-latex-extra
pip3 install sphinx pydata-sphinx-theme
cd <nifty_directory>
sh docs/generate.sh
......@@ -39,4 +40,3 @@ To view the documentation in firefox::
(Note: Make sure that you reinstall nifty after each change since sphinx
imports nifty from the Python path.)
Discretisation and Volume in NIFTy
==================================
Discretisation and Volume
=========================
.. note:: Some of this discussion is rather technical and may be skipped in a first read-through.
......@@ -12,7 +12,7 @@ Fields are defined to be scalar functions on the manifold, living in the functio
Unless we find ourselves in the lucky situation that we can solve for the posterior statistics of interest analytically, we need to apply numerical methods.
This is where NIFTy comes into play.
.. figure:: images/inference.png
.. figure:: ../images/inference.png
:width: 80%
:align: center
......@@ -138,7 +138,7 @@ NIFTy is implemented such that in order to change resolution, only the line of c
It automatically takes care of dependent structures like volume factors, discretised operators and responses.
A visualisation of this can be seen in figure 2, which displays the MAP inference of a signal at various resolutions.
.. figure:: images/converging_discretization.png
.. figure:: ../images/converging_discretization.png
:scale: 80%
:align: center
......
What is NIFTy?
==============
**NIFTy** [1]_ [2]_ [3]_, "\ **N**\umerical **I**\nformation **F**\ield **T**\heor\ **y**\ ", is a versatile library designed to enable the development of signal inference algorithms that are independent of the underlying grids (spatial, spectral, temporal, …) and their resolutions.
Its object-oriented framework is written in Python, although it accesses libraries written in C++ and C for efficiency.
NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on these fields into classes.
This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory.
NIFTy's interface is designed to resemble IFT formulae in the sense that the user implements algorithms in NIFTy independent of the topology of the underlying spaces and the discretization scheme.
Thus, the user can develop algorithms on subsets of problems and on spaces where the detailed performance of the algorithm can be properly evaluated and then easily generalize them to other, more complex spaces and the full problem, respectively.
The set of spaces on which NIFTy operates comprises point sets, *n*-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those.
NIFTy takes care of numerical subtleties like the normalization of operations on fields and the numerical representation of model components, allowing the user to focus on formulating the abstract inference procedures and process-specific model properties.
Examples of NIFTy applications can be found in the `NIFTy gallery (external link) <https://wwwmpa.mpa-garching.mpg.de/~ensslin/nifty-gallery/index.html>`_.
References
----------
.. [1] Selig et al., "NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference", 2013, Astronomy and Astrophysics 554, 26; `[DOI] <https://ui.adsabs.harvard.edu/link_gateway/2013A&A...554A..26S/doi:10.1051/0004-6361/201321236>`_, `[arXiv:1301.4499] <https://arxiv.org/abs/1301.4499>`_
.. [2] Steininger et al., "NIFTy 3 - Numerical Information Field Theory - A Python framework for multicomponent signal inference on HPC clusters", 2017, accepted by Annalen der Physik; `[arXiv:1708.01073] <https://arxiv.org/abs/1708.01073>`_
.. [3] Arras et al., "NIFTy5: Numerical Information Field Theory v5", 2019, Astrophysics Source Code Library; `[ascl:1903.008] <http://ascl.net/1903.008>`_
[build-system]
requires = ["setuptools >= 40.6.0", "numpy >= 1.17.0", "scipy >= 1.4.1"]
build-backend = "setuptools.build_meta"
......@@ -11,44 +11,52 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# Copyright(C) 2013-2019 Max-Planck-Society
# Copyright(C) 2013-2021 Max-Planck-Society
#
# NIFTy is being developed at the Max-Planck-Institut fuer Astrophysik.
from setuptools import find_packages, setup
import os
import site
import sys
def write_version():
import subprocess
try:
p = subprocess.Popen(["git", "describe", "--dirty", "--tags", "--always"],
stdout=subprocess.PIPE)
res = p.communicate()[0].strip().decode('utf-8')
except FileNotFoundError:
print("Could not determine version string from git history")
res = "unknown"
with open("nifty8/git_version.py", "w") as file:
file.write('gitversion = "{}"\n'.format(res))
from setuptools import find_packages, setup
# Workaround until https://github.com/pypa/pip/issues/7953 is fixed
site.ENABLE_USER_SITE = "--user" in sys.argv[1:]
write_version()
exec(open('nifty8/version.py').read())
with open("README.md") as f:
long_description = f.read()
description = """NIFTy, Numerical Information Field Theory, is a versatile
library designed to enable the development of signal inference algorithms that
operate regardless of the underlying grids and their resolutions."""
setup(name="nifty8",
version=__version__,
author="Theo Steininger, Martin Reinecke",
author="Martin Reinecke",
author_email="martin@mpa-garching.mpg.de",
description="Numerical Information Field Theory",
url="http://www.mpa-garching.mpg.de/ift/nifty/",
description=description,
long_description=long_description,
long_description_content_type="text/markdown",
url="https://ift.pages.mpcdf.de/nifty/",
project_urls={
"Bug Tracker": "https://gitlab.mpcdf.mpg.de/ift/nifty/issues",
"Documentation": "https://ift.pages.mpcdf.de/nifty/",
"Source Code": "https://gitlab.mpcdf.mpg.de/ift/nifty",
"Changelog": "https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/ChangeLog",
},
packages=find_packages(include=["nifty8", "nifty8.*"]),
zip_safe=True,
license="GPLv3",
setup_requires=['scipy>=1.4.1', 'numpy>=1.17'],
install_requires=['scipy>=1.4.1', 'numpy>=1.17'],
python_requires='>=3.6',
classifiers=[
"Development Status :: 4 - Beta",
"Topic :: Utilities",
"License :: OSI Approved :: GNU General Public License v3 "
"or later (GPLv3+)"],
"Development Status :: 5 - Production/Stable",
"Topic :: Scientific/Engineering :: Mathematics",
"Topic :: Scientific/Engineering :: Physics",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Intended Audience :: Science/Research"],
)
......@@ -380,13 +380,12 @@ def MetricGaussianKL(mean, hamiltonian, n_samples, mirror_samples, constants=[],
these occasions but rather the minimizer is told that the position it
has tried is not sensible.
Notes
-----
Note
----
The two lists `constants` and `point_estimates` are independent from each
other. It is possible to sample along domains which are kept constant
during minimization and vice versa.