Commit 7f76e3cb authored by Philipp Frank

restructured docs: include apidoc support for module index, added copy of gallery and introduction of nifty1 docs, changed to rtd theme and bug fixes
parent 1024df5e
Pipeline #12975 passed with stage in 4 minutes and 27 seconds
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source

.PHONY: help
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  applehelp  to make an Apple Help Book"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
	@echo "  coverage   to run coverage check of the documentation (if enabled)"

.PHONY: clean
clean:
	rm -rf $(BUILDDIR)/*

.PHONY: html
html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

.PHONY: dirhtml
dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

.PHONY: singlehtml
singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

.PHONY: pickle
pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

.PHONY: json
json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

.PHONY: htmlhelp
htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

.PHONY: qthelp
qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/NIFTY.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/NIFTY.qhc"

.PHONY: applehelp
applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

.PHONY: devhelp
devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/NIFTY"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/NIFTY"
	@echo "# devhelp"

.PHONY: epub
epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

.PHONY: latex
latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

.PHONY: latexpdf
latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: latexpdfja
latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: text
text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

.PHONY: man
man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

.PHONY: texinfo
texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

.PHONY: info
info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

.PHONY: gettext
gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

.PHONY: changes
changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

.PHONY: linkcheck
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

.PHONY: doctest
doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

.PHONY: coverage
coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."

.PHONY: xml
xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

.PHONY: pseudoxml
pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
Two-step creation of the web pages:

1. sphinx-apidoc -l -e -d 3 -o sphinx/source/mod/ nifty/ nifty/plotting/ nifty/spaces/power_space/power_indices.py nifty/spaces/power_space/power_index_factory.py nifty/config/ nifty/basic_arithmetics.py nifty/nifty_meta.py nifty/random.py nifty/version.py nifty/field_types/ nifty/operators/fft_operator/transformations/rg_transforms.py

   creates all .rst files necessary for the module index; the paths listed after nifty/ are exclude patterns for helper modules (-e puts each module on its own page, -d 3 limits the depth of the generated toctree).

2. sphinx-build -b html sphinx/source/ sphinx/build/

   generates the HTML files in the build directory.

conf.py:

@@ -16,6 +16,9 @@ from nifty import *
 import sys
 import os
+import sphinx_rtd_theme
 # If extensions (or modules to document with autodoc) are in another directory,
 # add these directories to sys.path here. If the directory is relative to the
@@ -77,7 +80,7 @@ master_doc = 'index'
 # General information about the project.
 project = u'NIFTY'
-copyright = u'2017, Theo Steininger'
+copyright = u'2013-2017, Max-Planck-Society'
 author = u'Theo Steininger'
 # The version info for the project you're documenting, acts as replacement for
@@ -85,9 +88,9 @@ author = u'Theo Steininger'
 # built documents.
 #
 # The short X.Y version.
-version = u'3.0.4'
+version = u'3.0'
 # The full version, including alpha/beta/rc tags.
-release = u'3.0.x'
+release = u'3.0.4'
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
@@ -138,18 +141,19 @@ todo_include_todos = True
 # The theme to use for HTML and HTML Help pages. See the documentation for
 # a list of builtin themes.
-html_theme = 'alabaster'
-#html_theme = 'classic'
-# Theme options are theme-specific and customize the look and feel of a theme
-# further. For a list of options available for each theme, see the
-# documentation.
+html_theme = "sphinx_rtd_theme"
 html_theme_options = {
-    'page_width':''
+    'collapse_navigation': False,
+    'display_version': False,
+    'navigation_depth': 3,
 }
+html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+#html_theme_options = {}
 # Add any paths that contain custom themes here, relative to this directory.
 #html_theme_path = []
@@ -163,7 +167,7 @@ html_theme_options = {
 # The name of an image file (relative to this directory) to place at the top
 # of the sidebar.
-html_logo = 'nifty_logo_black.png'
+html_logo = 'nifty_logo_white.png'
@@ -196,7 +200,7 @@ html_last_updated_fmt = '%b %d, %Y'
 #html_additional_pages = {}
 # If false, no module index is generated.
-#html_domain_indices = True
+html_domain_indices = False
 # If false, no index is generated.
 #html_use_index = True
...
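
Read together, the theme-related hunks leave conf.py roughly in the following state. This is a sketch assembled only from the lines visible in the diff above; the unchanged conf.py boilerplate between the hunks is omitted:

    import sphinx_rtd_theme

    project = u'NIFTY'
    copyright = u'2013-2017, Max-Planck-Society'
    author = u'Theo Steininger'

    version = u'3.0'      # the short X.Y version
    release = u'3.0.4'    # the full version, including alpha/beta/rc tags

    html_theme = "sphinx_rtd_theme"
    html_theme_options = {
        'collapse_navigation': False,   # do not collapse the sidebar navigation
        'display_version': False,       # hide the version number in the sidebar
        'navigation_depth': 3,          # depth of the sidebar table of contents
    }
    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]

    html_logo = 'nifty_logo_white.png'
    html_domain_indices = False         # do not generate the module index page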
.. currentmodule:: nifty
The ``DiagonalOperator`` class -- ...
.....................................
.. autoclass:: DiagonalOperator
    :show-inheritance:
    :members:
.. currentmodule:: nifty
The ``FFTOperator`` class -- Fourier Transformations
....................................................
.. autoclass:: FFTOperator
    :show-inheritance:
    :members:

@@ -11,7 +11,3 @@ In NIFTY, Fields are used to store data arrays and carry all the needed metainfo
     :show-inheritance:
     :members:
-    .. rubric:: Methods
-    .. autoautosummary:: Field
-        :methods:
Image Gallery
-------------
Transformations & Projections
.............................
.. currentmodule:: nifty
The "Faraday Map" [1]_ in spherical representation on a :py:class:`hp_space` and a :py:class:`gl_space`, their quadrupole projections, the uncertainty of the map, and the angular power spectrum.
+----------------------------+----------------------------+
| .. image:: images/f_00.png | .. image:: images/f_01.png |
| :width: 90 % | :width: 90 % |
+----------------------------+----------------------------+
| .. image:: images/f_02.png | .. image:: images/f_03.png |
| :width: 90 % | :width: 90 % |
+----------------------------+----------------------------+
| .. image:: images/f_04.png | .. image:: images/f_05.png |
| :width: 90 % | :width: 70 % |
+----------------------------+----------------------------+
Gaussian random fields
......................
Statistically homogeneous and isotropic Gaussian random fields drawn from different power spectra.
+----------------------------+----------------------------+
| .. image:: images/t_03.png | .. image:: images/t_04.png |
| :width: 60 % | :width: 70 % |
+----------------------------+----------------------------+
| .. image:: images/t_05.png | .. image:: images/t_06.png |
| :width: 60 % | :width: 70 % |
+----------------------------+----------------------------+
Wiener filtering I
..................
Wiener filter reconstruction of Gaussian random signal.
+--------------------------------+--------------------------------+--------------------------------+
| original signal | noisy data | reconstruction |
+================================+================================+================================+
| .. image:: images/rg1_s.png | .. image:: images/rg1_d.png | .. image:: images/rg1_m.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+--------------------------------+--------------------------------+--------------------------------+
| .. image:: images/rg2_s_pm.png | .. image:: images/rg2_d_pm.png | .. image:: images/rg2_m_pm.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+--------------------------------+--------------------------------+--------------------------------+
| .. image:: images/hp_s.png | .. image:: images/hp_d.png | .. image:: images/hp_m.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+--------------------------------+--------------------------------+--------------------------------+
Image reconstruction
....................
Image reconstruction of the classic "Moon Surface" image. The original image "Moon Surface" was taken from the `USC-SIPI image database <http://sipi.usc.edu/database/>`_.
+-----------------------------------+-----------------------------------+-----------------------------------+
| .. image:: images/moon_s.png | .. image:: images/moon_d.png | .. image:: images/moon_m.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+-----------------------------------+-----------------------------------+-----------------------------------+
| .. image:: images/moon_kernel.png | .. image:: images/moon_mask.png | .. image:: images/moon_sigma.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+-----------------------------------+-----------------------------------+-----------------------------------+
Wiener filtering II
...................
Wiener filter reconstruction results for the full and partially blinded data. Shown are the original signal (orange), the reconstruction (green), and the :math:`1\sigma` confidence interval (gray).
+--------------------------------------+--------------------------------------+
| noisy data | reconstruction results |
+======================================+======================================+
| .. image:: images/rg1_d.png | .. image:: images/rg1_m_err_.png |
| :width: 90 % | :width: 90 % |
+--------------------------------------+--------------------------------------+
| .. image:: images/rg1_d_gap.png | .. image:: images/rg1_m_gap_err_.png |
| :width: 90 % | :width: 90 % |
+--------------------------------------+--------------------------------------+
D\ :sup:`3`\ PO -- Denoising, Deconvolving, and Decomposing Photon Observations
...............................................................................
Application of the D\ :sup:`3`\ PO algorithm [2]_ showing the raw photon count data and the denoised, deconvolved, and decomposed reconstruction of the diffuse photon flux.
+--------------------------------------+--------------------------------------+
| .. image:: images/D3PO_data.png | .. image:: images/D3PO_diffuse.png |
| :width: 95 % | :width: 95 % |
+--------------------------------------+--------------------------------------+
RESOLVE -- Aperture synthesis imaging in radio astronomy
.........................................................
Signal inference on simulated single-frequency data: reconstruction by CLEAN (using uniform weighting) and by RESOLVE [3]_ (using IFT & NIFTY).
+-------------------------------------+-------------------------------------+-------------------------------------+
| .. image:: images/radio_signal.png | .. image:: images/radio_CLEAN.png | .. image:: images/radio_RESOLVE.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+-------------------------------------+-------------------------------------+-------------------------------------+
D\ :sup:`3`\ PO -- light
........................
Inference of the mock distribution of some species across Australia exploiting geospatial correlations in a (strongly) simplified scenario [4]_.
+--------------------------------+--------------------------------+--------------------------------+
| .. image:: images/au_data.png | .. image:: images/au_map.png | .. image:: images/au_error.png |
| :width: 90 % | :width: 90 % | :width: 90 % |
+--------------------------------+--------------------------------+--------------------------------+
NIFTY meets Lensing
...................
Signal reconstruction for a simulated image that has undergone strong gravitational lensing. Without *a priori* knowledge of the signal covariance :math:`S`, a common approach (rescaling the `Laplace operator <http://de.wikipedia.org/wiki/Laplace-Operator>`_) is compared with IFT's `"critical" filter <./demo_excaliwir.html#critical-wiener-filtering>`_.
+--------------------------------+--------------------------------+--------------------------------+--------------------------------+
| .. image:: images/lens_s0.png | .. image:: images/lens_d0.png | .. image:: images/lens_m1.png | .. image:: images/lens_m2.png |
| :width: 80 % | :width: 80 % | :width: 80 % | :width: 80 % |
| | | | |
| | | .. math:: | .. math:: |
| | | S(x,y) &= | S(x,y) &= |
| | | \lambda \: \Delta^{-1} | S(|x-y|) |
| | | \\ \equiv | \\ \equiv |
| | | S(k,l) &= \delta(k-l) | S(k,l) &= \delta(k-l) |
| | | \: \lambda \: k^{-2} | \: P(k) |
+--------------------------------+--------------------------------+--------------------------------+--------------------------------+
.. [1] N. Oppermann et al., "An improved map of the Galactic Faraday sky", Astronomy & Astrophysics, vol. 542, id. A93, p. 14; see also the `project homepage <http://www.mpa-garching.mpg.de/ift/faraday/>`_
.. [2] M. Selig et al., "Denoising, Deconvolving, and Decomposing Photon Observations", submitted to Astronomy & Astrophysics, 2013; `arXiv:1311.1888 <http://www.arxiv.org/abs/1311.1888>`_
.. [3] H. Junklewitz et al., "RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy", submitted to Astronomy & Astrophysics, 2013; `arXiv:1311.5282 <http://www.arxiv.org/abs/1311.5282>`_
.. [4] M. Selig, "The NIFTY way of Bayesian signal inference", submitted proceeding of the 33rd MaxEnt, 2013
IFT -- Information Field Theory
===============================
Theoretical Background
----------------------
`Information Field Theory <http://www.mpa-garching.mpg.de/ift/>`_ [1]_ (IFT) is information theory, the logic of reasoning under uncertainty, applied to fields. A field can be any quantity defined over some space, e.g. the air temperature over Europe, the magnetic field strength in the Milky Way, or the matter density in the Universe. IFT describes how data and knowledge can be used to infer field properties. Mathematically it is a statistical field theory and exploits many of the tools developed for such. Practically, it is a framework for signal processing and image reconstruction.
IFT is fully Bayesian. How else can infinitely many field degrees of freedom be constrained by finite data?
It can be used without knowledge of Feynman diagrams and comes with a full toolbox of methods. It reproduces many well-known, well-working algorithms, which should be reassuring, and there certainly were previous works in a similar spirit. In many cases, however, IFT provides novel and rigorous ways to extract information from data.
.. tip:: An *in-a-nutshell introduction to information field theory* can be found in [2]_.
.. [1] T. Ensslin et al., "Information field theory for cosmological perturbation reconstruction and nonlinear signal analysis", PhysRevD.80.105005, 09/2009; `arXiv:0806.3474 <http://www.arxiv.org/abs/0806.3474>`_
.. [2] T. Ensslin, "Information field theory", accepted for the proceedings of MaxEnt 2012 -- the 32nd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering; `arXiv:1301.2556 <http://arxiv.org/abs/1301.2556>`_
Discretized continuum
---------------------
Representing fields that are mathematically defined on a continuous space in a finite computer environment is a common necessity. The goal is to preserve the continuum limit in the calculus, so that the discretization is resolution independent.
+-----------------------------+-----------------------------+
| .. image:: images/42vs6.png | .. image:: images/42vs9.png |
| :width: 100 % | :width: 100 % |
+-----------------------------+-----------------------------+
Any partition of the continuous position space :math:`\Omega` (with volume :math:`V`) into a set of :math:`Q` disjoint, proper subsets :math:`\Omega_q` (with volumes :math:`V_q`) defines a pixelization,
.. math::

    \Omega &\quad=\quad \dot{\bigcup_q} \; \Omega_q \qquad \mathrm{with} \qquad q \in \{1,\dots,Q\} \subset \mathbb{N}
    , \\
    V &\quad=\quad \int_\Omega \mathrm{d}x \quad=\quad \sum_{q=1}^Q \int_{\Omega_q} \mathrm{d}x \quad=\quad \sum_{q=1}^Q V_q
    .
Here the number :math:`Q` characterizes the resolution of the pixelization and the continuum limit is described by :math:`Q \rightarrow \infty` and :math:`V_q \rightarrow 0` for all :math:`q \in \{1,\dots,Q\}` simultaneously. Moreover, the above equation defines a discretization of continuous integrals, :math:`\int_\Omega \mathrm{d}x \mapsto \sum_q V_q`.
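
As a concrete illustration (a minimal NumPy sketch, not NIFTY code; the names ``Q``, ``x_q`` and ``V_q`` simply mirror the symbols above), a regular pixelization of :math:`\Omega = [0,1]` reproduces the total volume and turns a continuous integral into the weighted sum :math:`\sum_q V_q \, s_q`::

    import numpy as np

    Q = 1000                              # number of pixels
    x_q = (np.arange(Q) + 0.5) / Q        # pixel centers in [0, 1]
    V_q = np.full(Q, 1.0 / Q)             # pixel volumes; their sum is V = 1

    s_q = np.sin(2 * np.pi * x_q)**2      # discretized field s(x) = sin(2*pi*x)^2

    print(V_q.sum())                      # -> 1.0, the total volume V
    print(np.sum(V_q * s_q))              # -> ~0.5, the integral of s over [0, 1]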
Any valid discretization scheme for a field :math:`{s}` can be described by a mapping,
.. math::

    s(x \in \Omega_q) \quad\mapsto\quad s_q \quad=\quad \int_{\Omega_q} \mathrm{d}x \; w_q(x) \; s(x)
    ,
if the weighting function :math:`w_q(x)` is chosen appropriately. In order for the discretized version of the field to converge to the actual field in the continuum limit, the weighting functions need to be normalized in each subset; i.e., :math:`\forall q: \int_{\Omega_q} \mathrm{d}x \; w_q(x) = 1`. Choosing a weighting function that is constant with respect to :math:`x` yields
.. math::

    s_q = \frac{\int_{\Omega_q} \mathrm{d}x \; s(x)}{\int_{\Omega_q} \mathrm{d}x} = \left< s(x) \right>_{\Omega_q}
    ,
which corresponds to a discretization of the field by spatial averaging. Another common and equally valid choice is :math:`w_q(x) = \delta(x-x_q)`, which distinguishes some position :math:`x_q \in \Omega_q`, and evaluates the continuous field at this position,
.. math::

    s_q \quad=\quad \int_{\Omega_q} \mathrm{d}x \; \delta(x-x_q) \; s(x) \quad=\quad s(x_q)
    .
In practice, one often makes use of the spatially averaged pixel position, :math:`x_q = \left< x \right>_{\Omega_q}`. If the resolution is high enough to resolve all features of the signal field :math:`{s}`, both of these discretization schemes approximate each other, :math:`\left< s(x) \right>_{\Omega_q} \approx s(\left< x \right>_{\Omega_q})`, since they approximate the continuum limit by construction. (The approximation of :math:`\left< s(x) \right>_{\Omega_q} \approx s(x_q \in \Omega_q)` marks a resolution threshold beyond which further refinement of the discretization reveals no new features; i.e., no new information content of the field :math:`{s}`.)
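
The agreement of the two schemes at sufficient resolution can be checked directly; the following is again a plain NumPy sketch on the toy pixelization of :math:`[0,1]`, not NIFTY code::

    import numpy as np

    def s(x):                                  # a smooth test field on [0, 1]
        return np.exp(-x) * np.sin(10 * x)

    Q = 200
    edges = np.linspace(0.0, 1.0, Q + 1)
    x_q = 0.5 * (edges[:-1] + edges[1:])       # spatially averaged pixel positions

    # scheme 1: spatial average over each pixel (constant weighting function),
    # approximated by a fine sub-grid of 100 points per pixel
    fine = (np.arange(Q * 100) + 0.5) / (Q * 100)
    s_avg = s(fine).reshape(Q, 100).mean(axis=1)

    # scheme 2: evaluation of the field at the pixel position (delta weighting)
    s_point = s(x_q)

    print(np.max(np.abs(s_avg - s_point)))     # small once Q resolves the field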
All operations involving position integrals can be normalized in accordance with the above definitions. For example, the scalar product between two fields :math:`{s}` and :math:`{u}` is defined as
.. math::

    {s}^\dagger {u} \quad=\quad \int_\Omega \mathrm{d}x \; s^*(x) \; u(x) \quad\approx\quad \sum_{q=1}^Q V_q^{\phantom{*}} \; s_q^* \; u_q^{\phantom{*}}
    ,
where :math:`\dagger` denotes adjunction and :math:`*` complex conjugation. Since the above approximation becomes an equality in the continuum limit, the scalar product is independent of the pixelization scheme and resolution, if the latter is sufficiently high.
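
In code, the discretized scalar product is simply a volume-weighted sum; as the following sketch shows (plain NumPy, with a hypothetical helper named ``scalar_product``), its value is essentially independent of the chosen resolution::

    import numpy as np

    def scalar_product(s_q, u_q, V_q):
        """Discretized s^dagger u  =  sum_q V_q s_q^* u_q."""
        return np.sum(V_q * np.conj(s_q) * u_q)

    for Q in (100, 1000, 10000):
        x_q = (np.arange(Q) + 0.5) / Q      # pixel centers on [0, 1]
        V_q = np.full(Q, 1.0 / Q)           # pixel volumes
        s_q = np.exp(2j * np.pi * x_q)      # two discretized test fields
        u_q = np.cos(2 * np.pi * x_q)
        print(Q, scalar_product(s_q, u_q, V_q))   # -> ~0.5 at every resolution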
The above line of argumentation analogously applies to the discretization of operators. For a linear operator :math:`{A}` acting on some field :math:`{s}` as :math:`{A} {s} = \int_\Omega \mathrm{d}y \; A(x,y) \; s(y)`, a matrix representation discretized with constant weighting functions is given by
.. math::

    A(x \in \Omega_p, y \in \Omega_q) \quad\mapsto\quad A_{pq} \quad=\quad \frac{\iint_{\Omega_p \Omega_q} \mathrm{d}x \, \mathrm{d}y \; A(x,y)}{\iint_{\Omega_p \Omega_q} \mathrm{d}x \, \mathrm{d}y} \quad=\quad \big< \big< A(x,y) \big>_{\Omega_p} \big>_{\Omega_q}
    .
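
A minimal sketch of such a matrix representation (plain NumPy; for simplicity the pixel-pair average :math:`\big< \big< A(x,y) \big>_{\Omega_p} \big>_{\Omega_q}` is approximated by evaluating the kernel at the pixel centers)::

    import numpy as np

    Q = 256
    x_q = (np.arange(Q) + 0.5) / Q               # pixel centers on [0, 1]
    V_q = np.full(Q, 1.0 / Q)                    # pixel volumes

    def A(x, y):                                 # example kernel: a narrow Gaussian
        return np.exp(-0.5 * ((x - y) / 0.05)**2)

    A_pq = A(x_q[:, None], x_q[None, :])         # discretized operator matrix

    s_q = np.sin(2 * np.pi * x_q)                # a discretized field
    As_q = A_pq @ (V_q * s_q)                    # (A s)_p  ~  sum_q A_pq V_q s_q

The volume factors :math:`V_q` play the role of :math:`\mathrm{d}y` in the continuous integral.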
The proper discretization of spaces, fields, and operators, as well as the normalization of position integrals, is essential for preserving the continuum limit. Their consistent implementation in NIFTY allows a pixelization-independent coding of algorithms.