Commit e8212f36 by Philipp Arras, Aug 27, 2018: "Remove part about metric" (parent d7a4ae80). Changed file: main.tex
...
in a fashion which is compatible with our inference machinery called NIFTy.
\end{itemize}
\section*{Mathematical description of what NIFTy needs}
In the end, the Bayesian inference amounts to a minimization problem. NIFTy comes
with a variety of minimizers to solve it; these minimizers differ in their order.
The abstract definition of what you need to implement is very short. If you want
to use only first-order minimizers, you must provide the following two functions:
\begin{itemize}
	\item The log-likelihood $\mathcal H(d|s) := -\log \mathcal P(d|s)$ as a function of $s$.
	\item Its derivative $\mathcal H' = \frac{\partial}{\partial s} \mathcal H(d|s)$.
\end{itemize}
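As a concrete, self-contained sketch of these two functions (plain NumPy, not NIFTy code), consider a Gaussian likelihood $\mathcal P(d|s) = \mathcal G(d - Rs, N)$ with a hypothetical linear response $R$ and noise covariance $N$; all names below are illustrative assumptions, not part of any NIFTy interface:

```python
import numpy as np

# Toy setup (all quantities are made up for illustration).
rng = np.random.default_rng(0)
R = rng.normal(size=(5, 3))        # linear measurement response
N_inv = np.eye(5) / 0.1            # inverse noise covariance, N = 0.1 * identity
d = rng.normal(size=5)             # observed data

def H(s):
    """Negative log-likelihood H(d|s) = -log P(d|s), up to an s-independent constant."""
    r = d - R @ s                  # residual between data and prediction
    return 0.5 * r @ N_inv @ r

def H_prime(s):
    """Derivative dH/ds via the chain rule: H'(s) = -R^T N^{-1} (d - R s)."""
    return -R.T @ N_inv @ (d - R @ s)

# Sanity check: compare the analytic gradient against central finite differences.
s0 = rng.normal(size=3)
eps = 1e-6
fd = np.array([(H(s0 + eps * e) - H(s0 - eps * e)) / (2 * eps) for e in np.eye(3)])
print(np.allclose(fd, H_prime(s0), atol=1e-4))  # True
```

The finite-difference check is a useful habit whenever you implement $\mathcal H'$ by hand, since sign and transposition errors in the chain rule are easy to make.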
If you want to use second-order minimizers (which is definitely worth it),
one additional object is needed:
\begin{itemize}
	\item $\langle \mathcal H' \mathcal H'^\dagger \rangle_{\mathcal P(d|s)}$.
\end{itemize}
Note that for Gaussian, Poissonian and Bernoulli likelihoods this term need not
be derived or implemented by hand, because NIFTy computes it automatically.
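To see why this term has a simple closed form in the Gaussian case, note that for $\mathcal P(d|s) = \mathcal G(d - Rs, N)$ the expectation $\langle \mathcal H' \mathcal H'^\dagger \rangle_{\mathcal P(d|s)}$ evaluates to $R^\dagger N^{-1} R$. The following sketch (plain NumPy; $R$ and $N$ are made-up quantities, not NIFTy objects) verifies this by Monte Carlo sampling of $d$:

```python
import numpy as np

# Hypothetical Gaussian-likelihood setup for illustration only.
rng = np.random.default_rng(1)
R = rng.normal(size=(5, 3))        # linear response
noise_var = 0.1
N_inv = np.eye(5) / noise_var      # inverse noise covariance

s = rng.normal(size=3)             # fixed signal at which we evaluate the term
n_samples = 200_000

# Draw data d ~ G(R s, N) and average the outer product H'(d) H'(d)^T,
# where H'(d) = -R^T N^{-1} (d - R s).
d = R @ s + rng.normal(scale=np.sqrt(noise_var), size=(n_samples, 5))
Hp = -(d - R @ s) @ N_inv @ R      # rows are H'(d_i)^T, shape (n_samples, 3)
mc_metric = Hp.T @ Hp / n_samples

closed_form = R.T @ N_inv @ R      # the analytic value of <H' H'^dagger>
print(np.allclose(mc_metric, closed_form, rtol=0.05, atol=0.5))  # True up to sampling noise
```

This closed form is what makes it possible for the term to be supplied automatically for such likelihoods: it only requires the response and the noise covariance, not a new derivation per model.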
That's it. The rest of this paper explains what these formulae mean and how to
compute them. From now on, our discussion will become increasingly specific.
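Before moving on, a toy illustration of why providing the second-order term pays off. The sketch below (plain NumPy, not NIFTy code; the quadratic $H$ is an invented example) contrasts a first-order minimizer, which needs only $\mathcal H'$, with a Newton step, which additionally uses curvature information:

```python
import numpy as np

# Invented quadratic "Hamiltonian" H(s) = 1/2 s^T A s - b^T s,
# with gradient H'(s) = A s - b and minimum at s* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def H_prime(s):
    return A @ s - b

# First-order minimizer: plain gradient descent, many small steps.
s = np.zeros(2)
for _ in range(200):
    s = s - 0.1 * H_prime(s)

# Second-order minimizer: one Newton step using the curvature (here the
# exact Hessian A) jumps straight to the minimum of a quadratic.
s_newton = np.zeros(2) - np.linalg.solve(A, H_prime(np.zeros(2)))

print(np.allclose(s, s_newton, atol=1e-4))  # True: both reach A^{-1} b
```

For a genuinely quadratic problem the Newton step is exact after one iteration, while gradient descent needs many; for the nonlinear problems treated later, the curvature term plays the analogous role of drastically reducing the number of iterations.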
...