ift / NIFTy · Commits

Commit 4ef02cc1, authored May 10, 2017 by Jakob Knollmueller
some changes and relaxed newton
parent 52192e16
Showing 3 changed files with 27 additions and 15 deletions (+27 −15)
nifty/minimization/line_searching/line_search.py   +1 −1
nifty/minimization/quasi_newton_minimizer.py       +7 −9
nifty/minimization/relaxed_newton.py               +19 −5
nifty/minimization/line_searching/line_search.py

@@ -24,7 +24,7 @@ from nifty import LineEnergy
 class LineSearch(Loggable, object):
-    """Class for finding a step size.
+    """Class for determining the optimal step size along some descent direction.

     Initialize the line search procedure which can be used by a specific line
     search method. Its finds the step size in a specific direction in the
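For context, the kind of step-size search this docstring describes can be sketched with a simple backtracking (Armijo) line search. This is an illustrative stand-in, not the `LineSearchStrongWolfe` implementation the repository actually uses; all names below are hypothetical:

```python
import numpy as np

def backtracking_line_search(f, grad, x, direction, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step size until the Armijo sufficient-decrease condition holds."""
    fx = f(x)
    # directional derivative along the search direction; negative for a descent direction
    slope = np.dot(grad(x), direction)
    while f(x + alpha * direction) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Quadratic test function f(x) = x^T x with gradient 2x
f = lambda x: np.dot(x, x)
grad = lambda x: 2.0 * x
x = np.array([1.0, 1.0])
step = backtracking_line_search(f, grad, x, -grad(x))
```

A strong-Wolfe search additionally bounds the curvature along the direction; the sufficient-decrease test above is the minimal core both share.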
nifty/minimization/quasi_newton_minimizer.py

@@ -26,18 +26,16 @@ from .line_searching import LineSearchStrongWolfe
 class QuasiNewtonMinimizer(Loggable, object):
-    """A Class used by other minimization methods to find local minimum.
+    """A class used by other minimization methods to find a local minimum.

-    QuasiNewton methods are used to find local minima or maxima of a function
-    by approximating the Jacobian or Hessian matrix at every iteration. The
-    class performs general steps(gets the gradient, descend direction, step
-    size and checks the conergence) which can be used then by a specific
-    minimization method.
+    Descent minimization methods are used to find a local minimum of a scalar
+    function by following a descent direction. This class implements the
+    minimization procedure, the descent direction has to be implemented
+    separately.

     Parameters
     ----------
     line_searcher : callable
-        Function which finds the step size into the descent direction. (default:
+        Function which finds the step size in descent direction. (default:
         LineSearchStrongWolfe())
     callback : function, *optional*
         Function f(energy, iteration_number) specified by the user to print

@@ -94,7 +92,7 @@ class QuasiNewtonMinimizer(Loggable, object):
         """Runs the minimization on the provided Energy class.

         Accepts the NIFTY Energy class which describes our system and it runs
-        the minimization to find the minimum/maximum of the system.
+        the minimization to find the minimum of the system.

         Parameters
         ----------

@@ -104,7 +102,7 @@ class QuasiNewtonMinimizer(Loggable, object):
         Returns
         -------
-        x : field
+        x : Field
             Latest `energy` of the minimization.
         convergence : integer
             Latest convergence level indicating whether the minimization
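The generic procedure the new docstring describes (get the gradient, get a descent direction, find a step size, check convergence, with the direction rule supplied separately) might look like the following sketch. The function names and the crude halving step size are illustrative assumptions, not the NIFTy API:

```python
import numpy as np

def minimize(f, grad, descent_direction, x, tol=1e-8, max_iter=100):
    """Generic descent minimization: the direction rule is supplied separately."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # convergence check on the gradient norm
            break
        d = descent_direction(x, g)
        # a fixed shrinking step stands in for a proper line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) >= fx and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
    return x

# Steepest descent (direction = -gradient) on f(x) = (x - 3)^2
f = lambda x: (x - 3.0) ** 2
grad = lambda x: 2.0 * (x - 3.0)
x_min = minimize(f, grad, lambda x, g: -g, np.array([0.0]))
```

Subclasses such as `RelaxedNewton` then only have to supply the `descent_direction` rule, which is exactly the split this commit's docstring edit emphasizes.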
nifty/minimization/relaxed_newton.py

@@ -51,12 +51,26 @@ class RelaxedNewton(QuasiNewtonMinimizer):
         self.line_searcher.prefered_initial_step_size = 1.

     def _get_descend_direction(self, energy):
+        """ Calculates the descent direction according to a Newton scheme.
+
+        The descent direction is determined by weighting the gradient at the
+        current parameter position with the inverse local curvature, provided
+        by the Energy object.
+
+        Parameters
+        ----------
+        energy : Energy
+            The energy object providing implementations of the to be
+            minimized function, its gradient and curvature.
+
+        Returns
+        -------
+        out : Field
+            Returns the descent direction with proposed step length. In a
+            quadratic potential this corresponds to the optimal step.
+        """
         gradient = energy.gradient
         curvature = energy.curvature
         descend_direction = curvature.inverse_times(gradient)
         return descend_direction * -1
-        #norm = descend_direction.norm()
-        # if norm != 1:
-        #     return descend_direction / norm
-        # else:
-        #     return descend_direction * 1
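The Newton direction documented here is d = −H⁻¹∇f: the gradient weighted by the inverse curvature, with the sign flipped. A minimal NumPy sketch of the same computation on a quadratic potential (plain arrays standing in for NIFTy's Energy/Field machinery, where `curvature.inverse_times` applies the inverse curvature operator):

```python
import numpy as np

# Quadratic potential f(x) = 0.5 x^T A x - b^T x, so the curvature is constant
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)            # current position
gradient = A @ x - b       # gradient of f at x
curvature = A              # Hessian (local curvature)

# Newton direction: weight the gradient with the inverse curvature, flip the sign
descend_direction = -np.linalg.solve(curvature, gradient)

# In a quadratic potential a full step lands on the exact minimum, A x* = b,
# which is why the commit sets prefered_initial_step_size = 1.
x_new = x + descend_direction
```

This also shows why the deleted normalization comments were dead weight: the Newton direction carries its own proposed step length, so rescaling it to unit norm would discard exactly the information the relaxed Newton scheme wants.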