ift / NIFTy

Commit 3c5a2a01, authored 5 years ago by Philipp Arras

Rewrite NewtonCG

Parent: abe02b37
No related branches found. No related tags found.
Part of 2 merge requests: !349 (Fix mpi) and !333 (Operator spectra)
Pipeline #52374 passed 5 years ago (stages: build_docker, test, demo_runs)
Showing 1 changed file with 12 additions and 36 deletions.

nifty5/minimization/descent_minimizers.py (+12, −36)
--- a/nifty5/minimization/descent_minimizers.py
+++ b/nifty5/minimization/descent_minimizers.py
@@ -18,8 +18,11 @@
 import numpy as np
 
 from ..logger import logger
+from .conjugate_gradient import ConjugateGradient
+from .iteration_controllers import GradientNormController
 from .line_search import LineSearch
 from .minimizer import Minimizer
+from .quadratic_energy import QuadraticEnergy
 
 
 class DescentMinimizer(Minimizer):
@@ -154,49 +157,22 @@ class NewtonCG(DescentMinimizer):
     Algorithm derived from SciPy sources.
     """
 
-    def __init__(self, controller, napprox=0, line_searcher=None):
+    def __init__(self, controller, line_searcher=None):
         if line_searcher is None:
             line_searcher = LineSearch(preferred_initial_step_size=1.)
         super(NewtonCG, self).__init__(controller=controller,
                                        line_searcher=line_searcher)
-        self._napprox = int(napprox)
 
     def get_descent_direction(self, energy):
-        # if self._napprox > 1:
-        #     from ..probing import approximation2endo
-        #     sqdiag = approximation2endo(energy.metric, self._napprox).sqrt()
-        float64eps = np.finfo(np.float64).eps
-        r = energy.gradient
-        maggrad = abs(r).sum()
+        g = energy.gradient
+        maggrad = abs(g).sum()
         termcond = np.min([0.5, np.sqrt(maggrad)])*maggrad
-        pos = energy.position*0
-        d = r
-        previous_gamma = r.vdot(d)
-        ii = 0
-        while True:
-            if not ii % 10 and ii > 0:
-                print(ii)
-            if abs(r).sum() <= termcond:
-                return pos
-            q = energy.apply_metric(d)
-            curv = d.vdot(q)
-            if 0 <= curv <= 3*float64eps:
-                return pos
-            if curv < 0:
-                return pos if ii > 0 else previous_gamma/curv*r
-            ii += 1
-            alpha = previous_gamma/curv
-            pos = pos - alpha*d
-            r = r - alpha*q
-            s = r
-            gamma = r.vdot(s)
-            d = d*(gamma/previous_gamma) + r
-            previous_gamma = gamma
-        # curvature keeps increasing, bail out
-        raise ValueError("Warning: CG iterations didn't converge. "
-                         "The Hessian is not positive definite.")
+        ic = GradientNormController(tol_abs_gradnorm=termcond, p=1)
+        e = QuadraticEnergy(0*energy.position, energy.metric, g)
+        e, conv = ConjugateGradient(ic, nreset=np.inf)(e)
+        if conv == ic.ERROR:
+            raise RuntimeError
+        return -e.position
 
 
 class L_BFGS(DescentMinimizer):
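The rewrite replaces the hand-rolled conjugate-gradient loop above with NIFTy's own ConjugateGradient minimizer applied to a QuadraticEnergy built from the current metric and gradient; both the old and the new version compute an approximate Newton direction, roughly -metric^{-1}(gradient). As a point of reference, here is a self-contained NumPy sketch of that computation (illustrative only, not NIFTy code; the function name newton_cg_direction and the toy metric M are made up for this example), using the same residual tolerance termcond = min(0.5, sqrt(abs(g).sum()))*abs(g).sum() that appears unchanged in the diff:

# Illustrative sketch in plain NumPy (not NIFTy code): compute the Newton-CG
# descent direction, approximately -M^{-1} g, by running conjugate gradient
# on M x = g and stopping once the 1-norm of the residual falls below
# termcond = min(0.5, sqrt(abs(g).sum())) * abs(g).sum().
import numpy as np


def newton_cg_direction(apply_metric, g, maxiter=100):
    termcond = np.min([0.5, np.sqrt(np.abs(g).sum())])*np.abs(g).sum()
    x = np.zeros_like(g)
    r = g.copy()           # residual of M x = g at the start point x = 0
    d = r.copy()           # first search direction
    previous_gamma = r @ r
    for _ in range(maxiter):
        if np.abs(r).sum() <= termcond:
            break          # inexact solve is good enough for a Newton step
        q = apply_metric(d)
        alpha = previous_gamma/d.dot(q)
        x = x + alpha*d
        r = r - alpha*q
        gamma = r @ r
        d = d*(gamma/previous_gamma) + r
        previous_gamma = gamma
    return -x              # descent direction


# Toy symmetric positive definite "metric" and gradient (assumptions for the demo).
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
M = A @ A.T + 5*np.eye(5)
g = rng.normal(size=5)
d = newton_cg_direction(lambda v: M @ v, g)
# The returned direction satisfies the inexact-Newton residual criterion:
print(np.abs(M @ d + g).sum() <= np.min([0.5, np.sqrt(np.abs(g).sum())])*np.abs(g).sum())

Delegating this loop to ConjugateGradient is what lets the commit drop the explicit curvature bookkeeping (float64eps, curv, the while loop) from get_descent_direction.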
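A side note on the one line of the method body that survives unchanged: termcond = np.min([0.5, np.sqrt(maggrad)])*maggrad matches the forcing tolerance in SciPy's Newton-CG, from which the docstring says the algorithm is derived. The inner CG solve is therefore loose while the gradient is large and tightens superlinearly as the gradient shrinks, which keeps the early outer iterations cheap while making later Newton steps nearly exact. A quick arithmetic illustration (plain Python, not part of the commit):

# Illustrative only: how the CG tolerance from the diff scales with the
# gradient norm maggrad = abs(gradient).sum().
import numpy as np

for maggrad in (1e0, 1e-2, 1e-4, 1e-6):
    termcond = np.min([0.5, np.sqrt(maggrad)])*maggrad
    print(f"maggrad = {maggrad:.0e}  ->  CG tolerance = {termcond:.1e}")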