Commit 4ef02cc1 by Jakob Knollmueller

some changes and relaxed newton

parent 52192e16
@@ -24,7 +24,7 @@ from nifty import LineEnergy
 class LineSearch(Loggable, object):
-    """Class for finding a step size.
+    """Class for determining the optimal step size along some descent direction.
 
     Initialize the line search procedure which can be used by a specific line
     search method. Its finds the step size in a specific direction in the
...
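The commit's `LineSearchStrongWolfe` enforces the strong Wolfe conditions; as a rough illustration of what a line search does, here is a minimal Armijo backtracking sketch in plain numpy. The function names and signature are hypothetical and not part of NIFTY's API, and the curvature condition of the Wolfe test is deliberately omitted.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, direction,
                             alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step size until the sufficient-decrease (Armijo)
    condition holds. Illustrative sketch, not NIFTY's line search."""
    fx = f(x)
    # directional derivative; negative for a descent direction
    slope = np.dot(grad_f(x), direction)
    while f(x + alpha * direction) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# usage on a simple quadratic f(x) = x^T x
f = lambda x: np.dot(x, x)
grad_f = lambda x: 2 * x
x = np.array([3.0, -2.0])
step = backtracking_line_search(f, grad_f, x, -grad_f(x))
```

Starting from the full step `alpha=1.0`, the loop halves the step until the objective actually decreases enough along the search direction.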
@@ -26,18 +26,16 @@ from .line_searching import LineSearchStrongWolfe
 class QuasiNewtonMinimizer(Loggable, object):
-    """A Class used by other minimization methods to find local minimum.
+    """A class used by other minimization methods to find a local minimum.
 
-    Quasi-Newton methods are used to find local minima or maxima of a function
-    by approximating the Jacobian or Hessian matrix at every iteration. The
-    class performs general steps(gets the gradient, descend direction, step
-    size and checks the conergence) which can be used then by a specific
-    minimization method.
+    Descent minimization methods are used to find a local minimum of a scalar
+    function by following a descent direction. This class implements the
+    minimization procedure, the descent direction has to be implemented separately.
 
     Parameters
     ----------
     line_searcher : callable
-        Function which finds the step size into the descent direction. (default:
+        Function which finds the step size in descent direction. (default:
         LineSearchStrongWolfe())
     callback : function, *optional*
         Function f(energy, iteration_number) specified by the user to print
...
@@ -94,7 +92,7 @@ class QuasiNewtonMinimizer(Loggable, object):
         """Runs the minimization on the provided Energy class.
 
         Accepts the NIFTY Energy class which describes our system and it runs
-        the minimization to find the minimum/maximum of the system.
+        the minimization to find the minimum of the system.
 
         Parameters
         ----------
...
@@ -104,7 +102,7 @@ class QuasiNewtonMinimizer(Loggable, object):
         Returns
         -------
-        x : field
+        x : Field
             Latest `energy` of the minimization.
         convergence : integer
             Latest convergence level indicating whether the minimization
...
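The rewritten docstring describes the generic loop this class runs: compute the gradient, get a descent direction, pick a step size, update, and check convergence. A minimal sketch of that pattern, with hypothetical names and a crude shrinking step in place of a real line search (not the NIFTY API):

```python
import numpy as np

def descent_minimize(f, grad_f, get_direction, x0, tol=1e-8, max_iter=100):
    """Generic descent loop: direction, step size, update, convergence.
    The descent direction is pluggable, as in QuasiNewtonMinimizer."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:      # convergence check
            break
        d = get_direction(x, g)          # method-specific descent direction
        # crude shrinking step stands in for a proper line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) >= fx and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
    return x

# steepest descent on f(x) = (x - 1)^T (x - 1)
f = lambda x: np.sum((x - 1.0) ** 2)
grad_f = lambda x: 2 * (x - 1.0)
x_min = descent_minimize(f, grad_f, lambda x, g: -g, np.zeros(2))
```

Passing a different `get_direction` (e.g. one built from the curvature, as `RelaxedNewton` does below) changes the method while the loop stays the same.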
@@ -51,12 +51,26 @@ class RelaxedNewton(QuasiNewtonMinimizer):
         self.line_searcher.prefered_initial_step_size = 1.
 
     def _get_descend_direction(self, energy):
+        """ Calculates the descent direction according to a Newton scheme.
+
+        The descent direction is determined by weighting the gradient at the
+        current parameter position with the inverse local curvature, provided
+        by the Energy object.
+
+        Parameters
+        ----------
+        energy : Energy
+            The energy object providing implementations of the to be minimized
+            function, its gradient and curvature.
+
+        Returns
+        -------
+        out : Field
+            Returns the descent direction with proposed step length. In a
+            quadratic potential this corresponds to the optimal step.
+        """
         gradient = energy.gradient
         curvature = energy.curvature
         descend_direction = curvature.inverse_times(gradient)
         return descend_direction * -1
-        #norm = descend_direction.norm()
-        # if norm != 1:
-        #     return descend_direction / -norm
-        # else:
-        #     return descend_direction * -1
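The new docstring's claim that this step is optimal in a quadratic potential can be checked in a few lines. Here numpy arrays stand in for NIFTY `Field` objects and `np.linalg.solve` plays the role of `curvature.inverse_times`; everything else is an illustrative assumption:

```python
import numpy as np

def newton_descend_direction(gradient, curvature):
    """Mirror of _get_descend_direction: weight the gradient with the
    inverse local curvature (here a plain symmetric matrix)."""
    return -np.linalg.solve(curvature, gradient)

# quadratic potential f(x) = 1/2 x^T A x - b^T x, minimum at A^{-1} b
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)
grad = A @ x - b                     # gradient of f at x
step = newton_descend_direction(grad, A)
x_new = x + step                     # a single full Newton step
```

For a quadratic, one full step lands exactly on the minimum `A^{-1} b`, which is why the commit sets `prefered_initial_step_size = 1.` for the line search.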