Package gmisclib :: Module opt :: Class opt

Class opt


A class that implements an optimizer.

Instance Methods

__init__(self, p, T, fcn, args=None)
    Create an optimizer object.

set_nthreads(self, n)
    Set the number of simultaneous threads to be allowed.

z(self)
    What is the current residual?

sumsq(self)
    What is the current error?

print_ev_diag(self, misc={}, vectors=0)

Td(self)
    This allows the differentiation temperature to be set dynamically.

dE(self)

quick_lambda(self, a, b, startp, lamb, ntries=10)

search_with_update(self, lamb, newp)

calc_curv(self)

measure(self)
    If you decrease the number of active parameters, it might behoove you to set self.diff=None before you call this function, so that old derivatives get flushed.

step(self)
    A minimization step.

predicted_sumsq_delta(self, last_p)
    The energy change of a step, assuming you're at the minimum.

one_anneal(self, T)
    Take one simulated annealing step, as fast as possible.

terminate(self)
    Returns a number between 0 and 1 if the optimization seems finished.

covariance_under(self)
    This is an underestimate of the covariance.

covariance(self)
    Covariance estimate.

run(self, maxsteps=None, misc={}, Ta=None)
    This is the normal top-level routine.

print_errbar(self)

sim_anneal(self, nsteps, Ta=None, misc={})
    A top-level simulated annealing routine.

print_at(self, misc={})
    Prints a file to match the old C++ optimizer output.
Class Variables
  __doc__ = """A class that implements a optimiz...
  LOG_NAME = 'a.tmp'
Instance Variables
  T
simulated annealing temperature.
  active
vector of which parameters to optimize.
  constrain
a function that constrains the step.
  deriv_quantum
minimum step size used in differentiation.
  scale
estimate of the step size to use in differentiation.
  sem
a semaphore to control the number of threads in use (to change the number of threads, do self.sem = semclass(nnn)).
Method Details

__init__(self, p, T, fcn, args=None)
(Constructor)


Create an optimizer object. You can also set the following variables between calls to self.step(): scale, deriv_quantum, T, active, sem, constrain.

Parameters:
  • p - initial parameter vector.
  • T - temperature for simulated annealing and numeric differentiation.
  • fcn - the function whose result is to be minimized. It is called as fcn(p, args, diffparam, diffctr, processor), where:
      - p = the parameter vector,
      - args = the arbitrary reference passed to __init__,
      - diffparam = None (unless differentiating),
      - diffctr = None (unless differentiating),
      - processor = (proc_id, i_steps), where proc_id is in range(self.nthreads) and specifies which processor ought to be assigned to the task, and i_steps is the number of steps the optimization has gone through.
    The function is assumed to return either a numpy array or a mixed sequence of arrays and floats.
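For illustration, here is a minimal sketch of constructing an optimizer for a least-squares line fit. The residual function and data are hypothetical, and treating the returned vector as residuals whose squares are summed by sumsq() is an assumption based on this page:

    import numpy
    from gmisclib import opt

    def residuals(p, args, diffparam, diffctr, processor):
        # args carries the data; diffparam and diffctr are None
        # except while the optimizer is numerically differentiating.
        x, y = args
        a, b = p
        return y - (a * x + b)          # vector of residuals

    x = numpy.linspace(0.0, 1.0, 50)
    y = 3.0 * x + 1.0                    # noiseless test data
    o = opt.opt(numpy.array([0.0, 0.0]), 0.01, residuals, args=(x, y))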

set_nthreads(self, n)


Set the number of simultaneous threads to be allowed. Calculations of the residuals are farmed out to threads.
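For example (the thread count here is arbitrary), given an optimizer o as constructed above:

    o.set_nthreads(4)   # residual calculations are farmed out to 4 threads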

one_anneal(self, T)


Take one simulated annealing step, as fast as possible. We send all the processors off in random directions, and the first one to report an acceptable step wins. Returns 1 on success, 0 if no good step could be found in a reasonable number of tries. The temperature can also be a function of one argument (as an alternative to a float), in which case the temperature is calculated as T(self).
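As a sketch, a callable temperature that cools in proportion to the current error (the schedule itself is purely illustrative):

    def cooling_schedule(optimizer):
        # Evaluated as T(self) each time one_anneal() needs a temperature.
        return 0.01 * optimizer.sumsq()

    accepted = o.one_anneal(cooling_schedule)   # 1 on success, 0 on failure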

terminate(self)


Returns a number between 0 and 1 if the optimization seems finished, and -1 if it's clearly not done yet. This is designed to be called after step(), so that both self.p and self.last_p refer to the results of downhill steps.
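A sketch of the intended protocol, given an optimizer o; interpreting any non-negative return as convergence is an assumption based on the description above:

    while True:
        o.step()
        t = o.terminate()
        if t >= 0:
            break       # t in [0, 1]: the optimization looks finished
        # t == -1: clearly not done yet; keep stepping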

covariance_under(self)


This is an underestimate of the covariance. Adding in the self.lamb term means that it attempts to correct for the nonlinearity of the problem, at least along the direction of search.

run(self, maxsteps=None, misc={}, Ta=None)


This is the normal top-level routine. You should feel free to make your own, though.
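A typical call might look like this; that self.p holds the current parameter vector is inferred from the terminate() description above:

    o.run(maxsteps=1000)
    print("parameters:", o.p)
    print("error:", o.sumsq())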



Instance Variable Details

active

vector of which parameters to optimize. Only the active parameters change their value.

constrain

A function that constrains the step. You can use it to do a constrained fit by having it project the proposed step back into the legal volume. Called as self.constrain(old_prms, proposed_prms, self.args). It returns the constrained step, or None if there isn't one.
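As a sketch, a constrain hook that restricts the fit to nonnegative parameters (a hypothetical legal volume) by projecting the proposed step back:

    import numpy

    def nonneg_constrain(old_prms, proposed_prms, args):
        # Project the proposed parameters back into the legal region.
        # Return None instead if no legal step exists.
        return numpy.maximum(proposed_prms, 0.0)

    o.constrain = nonneg_constrain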