Classes
    onelog

Functions
    tuple(dict(str:onelog), dict(str:str))
    tuple(dict(str:onelog), dict(str:str))
Variables
    FILE_DROP_FAC = 0.2
    TRIGGER =
    __package__ =

Imports: math, numpy, die, fiatio, gpkmisc, mcmc_logger, LG, IC
Function Details

OBSOLETE
This selects which measurements will be used. It checks for convergence, then throws out optimizations that haven't converged.
This selects which measurements will be used. It checks for convergence, then gives you the best few results from each run.
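A minimal sketch of this kind of per-run selection, assuming the actual API is not shown here: the name `select_best`, the data layout (each run a list of `(log_probability, state)` pairs), and the trivial convergence test are all illustrative assumptions, not the module's real interface.

```python
# Hypothetical sketch: discard runs that cannot have converged, then keep
# only the best few results from each remaining run, ranked by log
# probability. All names and criteria here are assumptions for illustration.

def select_best(runs, n_keep=3):
    """runs: dict mapping run name -> list of (log_probability, state) pairs.
    Returns a dict with only non-empty runs, each trimmed to the n_keep
    highest-probability entries."""
    selected = {}
    for name, results in runs.items():
        if not results:  # stand-in convergence test: an empty run is dropped
            continue
        # Rank by log probability, highest first, and keep the best few.
        best = sorted(results, key=lambda r: r[0], reverse=True)[:n_keep]
        selected[name] = best
    return selected

runs = {
    "run1": [(-12.0, "a"), (-3.5, "b"), (-7.1, "c"), (-2.0, "d")],
    "run2": [(-4.2, "e")],
}
best = select_best(runs, n_keep=2)
```

Here `best["run1"]` keeps only the two entries with the highest log probability, while the single-entry `run2` passes through unchanged.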
This selects which measurements will be used.

Return a summary of the properties of the selected indexers.
Compares the ASCII form of keys, for sorting purposes. Makes a good attempt at ASCII ordering for strings and numeric ordering for numbers.

Compares the tuple form of keys, for sorting purposes. Makes a good attempt at ASCII ordering for strings and numeric ordering for numbers.
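One way to get this mixed ordering, sketched with modern Python's key-function style rather than the original comparator (the original code is not shown, so the tie-breaking rule of numbers before strings is an assumption):

```python
# Hypothetical sketch of mixed-type key ordering: numbers compare
# numerically, strings compare by their ASCII form, and all numbers sort
# before all strings so a mixed list is still totally ordered.

def sort_key(k):
    if isinstance(k, (int, float)):
        return (0, k, "")       # group 0: numbers, ordered numerically
    return (1, 0, str(k))       # group 1: strings, ordered as ASCII

keys = ["b", 10, "a", 2]
print(sorted(keys, key=sort_key))   # → [2, 10, 'a', 'b']
```

Returning a tuple whose first element is the type group avoids Python 3's refusal to compare `int` with `str` directly, which is what a `cmp`-style comparator from the Python 2 era would have handled inline.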
References are:
and that references
The other thing, Log(BayesWeightedBayes), is the average of P(D|params,M)*Prior(params) over the posterior distribution, P(params|D,M) ∝ P(D|params,M)*Prior(params). It has no real statistical backing, but it's a crude approximation to the Bayes evidence itself, the normalizing integral of P(D|params,M)*Prior(params).
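The quantity described above can be sketched as follows, assuming you already have log(P(D|params,M)*Prior(params)) evaluated at each posterior sample; the function name is an illustrative assumption, and the log-sum-exp trick is just the standard way to average exponentials stably:

```python
# Hypothetical sketch of Log(BayesWeightedBayes): average
# P(D|params,M)*Prior(params) over samples drawn from the posterior,
# then take the log. As the text says, this is only a crude stand-in
# for the true Bayes evidence.
import math

def log_bayes_weighted_bayes(log_weighted_samples):
    """log_weighted_samples: log(P(D|params,M)*Prior(params)) at each
    posterior sample. Returns the log of their arithmetic mean, computed
    stably via log-sum-exp."""
    n = len(log_weighted_samples)
    m = max(log_weighted_samples)
    s = sum(math.exp(v - m) for v in log_weighted_samples)
    return m + math.log(s / n)

samples = [-10.0, -10.0, -10.0]
print(log_bayes_weighted_bayes(samples))   # → -10.0 when all samples agree
```

Because the average is taken over posterior samples rather than over the prior, this estimator is biased toward high-likelihood regions, which is why the text calls it crude.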
Generated by Epydoc 3.0.1 on Thu Sep 22 04:25:02 2011 | http://epydoc.sourceforge.net |