This class adapts the unconstrained nonlinear minimization algorithms in
the "minimization" package to the task of estimating locally optimal
(minimum-cost) parameter sets. This allows us to use algorithms like BFGS
(FunctionMinimizerBFGS) to find locally optimal parameters of, for
example, a DifferentiableFeedforwardNeuralNetwork. Any first-order
FunctionMinimizer may be dropped into this class.
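The adapter idea might be sketched as follows. All names below are
illustrative, not this package's actual API, and a fixed-step gradient
descent stands in for a real BFGS implementation; the point is only that
any first-order minimizer fits behind the same interface.

```java
// Illustrative sketch: adapting an unconstrained first-order minimizer
// to parameter estimation by minimizing a differentiable cost function.
// All names are hypothetical; gradient descent stands in for BFGS.

interface DifferentiableCost {
    double value(double[] x);       // cost at parameter vector x
    double[] gradient(double[] x);  // first-order derivative at x
}

interface FunctionMinimizer {
    double[] minimize(DifferentiableCost f, double[] x0);
}

class GradientDescentMinimizer implements FunctionMinimizer {
    public double[] minimize(DifferentiableCost f, double[] x0) {
        double[] x = x0.clone();
        double step = 0.1;  // fixed step size, for illustration only
        for (int iter = 0; iter < 1000; iter++) {
            double[] g = f.gradient(x);
            for (int i = 0; i < x.length; i++) x[i] -= step * g[i];
        }
        return x;
    }
}

public class ParameterEstimator {
    static double[] estimate() {
        // Toy cost standing in for a network's training loss:
        // squared error to a fixed target, minimized at (2, -3).
        DifferentiableCost cost = new DifferentiableCost() {
            public double value(double[] x) {
                return Math.pow(x[0] - 2, 2) + Math.pow(x[1] + 3, 2);
            }
            public double[] gradient(double[] x) {
                return new double[] { 2 * (x[0] - 2), 2 * (x[1] + 3) };
            }
        };
        // Any FunctionMinimizer could be dropped in here instead.
        FunctionMinimizer minimizer = new GradientDescentMinimizer();
        return minimizer.minimize(cost, new double[] { 0, 0 });
    }

    public static void main(String[] args) {
        double[] best = estimate();
        System.out.printf("%.4f %.4f%n", best[0], best[1]);
    }
}
```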
My current preference is BFGS (FunctionMinimizerBFGS), which can
solve virtually all problems. However, when there are too many parameters
to store BFGS's approximate inverse Hessian, Liu-Storey conjugate
gradient is another good choice.
When first-order derivative information is not available, you may use
either automatic differentiation or one of the derivative-free
minimization routines.
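A third fallback, when neither analytic gradients nor automatic
differentiation is at hand, is to approximate the gradient numerically
and feed that to a first-order minimizer. This is a stand-in technique,
not part of this package; the names below are illustrative.

```java
// Illustrative: central finite-difference gradient approximation for
// use when analytic first-order derivatives are unavailable.
import java.util.function.ToDoubleFunction;

public class FiniteDifference {
    static double[] gradient(ToDoubleFunction<double[]> f, double[] x, double h) {
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double xi = x[i];
            x[i] = xi + h; double plus  = f.applyAsDouble(x);
            x[i] = xi - h; double minus = f.applyAsDouble(x);
            x[i] = xi;  // restore the coordinate before moving on
            g[i] = (plus - minus) / (2 * h);
        }
        return g;
    }

    public static void main(String[] args) {
        // f(x, y) = x^2 + 3y; the exact gradient at (1, 5) is (2, 3).
        double[] g = gradient(p -> p[0] * p[0] + 3 * p[1],
                              new double[] { 1, 5 }, 1e-5);
        System.out.printf("%.3f %.3f%n", g[0], g[1]);
    }
}
```

Each coordinate costs two extra function evaluations, so this scales
poorly with the number of parameters, but it requires nothing beyond the
cost function itself.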