compass_search
compass_search, a C++ code which seeks the minimizer of a scalar function of several variables using compass search, a direct search algorithm that does not use derivatives.
The algorithm, which goes back to Fermi and Metropolis, is easy to describe.
It begins with a starting point X and a step size DELTA.
For each dimension I, the algorithm considers perturbing X(I) by adding
or subtracting DELTA.
If a perturbation is found that decreases the function value, the perturbed
point becomes the new X; otherwise, DELTA is halved.
The iteration halts when DELTA falls below a minimal value.
The algorithm is not guaranteed to find a global minimum. It can, for
instance, easily be attracted to a local minimum. Moreover, the algorithm
can diverge if, for instance, the function decreases as the argument goes
to infinity.
Licensing:
The computer code and data files described and made available on this web page
are distributed under
the MIT license
Languages:
compass_search is available in
a C version and
a C++ version and
a FORTRAN90 version and
a MATLAB version and
a Python version.
Related Data and Programs:
ASA047,
a C++ code which
minimizes a scalar function of several variables using the Nelder-Mead algorithm.
compass_search_test
TEST_OPT,
a C++ code which
defines test problems
requiring the minimization of a scalar function of several variables.
TOMS178,
a C++ code which
minimizes a scalar function of several variables using the
Hooke-Jeeves method.
Author:
John Burkardt
Reference:

Evelyn Beale,
On an Iterative Method for Finding a Local Minimum of a Function
of More than One Variable,
Technical Report 25,
Statistical Techniques Research Group,
Princeton University, 1958.

Richard Brent,
Algorithms for Minimization without Derivatives,
Dover, 2002,
ISBN: 0486419983,
LC: QA402.5.B74.

Charles Broyden,
A class of methods for solving nonlinear simultaneous equations,
Mathematics of Computation,
Volume 19, 1965, pages 577-593.

David Himmelblau,
Applied Nonlinear Programming,
McGraw Hill, 1972,
ISBN13: 9780070289215,
LC: T57.8.H55.

Tamara Kolda, Robert Michael Lewis, Virginia Torczon,
Optimization by Direct Search: New Perspectives on Some Classical
and Modern Methods,
SIAM Review,
Volume 45, Number 3, 2003, pages 385-482.

Ken McKinnon,
Convergence of the Nelder-Mead simplex method to a nonstationary point,
SIAM Journal on Optimization,
Volume 9, Number 1, 1998, pages 148-158.

Zbigniew Michalewicz,
Genetic Algorithms + Data Structures = Evolution Programs,
Third Edition,
Springer, 1996,
ISBN: 3540606769,
LC: QA76.618.M53.

Jorge More, Burton Garbow, Kenneth Hillstrom,
Testing unconstrained optimization software,
ACM Transactions on Mathematical Software,
Volume 7, Number 1, March 1981, pages 17-41.

Michael Powell,
An Iterative Method for Finding Stationary Values of a Function
of Several Variables,
Computer Journal,
Volume 5, 1962, pages 147-151.

Howard Rosenbrock,
An Automatic Method for Finding the Greatest or Least Value of a Function,
Computer Journal,
Volume 3, 1960, pages 175-184.
Source Code:
Last revised on 21 February 2020.