# COORDINATE_SEARCH: The Coordinate Search Optimization Algorithm

COORDINATE_SEARCH is a MATLAB program, by Jeff Borggaard, which seeks the minimizer of a scalar function of several variables.

The algorithm, which goes back to Fermi and Metropolis, is easy to describe. As originally designed, the algorithm begins with a central point, as well as 2*M "test" points, where M is the number of spatial dimensions. The I-th pair of test points differs from the central point only in the I-th coordinate. One test point increases, and one decreases, this coordinate by a fixed amount H.

The function value of all 2*M+1 points is computed; if the lowest value occurs at the center point, then H is decreased. If the lowest value occurs at a test point, then that test point becomes the new center point.

On the next step, using the current center point and size H, a new set of 2*M test points is computed, and the process is repeated.

Under certain simple conditions, the process will converge to a local minimizer of the function.
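A minimal MATLAB sketch of this iteration is shown below. It is an illustration under stated assumptions, not Borggaard's implementation; the name coordinate_search_sketch and the stopping tolerance tol are invented for the example.

``````
function x = coordinate_search_sketch ( f, x, h, tol )
%COORDINATE_SEARCH_SKETCH illustrates one form of coordinate search.
%  f is a function handle, x the starting center point, h the initial step,
%  and tol the smallest step size at which the iteration stops.
  m = length ( x );
  while ( h > tol )
    fx = f ( x );
    best_f = fx;
    best_x = x;
    for i = 1 : m                    % the I-th pair of test points ...
      for s = [ -h, +h ]             % ... decreases and increases coordinate I by H
        xt = x;
        xt(i) = xt(i) + s;
        ft = f ( xt );
        if ( ft < best_f )
          best_f = ft;
          best_x = xt;
        end
      end
    end
    if ( best_f < fx )
      x = best_x;                    % lowest value at a test point: move the center
    else
      h = h / 2.0;                   % lowest value at the center: decrease H
    end
  end
end
``````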

Note that fminsearch is a built-in MATLAB command which minimizes a scalar function of several variables using the Nelder-Mead algorithm.
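For comparison, a minimal fminsearch call on the example function defined below would look something like the following; the starting point x_init shown here is an arbitrary assumption.

``````
x_init = [ 0.0; 0.0 ];                   % arbitrary starting point (assumption)
x_opt = fminsearch ( @example, x_init )  % Nelder-Mead minimization of example.m
``````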

### Usage:

``````
x_opt = coordinate_search ( x_center, f, flag )
``````

where

• x_center is a vector containing a starting point for the iteration;
• f is the "function handle"; that is, either a quoted expression for the function, or the name of an M-file that defines the function, preceded by an "@" sign;
• flag: if flag is nonzero, the program assumes a 2D problem is being solved and displays a plot of the iterative procedure; if flag is 3, a special variant of the algorithm is used in which 3 test points are arranged in a triangular pattern around the center point;
• x_opt is the program's estimate for the minimizer of the function.

Very simple functions can be input as a quoted string. Thus, one could specify the f argument as '(x(1)-2*x(2)+7)^2'. However, for more complicated functions it makes sense to prepare an M-file that defines the function. For this same example, a suitable M-file would be:

``````
function f = example ( x )
%EXAMPLE evaluates the sample objective f(x) = ( x(1) - 2*x(2) + 7 )^2.
  f = ( x(1) - 2 * x(2) + 7 )^2;
``````

If this function were stored in an M-file called example.m, one might invoke the optimization program with a command like:

``````
x_opt = coordinate_search ( x_init, @example, 0 )
``````
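Since the program accepts function handles, an anonymous function should also work in place of an M-file, although only quoted strings and M-file handles are mentioned above; the starting point below is an arbitrary assumption.

``````
x_init = [ 0.0; 0.0 ];                                            % arbitrary starting point
x_opt = coordinate_search ( x_init, '(x(1)-2*x(2)+7)^2', 0 );     % quoted-string form
x_opt = coordinate_search ( x_init, @(x) (x(1)-2*x(2)+7)^2, 0 );  % anonymous-handle form (assumed)
``````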

### Languages:

COORDINATE_SEARCH is available in a MATLAB version.

### Related Data and Programs:

ASA047, a MATLAB library which minimizes a scalar function of several variables using the Nelder-Mead algorithm.

COMPASS_SEARCH, a MATLAB library which seeks the minimizer of a scalar function of several variables using compass search, a direct search algorithm that does not use derivatives.

ENTRUST, a MATLAB program which minimizes a scalar function of several variables using trust-region methods.

NELDER_MEAD, a MATLAB program which minimizes a scalar function of several variables using the Nelder-Mead algorithm.

PRAXIS, a FORTRAN90 library which implements the principal axis method of Richard Brent for minimization of a function without the use of derivatives.

TEST_OPT, a MATLAB library which defines test problems requiring the minimization of a scalar function of several variables.

TOMS178, a MATLAB library which minimizes a scalar function of several variables using the Hooke-Jeeves method.

### Author:

Jeff Borggaard,
Mathematics Department,
Virginia Tech.

### Reference:

1. Evelyn Beale,
On an Iterative Method for Finding a Local Minimum of a Function of More than One Variable,
Technical Report 25,
Statistical Techniques Research Group,
Princeton University, 1958.
2. Richard Brent,
Algorithms for Minimization without Derivatives,
Dover, 2002,
ISBN: 0-486-41998-3,
LC: QA402.5.B74.
3. David Himmelblau,
Applied Nonlinear Programming,
McGraw Hill, 1972,
ISBN13: 978-0070289215,
LC: T57.8.H55.
4. Ken McKinnon,
Convergence of the Nelder-Mead simplex method to a nonstationary point,
SIAM Journal on Optimization,
Volume 9, Number 1, 1998, pages 148-158.
5. Zbigniew Michalewicz,
Genetic Algorithms + Data Structures = Evolution Programs,
Third Edition,
Springer, 1996,
ISBN: 3-540-60676-9,
LC: QA76.618.M53.
6. Michael Powell,
An Iterative Method for Finding Stationary Values of a Function of Several Variables,
Computer Journal,
Volume 5, 1962, pages 147-151.
7. Howard Rosenbrock,
An Automatic Method for Finding the Greatest or Least Value of a Function,
Computer Journal,
Volume 3, 1960, pages 175-184.

### Examples and Tests:

COORDINATE_SEARCH_TEST calls all the tests.

BEALE is the Beale function, for which N=2.

BOHACH1 is the Bohachevsky function #1, for which N=2.

BOHACH2 is the Bohachevsky function #2, for which N=2.

EXTENDED_ROSENBROCK is the "extended" Rosenbrock function. This version of the Rosenbrock function allows the spatial dimension N to be arbitrary, except that it must be even.

GOLDSTEIN_PRICE is the Goldstein-Price polynomial, for which N=2.

HIMMELBLAU is the Himmelblau function, for which N = 2, and which has four global minima.

LOCAL is a badly scaled function with a local minimum, for which N=2.

• local.m, defines the "local" function.
• local_test.m, tests COORDINATE_SEARCH with the function.

MCKINNON is the McKinnon function, for which N=2. This function can cause problems for the Nelder-Mead optimization algorithm.

POWELL is the Powell singular quartic function, for which N = 4.

ROSENBROCK is the Rosenbrock "banana" function, for which N=2.
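As a concrete illustration, an M-file for the 2D Rosenbrock function might look like the following sketch; the actual file in the test suite may differ in name and details.

``````
function f = rosenbrock ( x )
%ROSENBROCK evaluates the 2D Rosenbrock "banana" function.
%  The global minimum f = 0 occurs at x = [ 1; 1 ].
  f = 100.0 * ( x(2) - x(1)^2 )^2 + ( 1.0 - x(1) )^2;
``````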


Last revised on 01 January 2012.