Gradient descent [in constraint satisfaction]
A standard method for finding a minimum of a smooth function f[x] is to use
FixedPoint[# - a f'[#] &, x0]
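As a minimal sketch, with the function, step size, and starting point below assumed purely for illustration (none of them are from the text):

f[x_] := x^4 - 3 x^2 + x    (* assumed example function with two local minima *)
a = 0.01;                   (* assumed step size *)
FixedPoint[# - a f'[#] &, 2.0, 10000,
 SameTest -> (Abs[#1 - #2] < 10^-8 &)]  (* tolerance and iteration cap guard against non-termination at machine precision *)

Replacing FixedPoint by FixedPointList shows the whole sequence of steps, and starting from a different x0, say -2.0, can end up at a different local minimum.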
If there are several local minima, which one is reached will depend on the starting point x0. It will not necessarily be the one closest to x0, because of potentially complicated overshooting effects associated with the step size a. Newton's method for finding zeros of f[x] is related, and is given by
FixedPoint[# - f[#]/f'[#] &, x0]
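Again as an illustrative sketch, with an assumed example function (not from the text) whose zero is Sqrt[2]:

g[x_] := x^2 - 2    (* assumed example function with a zero at Sqrt[2] *)
FixedPoint[# - g[#]/g'[#] &, 1.0, 100,
 SameTest -> (Abs[#1 - #2] < 10^-12 &)]  (* converges rapidly toward Sqrt[2]; the tolerance avoids a possible last-bit oscillation at machine precision *)

The connection to minimization is that applying the same iteration to f' instead of f, as in FixedPoint[# - f'[#]/f''[#] &, x0], searches for extrema of f rather than for its zeros.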