Imagine how a swarm of bees finds food sources. As bees visit different areas, they find plants of varying quality and quantity. Some sources are better than others, and the swarm gradually gravitates towards the best ones it has found. Optimisation algorithms in AI work this way too.
Optimisation algorithms are used to search massive solution spaces for good solutions. Many of these algorithms are not guaranteed to find the absolute best answer; instead, they attempt to converge on the global best while avoiding getting trapped in local best solutions.
A global best is the best solution in the entire search space; a local best is a solution that is better than its neighbouring solutions but not the best overall. If the goal is to minimise, smaller values are better; if the goal is to maximise, larger values are better. The aim is to incrementally improve on local best solutions and converge on the global best one.
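A tiny example makes the distinction concrete. The sketch below uses a hypothetical test function with two minima: a greedy hill climber started on the wrong slope settles in the local best, while restarting it from many random points lets it reach the global best. The function, step size, and restart count are all illustrative choices, not prescriptions.

```python
import random

def f(x):
    # A test function with two minima: a local minimum near x ≈ 1.13
    # and the global minimum near x ≈ -1.30.
    return x**4 - 3*x**2 + x

def hill_climb(x, step=0.01, iters=1000):
    # Greedy local search: move to a neighbour only if it is better.
    for _ in range(iters):
        for candidate in (x - step, x + step):
            if f(candidate) < f(x):
                x = candidate
    return x

random.seed(0)

# Started at x = 2.0, greedy descent gets stuck in the local best.
local = hill_climb(2.0)

# Restarting from 20 random points finds the global best.
best = min((hill_climb(random.uniform(-3, 3)) for _ in range(20)), key=f)

print(round(local, 2), round(f(local), 2))  # the local best
print(round(best, 2), round(f(best), 2))    # the global best
```

The restarts are the crudest possible diversity mechanism; genetic algorithms and swarm methods achieve the same effect far more efficiently by maintaining a whole population of candidate solutions at once.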
The algorithm's parameters need careful attention. The algorithm should strive for diversity among solutions at the start and gradually converge on better, more homogeneous solutions with each generation. Without diversity among solutions, the risk of getting stuck in a local best increases.
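This explore-then-converge schedule can be sketched with a minimal particle swarm, staying with the bee analogy. Assume the same hypothetical two-minima test function as a stand-in for a real problem; the inertia weight `w` starts high (diverse, exploratory moves) and decays each generation (homogeneous, exploitative moves). The coefficients and swarm size are illustrative defaults, not tuned values.

```python
import random

def f(x):
    # Hypothetical test function: local minimum near x ≈ 1.13,
    # global minimum near x ≈ -1.30.
    return x**4 - 3*x**2 + x

random.seed(1)
n, generations = 10, 200
pos = [random.uniform(-3, 3) for _ in range(n)]   # diverse initial swarm
vel = [0.0] * n
pbest = pos[:]                                    # each particle's own best
gbest = min(pos, key=f)                           # swarm-wide best so far

for g in range(generations):
    # Inertia decays 0.9 → 0.2: exploration early, convergence late.
    w = 0.9 - 0.7 * g / generations
    for i in range(n):
        r1, r2 = random.random(), random.random()
        vel[i] = (w * vel[i]
                  + 1.5 * r1 * (pbest[i] - pos[i])  # pull to own best
                  + 1.5 * r2 * (gbest - pos[i]))    # pull to swarm best
        pos[i] += vel[i]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i]
        if f(pos[i]) < f(gbest):
            gbest = pos[i]

print(round(gbest, 2), round(f(gbest), 2))
```

Shrinking `w` too quickly is exactly the failure mode described above: the swarm homogenises around whichever basin it saw first and gets stuck in a local best.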
Genetic algorithms and swarm optimisation are fascinating and powerful. To learn more, see Grokking AI Algorithms with Manning Publications: http://bit.ly/gaia-book. Consider following me — @RishalHurbans — or join my mailing list for infrequent knowledge drops: https://rhurbans.com/subscribe.