
Released

Journal Article

Information-Geometric Optimization with Natural Selection

MPS-Authors

Otwinowski,  Jakub
Max Planck Research Group Statistical physics of evolving systems, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;


LaMont,  Colin H.
Max Planck Research Group Statistical physics of evolving systems, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;


Nourmohammad,  Armita
Max Planck Research Group Statistical physics of evolving systems, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

Citation

Otwinowski, J., LaMont, C. H., & Nourmohammad, A. (2020). Information-Geometric Optimization with Natural Selection. Entropy, 22: 967. doi:10.3390/e22090967.


Cite as: http://hdl.handle.net/21.11116/0000-0007-5636-C
Abstract
Evolutionary algorithms, inspired by natural evolution, aim to optimize difficult objective functions without computing derivatives. Here we detail the relationship between classical population genetics of quantitative traits and evolutionary optimization, and formulate a new evolutionary algorithm. Optimization of a continuous objective function is analogous to searching for high fitness phenotypes on a fitness landscape. We describe how natural selection moves a population along the non-Euclidean gradient that is induced by the population on the fitness landscape (the natural gradient). We show how selection is related to Newton’s method in optimization under quadratic fitness landscapes, and how selection increases fitness at the cost of reducing diversity. We describe the generation of new phenotypes and introduce an operator that recombines the whole population to generate variants. Finally, we introduce a proof-of-principle algorithm that combines natural selection, our recombination operator, and an adaptive method to increase selection and find the optimum. The algorithm is extremely simple in implementation; it has no matrix inversion or factorization, does not require storing a covariance matrix, and may form the basis of more general model-based optimization algorithms with natural gradient updates.
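The select-and-recombine loop described in the abstract can be sketched in a few lines. The following is a minimal illustrative implementation, not the authors' exact algorithm: it assumes exponential (Boltzmann) fitness weighting for selection, with a hypothetical parameter `beta` standing in for selection strength, and a recombination step that draws each coordinate independently from the selected parents, so the whole population mixes without storing or inverting a covariance matrix.

```python
import numpy as np

def evolutionary_step(population, fitness_fn, beta=1.0, rng=None):
    """One illustrative select-and-recombine step (a sketch, not the
    paper's exact operator).

    Selection: reweight individuals by exponential fitness, which shifts
    the population distribution toward higher fitness.
    Recombination: each offspring draws every coordinate independently
    from the selected parents, recombining the whole population.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = population.shape
    f = np.array([fitness_fn(x) for x in population])
    # Exponential selection weights; subtract the max for numerical stability.
    w = np.exp(beta * (f - f.max()))
    w /= w.sum()
    parents = population[rng.choice(n, size=n, p=w)]
    # Coordinate-wise recombination across the selected parents:
    # no matrix inversion, factorization, or covariance storage.
    offspring = np.empty_like(parents)
    for j in range(d):
        offspring[:, j] = parents[rng.choice(n, size=n), j]
    return offspring

# Toy usage (assumed objective): maximize -||x||^2, optimum at the origin.
rng = np.random.default_rng(0)
pop = rng.normal(size=(200, 5)) * 3.0
for _ in range(50):
    pop = evolutionary_step(pop, lambda x: -np.sum(x**2), beta=0.5, rng=rng)
```

Under repeated application the population contracts toward high-fitness regions, illustrating the abstract's point that selection increases fitness at the cost of diversity; an adaptive schedule for the selection strength (here fixed at `beta=0.5`) is what the paper's full algorithm adds on top.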