
Evolution Transformer: In-Context Evolutionary Optimization

Authors:
Robert Tjarko Lange, Yingtao Tian, Yujin Tang
Keywords:
Computer Science, Artificial Intelligence (cs.AI), Neural and Evolutionary Computing (cs.NE)
Journal:
--
Date:
2024-03-05
Abstract
Evolutionary optimization algorithms are often derived from loose biological analogies and struggle to leverage information obtained during the sequential course of optimization. A promising alternative is to leverage data and directly discover powerful optimization principles via meta-optimization. In this work, we follow such a paradigm and introduce Evolution Transformer, a causal Transformer architecture that can flexibly characterize a family of Evolution Strategies. Given a trajectory of evaluations and search distribution statistics, Evolution Transformer outputs a performance-improving update to the search distribution. The architecture imposes a set of suitable inductive biases: invariance of the distribution update to the order of population members within a generation and equivariance to the order of the search dimensions. We train the model weights using Evolutionary Algorithm Distillation, a technique for the supervised optimization of sequence models using teacher algorithm trajectories. The resulting model exhibits strong in-context optimization performance and generalizes well to otherwise challenging neuroevolution tasks. We analyze the properties of the Evolution Transformer and propose a technique to train it fully self-referentially, starting from a random initialization and bootstrapping its own learning progress. We provide an open-source implementation at https://github.com/RobertTLange/evosax.
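
The abstract describes an in-context optimization loop: at each generation the model receives the trajectory of sampled solutions, their fitness evaluations, and the current search distribution statistics, and returns an updated mean and scale for the Gaussian search distribution. The Python sketch below illustrates only that loop structure under stated assumptions; transformer_update is a hypothetical stand-in (not the paper's model and not the evosax API), implemented here as a simple rank-weighted recombination so the example runs end to end.

import numpy as np

def transformer_update(history, mean, std):
    # Hypothetical stand-in for the Evolution Transformer. The real model is a
    # causal Transformer that attends over the whole trajectory of
    # (solutions, fitnesses, distribution statistics) and emits an update to
    # the search distribution; here a rank-based recombination plays that role.
    solutions, fitnesses = history[-1]               # latest generation
    order = np.argsort(fitnesses)                    # minimization: best first
    elite = solutions[order[: len(solutions) // 2]]  # top half of the population
    new_mean = elite.mean(axis=0)                    # invariant to member order
    new_std = 0.9 * std + 0.1 * elite.std(axis=0)    # smoothed per-dimension scale
    return new_mean, new_std

def sphere(x):
    # Simple test objective: squared Euclidean norm, minimized at the origin.
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(0)
dim, popsize = 4, 16
mean, std = np.ones(dim), np.ones(dim)
history = []

for gen in range(50):
    population = mean + std * rng.standard_normal((popsize, dim))
    fitnesses = sphere(population)
    history.append((population, fitnesses))          # growing in-context trajectory
    mean, std = transformer_update(history, mean, std)

print("best fitness:", min(f.min() for _, f in history))

In the paper, the update is instead produced by the Transformer's forward pass, and its weights are obtained via Evolutionary Algorithm Distillation, i.e. supervised training of the sequence model on trajectories generated by teacher Evolution Strategies, as stated in the abstract.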
PDF: Evolution Transformer: In-Context Evolutionary Optimization.pdf