On the performance of indirect encoding across the continuum of regularity

Author(s): 
Clune J
Stanley KO
Pennock RT
Ofria C
Year: 
2011
Abstract: 

This paper investigates how an evolutionary algorithm with an indirect encoding exploits the property of phenotypic regularity, an important design principle found in natural organisms and engineered designs. We present the first comprehensive study showing that such phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases. Such an ability to produce regular solutions that can exploit the regularity of problems is an important prerequisite if evolutionary algorithms are to scale to high-dimensional real-world problems, which typically contain many regularities, both known and unrecognized. The indirect encoding in this case study is HyperNEAT, which evolves artificial neural networks (ANNs) in a manner inspired by concepts from biological development. We demonstrate that, in contrast to two direct encoding controls, HyperNEAT produces both regular behaviors and regular ANNs, which enables HyperNEAT to significantly outperform the direct encodings as regularity increases in three problem domains. We also show that the types of regularities HyperNEAT produces can be biased, allowing domain knowledge and preferences to be injected into the search. Finally, we examine the downside of a bias toward regularity. Even when a solution is mainly regular, some irregularity may be needed to perfect its functionality. This insight is illustrated by a new algorithm called HybrID that hybridizes indirect and direct encodings, which matched HyperNEAT's performance on regular problems yet outperformed it on problems with some irregularity. HybrID's ability to improve upon the performance of HyperNEAT raises the question of whether indirect encodings may ultimately excel not as stand-alone algorithms, but by being hybridized with a further process of refinement, wherein the indirect encoding produces patterns that exploit problem regularity and the refining process modifies that pattern to capture irregularities. 
This paper thus paints a more complete picture of indirect encodings than prior studies because it analyzes the impact of the continuum between irregularity and regularity on the performance of such encodings, and ultimately suggests a path forward that combines indirect encodings with a separate process of refinement.


Evolving Gaits for Legged Robots: Neural Networks with Geometric Patterns Perform Better

Neural networks evolved to produce gaits for legged robots. The use of the HyperNEAT generative encoding produces geometric patterns (regularities) in the neural wiring of the evolved brains, which improves performance by producing coordinated, regular leg movements.

Evolving artificial neural networks (ANNs) and gaits for robots are difficult, time-consuming tasks for engineers, making them suitable for evolutionary algorithms (aka genetic algorithms). Generative encodings (aka indirect and developmental encodings) perform better than direct encodings by producing neural regularities that result in behavioral regularities.

Evolving Regular, Modular Neural Networks

I (Jeff Clune) summarize my research into evolving modular, regular neural networks, which are digital models of brains. The property of regularity is produced by using HyperNEAT, a generative encoding based on concepts from developmental biology. The property of modularity arises because we add a cost for connections between neurons in the network. Evolving structurally organized neural networks, including those that are regular and modular, is a necessary step in our long-term quest of evolving computational intelligence that rivals or surpasses human intelligence.


Pub. Info: 
IEEE Transactions on Evolutionary Computation. 15(3): 346-367
BibTeX: 

@ARTICLE{5910671,
author={Clune, J. and Stanley, K.O. and Pennock, R.T. and Ofria, C.},
journal={IEEE Transactions on Evolutionary Computation},
title={On the Performance of Indirect Encoding Across the Continuum of Regularity},
year={2011},
month={June},
volume={15},
number={3},
pages={346-367},
keywords={evolutionary computation;neural nets;HybrID algorithm;HyperNEAT;artificial neural networks;evolutionary algorithm;indirect encoding;phenotypic regularity property;regularity continuum;bioinformatics;encoding;evolution (biology);genomics;organisms;topology;developmental encodings;generative encodings;indirect encodings;regularity},
doi={10.1109/TEVC.2010.2104157},
ISSN={1089-778X}
}