Evolving Regular Neural Networks

Combining evolution with concepts from developmental biology produces repeated patterns in neural networks that improve their performance. Such repeated patterns are evidence of a property called "regularity", which typically involves symmetries and the repetition of design themes, with and without variation. Our research has shown how to evolve regular phenotypes, whether they are neural networks, robot morphologies, or 3-D printable shapes, and that evolving such regularity improves performance and evolvability. We evolve this regularity with a generative encoding based on developmental biology called a compositional pattern-producing network (CPPN), which is also used to evolve neural networks in the HyperNEAT algorithm.
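To make the CPPN idea concrete, here is a minimal, hand-written sketch (not an evolved network): a CPPN composes simple functions of the coordinates, and the choice of functions determines the regularities. The particular node functions and constants below are illustrative assumptions, not taken from any actual evolved CPPN.

```python
import math

def gaussian(z):
    return math.exp(-z * z)

def cppn(x, y):
    """Toy CPPN: hidden nodes compose sine and Gaussian over coordinates."""
    h1 = math.sin(4.0 * x)      # periodic node -> repeated stripes
    h2 = gaussian(3.0 * y)      # symmetric node -> mirror symmetry in y
    return math.tanh(h1 * h2)   # output node squashes the combination

# Sample the pattern on a grid; the result is regular by construction:
# periodic in x (repetition) and symmetric about y = 0 (symmetry).
pattern = [[cppn(x / 10.0, y / 10.0) for x in range(-10, 11)]
           for y in range(-10, 11)]
```

Because the Gaussian node is an even function, the sampled pattern is guaranteed to be mirror-symmetric in y, no matter what the other nodes compute.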

Image: Evolved neural network robot controllers that have different regular patterns of neural connectivity.


Videos

Evolving Gaits for Legged Robots: Neural Networks with Geometric Patterns Perform Better

Neural networks evolved to produce gaits for legged robots. The use of the HyperNEAT generative encoding produces geometric patterns (regularities) in the neural wiring of the evolved brains, which improves performance by producing coordinated, regular leg movements.

Designing artificial neural networks (ANNs) and gaits for robots by hand is a difficult, time-consuming task for engineers, which makes these problems well suited to evolutionary algorithms (also known as genetic algorithms). Generative encodings (also called indirect or developmental encodings) perform better than direct encodings by producing neural regularities that result in behavioral regularities.
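A rough sketch of how HyperNEAT's generative encoding yields such geometric wiring patterns: the CPPN is queried once per pair of substrate neurons, with the neurons' coordinates as inputs, and its output becomes the connection weight. The `weight_cppn` function here is a hypothetical stand-in for an evolved CPPN, chosen so that weights depend only on relative geometry.

```python
import math

def weight_cppn(x1, y1, x2, y2):
    dx, dy = x2 - x1, y2 - y1
    # Depends only on relative geometry, so neurons in the same position
    # on different legs receive identical wiring -- the kind of
    # regularity that produces coordinated leg movements.
    return math.exp(-4.0 * dx * dx) * math.cos(2.0 * math.pi * dx) * (2.0 - dy)

def build_weights(inputs, outputs, threshold=0.2):
    """Connect two 2-D neuron layers, pruning weak connections."""
    weights = {}
    for i, (x1, y1) in enumerate(inputs):
        for j, (x2, y2) in enumerate(outputs):
            w = weight_cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[(i, j)] = w
    return weights

# One sensor and one motor neuron per leg, laid out on a line:
sensors = [(x / 3.0, 0.0) for x in range(4)]
motors = [(x / 3.0, 1.0) for x in range(4)]
net = build_weights(sensors, motors)
```

Every sensor-to-motor connection with the same geometric offset gets the same weight, so all four legs are wired identically without that constraint ever being stated explicitly.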


Evolving Artificial Neural Networks That Are Both Modular and Regular


Evolving Soft Robots with Multiple Materials (muscle, bone, etc.)

Here we evolve the bodies of soft robots made of multiple materials (muscle, bone, & support tissue) to move quickly. Evolution produces a diverse array of fun, wacky, interesting, but ultimately functional soft robots. Enjoy!
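The multi-material encoding can be sketched as follows: a CPPN-like function is queried at every voxel coordinate, and its outputs decide whether a voxel exists there and, if so, which material fills it. The function, material names, and thresholds below are illustrative assumptions, not the paper's exact encoding.

```python
import math

MATERIALS = ("empty", "muscle", "bone", "support")

def voxel_cppn(x, y, z):
    """Toy stand-in for an evolved CPPN with one output per decision."""
    presence = math.exp(-(x * x + y * y))       # a blob-shaped body
    material_signal = math.sin(3.0 * x) + 0.5 * z
    return presence, material_signal

def develop(size=4):
    """Grow a voxel body by querying the CPPN at each coordinate."""
    body = {}
    for xi in range(size):
        for yi in range(size):
            for zi in range(size):
                x, y, z = xi / size, yi / size, zi / size
                presence, signal = voxel_cppn(x, y, z)
                if presence < 0.5:
                    continue                    # no voxel here
                if signal > 0.6:
                    body[(xi, yi, zi)] = "muscle"
                elif signal > 0.0:
                    body[(xi, yi, zi)] = "support"
                else:
                    body[(xi, yi, zi)] = "bone"
    return body

robot = develop()
```

Because one smooth function paints the whole body, neighboring voxels tend to share a material, producing the contiguous muscle and bone regions visible in the evolved robots.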


EndlessForms.com - Design objects with evolution and 3D print them!

On http://EndlessForms.com objects are evolved in the same way that plants and animals are bred. You pick the ones you like and they become the parents of the next generation of objects. As in biological evolution, the offspring look similar, but not identical, to their parents, allowing you to explore different designs. Under the hood, there is an evolutionary process in which the genomes of parents are mutated and crossed over to produce new offspring objects. Additionally, the objects are grown from their genomes similar to how a single fertilized egg grows into a jaguar, hawk, or human. This grounding in developmental biology enables the evolution of complex, natural-looking forms. For more info visit: http://endlessforms.com/about_the_technology.
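The breeding loop "under the hood" can be sketched in a few lines: the user's picks become the parents, and mutation and crossover of their genomes produce the next generation. The genomes here are plain number lists, a simplified stand-in for the CPPN genomes the site actually evolves.

```python
import random

def mutate(genome, rate=0.2, sigma=0.3, rng=random):
    """Perturb some genes with Gaussian noise."""
    return [g + rng.gauss(0.0, sigma) if rng.random() < rate else g
            for g in genome]

def crossover(mom, dad, rng=random):
    """Take each gene from one parent or the other at random."""
    return [m if rng.random() < 0.5 else d for m, d in zip(mom, dad)]

def next_generation(picked, pop_size=12, rng=random):
    """Breed a new population from the genomes the user selected."""
    children = []
    while len(children) < pop_size:
        mom, dad = rng.choice(picked), rng.choice(picked)
        children.append(mutate(crossover(mom, dad, rng), rng=rng))
    return children

population = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(12)]
picked = population[:3]                 # the user "likes" these three
population = next_generation(picked)
```

Small mutations to a parent genome yield offspring that look similar but not identical, which is exactly the family resemblance described above.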


Evolved Electrophysiological Soft Robots

The research field of evolutionary robotics abstracts some of the major themes in biological evolution (heritable traits, genetic variation, and competition for scarce resources) as tools that allow computers to generate new and interesting virtual creatures. One recent trend in this field is a move toward more embodied robots: those that produce interesting behavior through the design of their bodies as well as their brains. Here, we build on previous work evolving soft robots to demonstrate the low-level embodiment of electrical signals passing information through muscle tissue. Through this work we attempt to bridge the divide between embodied cognition and abstracted artificial neural networks.


Talk at the Santa Fe Institute: Two Projects in BioInspired AI. Evolving regular, modular, hierarchical neural networks, and robot damage recovery.


Evolving Regular, Modular Neural Networks

I (Jeff Clune) summarize my research into evolving modular, regular neural networks, which are digital models of brains. The property of regularity is produced by using HyperNEAT, a generative encoding based on concepts from developmental biology. The property of modularity arises because we add a cost for connections between neurons in the network. Evolving structurally organized neural networks, including those that are regular and modular, is a necessary step in our long-term quest of evolving computational intelligence that rivals or surpasses human intelligence.
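The connection-cost idea can be sketched as a two-objective comparison: selection rewards both task performance and low wiring cost, so networks that solve the task with fewer (or shorter) connections are favored, and modules emerge. The Pareto dominance check below is a simplified sketch of the multi-objective selection used in that line of work, not the full algorithm.

```python
def connection_cost(weights):
    """Count enabled connections (summed squared length also works)."""
    return sum(1 for w in weights if w != 0)

def dominates(a, b):
    """a dominates b if no worse on both objectives and better on one.

    Each candidate is a (performance, cost) pair: performance is
    maximized, connection cost is minimized.
    """
    perf_a, cost_a = a
    perf_b, cost_b = b
    return (perf_a >= perf_b and cost_a <= cost_b
            and (perf_a > perf_b or cost_a < cost_b))

# Two candidate networks with equal performance: the sparser one wins.
dense = (0.9, connection_cost([0.5, -0.2, 0.1, 0.7]))
sparse = (0.9, connection_cost([0.5, 0.0, 0.0, 0.7]))
```

With equal performance, the network that pays less for its wiring dominates; over generations this pressure prunes cross-module connections and leaves modular structure behind.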


Evolving artificial neural networks with generative encodings inspired by developmental biology

In this dissertation I (Jeff Clune) investigate the difference between generative encodings and direct encodings for evolutionary algorithms. Generative encodings are inspired by developmental biology and were designed, in part, to increase the regularity of synthetically evolved phenotypes. Regularity is an important design principle in both natural organisms and engineered designs. The majority of this dissertation focuses on how the property of regularity enables a generative encoding to outperform direct encoding controls, and whether a bias towards regularity also hurts the performance of the generative encoding on some problems. I also report on whether researchers can bias the types of regularities produced by a generative encoding to accommodate user preferences. Finally, I study the degree to which a generative encoding produces another important design principle, modularity.

Several previous studies have shown that generative encodings outperform direct encodings on highly regular problems. However, prior to this dissertation, it was not known how generative encodings compare to direct encodings on problems with different levels of regularity. On three different problems, I show that a generative encoding can exploit intermediate amounts of problem regularity, which enabled the generative encoding to increasingly outperform direct encoding controls as problem regularity increased. This performance gap emerged because the generative encoding produced regular artificial neural networks (ANNs) that produced regular behaviors. The ANNs evolved with the generative encoding contained a diverse array of complicated, regular neural wiring patterns, whereas the ANNs produced by a direct encoding control were irregular.

I also document that the bias towards regularity can hurt a generative encoding on problems that have some amount of irregularity. I propose a new algorithm, called HybrID, wherein a generative encoding produces regular patterns and a direct encoding modifies those patterns to provide fitness-enhancing irregularities. HybrID outperformed a generative encoding alone on three problems for nearly all levels of regularity, which raises the question of whether generative encodings may ultimately excel not as stand-alone algorithms, but by being hybridized with a further process of irregular refinement.
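The HybrID division of labor can be sketched as a sum of two terms per connection: a regular pattern from the generative encoding, plus a directly encoded per-connection offset that captures the irregular exceptions. The pattern function below is a hypothetical stand-in for a CPPN-produced pattern, not the dissertation's actual encoding.

```python
import math

def regular_pattern(i, j, n=8):
    # Generative part: a smooth, repeated pattern over neuron indices.
    return math.sin(2.0 * math.pi * (i - j) / n)

def hybrid_weight(i, j, deltas):
    # Direct part: per-connection exceptions layered on the pattern.
    return regular_pattern(i, j) + deltas.get((i, j), 0.0)

# Mostly regular wiring, with one fitness-enhancing irregularity:
deltas = {(2, 5): 0.4}
weights = {(i, j): hybrid_weight(i, j, deltas)
           for i in range(8) for j in range(8)}
```

Evolution can tune the pattern and the deltas separately, so the bulk of the wiring stays regular while a handful of connections deviate exactly where the problem demands it.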

The results described so far document that a generative encoding can produce regular solutions. I then show that, at least for the generative encoding in this case study, it is possible to influence the types of regularities produced, which allows domain knowledge and preferences to be injected into the algorithm. I also investigated whether the generative encoding can produce modular solutions. I present the first documented case of this generative encoding producing a modular phenotype on a simple problem. However, the generative encoding's inability to create modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that this encoding produces modular ANNs in response to challenging, decomposable problems.

Overall, this dissertation paints a more complete picture of generative encodings than prior studies. Initially, it demonstrates that, by producing regular ANNs and behaviors, generative encodings increasingly outcompete direct encodings as problem regularity increases. It next documents that a bias towards regularity can harm the performance of generative encodings when problems contain irregularities. The HybrID algorithm suggests a path forward, however, by revealing that a refinement process that fine-tunes the regular patterns produced by a generative encoding can boost performance by accounting for problem irregularities. Finally, the dissertation shows that the generative encoding studied can produce modular networks on simple problems, but may struggle to do so on harder problems. The general conclusion that can be drawn from this work is that generative encodings can produce some of the properties seen in complex, natural organisms, and will likely be an important part of our long-term goal of synthetically evolving phenotypes that approach the capability, intelligence, and complexity of their natural counterparts.


Evolving Gaits for Physical Robots Directly in Hardware with the HyperNEAT Generative Encoding

Some of the gaits evolved by the HyperNEAT algorithm.


Talk summarizing "Evolving Neural Networks That Are Both Modular and Regular: HyperNEAT Plus the Connection Cost Technique"

Talk given by Joost Huizinga at the 2014 GECCO Conference in Vancouver, British Columbia.




Publications