Evolving neural networks that are both modular and regular: HyperNEAT plus the Connection Cost Technique

Author(s): 
Huizinga J
Mouret JB
Clune J
Year: 
2014
Abstract: 

One of humanity’s grand scientific challenges is to create artificially intelligent robots that rival natural animals in intelligence and agility. A key enabler of such animal complexity is the fact that animal brains are structurally organized in that they exhibit modularity and regularity, amongst other attributes. Modularity is the localization of function within an encapsulated unit. Regularity refers to the compressibility of the information describing a structure, and typically involves symmetries and repetition. These properties improve evolvability, but they rarely emerge in evolutionary algorithms without specific techniques to encourage them. It has been shown that (1) modularity can be evolved in neural networks by adding a cost for neural connections and, separately, (2) that the HyperNEAT algorithm produces neural networks with complex, functional regularities. In this paper we show that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity. Our results represent a stepping stone towards the goal of producing artificial neural networks that share key organizational properties with the brains of natural animals.
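The connection cost technique referenced in the abstract treats the total cost of a network's connections as a second objective to be minimized alongside task performance (optimized with NSGA-II, per the paper's keywords). A common choice of cost, used here purely as an illustration, is the sum of squared connection lengths given neuron positions on a 2D substrate; the function and layout below are a hypothetical sketch, not the paper's exact implementation.

```python
def connection_cost(connections, positions):
    """Sum of squared Euclidean lengths of all connections.

    connections: iterable of (source_id, target_id) pairs.
    positions: dict mapping neuron id -> (x, y) coordinate on the substrate.
    """
    return sum(
        (positions[s][0] - positions[t][0]) ** 2
        + (positions[s][1] - positions[t][1]) ** 2
        for s, t in connections
    )

# A tiny 3-neuron network on a hypothetical 2D substrate layout.
positions = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0)}
connections = [(0, 1), (0, 2), (1, 2)]
print(connection_cost(connections, positions))  # 1 + 1 + 2 = 4.0
```

In a multi-objective setup, each individual would then be evaluated as the pair (performance, -connection_cost), so selection pressure favors sparse, short-wired networks, which is the mechanism shown to encourage modularity.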


Evolving Artificial Neural Networks That Are Both Modular and Regular

Talk at the Santa Fe Institute: Two Projects in Bio-Inspired AI. Evolving regular, modular, hierarchical neural networks, and robot damage recovery.

Talk summarizing "Evolving Neural Networks That Are Both Modular and Regular: HyperNEAT Plus the Connection Cost Technique"

Talk given by Joost Huizinga at the 2014 GECCO Conference in Vancouver, British Columbia.

Pub. Info: 
Proceedings of the 2014 Genetic and Evolutionary Computation Conference (GECCO '14), pp. 697–704
BibTeX: 

@inproceedings{Huizinga:2014:ENN:2576768.2598232,
author = {Huizinga, Joost and Clune, Jeff and Mouret, Jean-Baptiste},
title = {Evolving Neural Networks That Are Both Modular and Regular: HyperNEAT Plus the Connection Cost Technique},
booktitle = {Proceedings of the 2014 Conference on Genetic and Evolutionary Computation},
series = {GECCO '14},
year = {2014},
isbn = {978-1-4503-2662-9},
location = {Vancouver, BC, Canada},
pages = {697--704},
numpages = {8},
url = {http://doi.acm.org/10.1145/2576768.2598232},
doi = {10.1145/2576768.2598232},
acmid = {2598232},
publisher = {ACM},
address = {New York, NY, USA},
keywords = {HyperNEAT, NSGA-II, artificial neural networks, modularity, regularity},
}