Variation and selection

An evolutionary model of learning in neural networks

Research output: Contribution to journal › Article

Abstract

In the present work, we study the emergence of nontrivial computational capabilities in networks competing against each other in an environment where possession of such capabilities is an advantage. Our approach is to simulate a variable population of network automata. In a way directly analogous to biological evolution, the population will converge, under the influence of selective pressure, to a group of automata that will be optimally suited for solving the task at hand. We visualize the approach as comprising two separate kinds of processes: a low-level 'performance' process and a higher-level 'metaperformance' process.
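The abstract describes the approach only at a high level. As a rough illustration of the two-level scheme it outlines, the Python sketch below evolves a population of small feedforward networks on a toy task: a low-level 'performance' step evaluates each network on the task, and a higher-level 'metaperformance' step applies fitness-proportional selection and Gaussian mutation. The network architecture, the XOR task, the selection rule, the mutation operator, and the fixed population size (the paper itself describes a variable population) are all assumptions made for this sketch, not details taken from the paper.

# Minimal sketch of a two-level variation-and-selection loop.
# All specifics (XOR task, tiny feedforward nets, softmax selection,
# Gaussian mutation, fixed population size) are assumptions for
# illustration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR on two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 3          # assumed hidden-layer size
POP_SIZE = 50         # assumed (fixed) population size
MUTATION_SCALE = 0.3  # assumed mutation strength
GENERATIONS = 200

# input-to-hidden weights + hidden biases + hidden-to-output weights + output bias
N_PARAMS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1


def performance(genome):
    """Low-level 'performance' process: run one network on the task and
    score it (here, negative squared error on XOR)."""
    w1 = genome[: 2 * N_HIDDEN].reshape(2, N_HIDDEN)
    b1 = genome[2 * N_HIDDEN: 3 * N_HIDDEN]
    w2 = genome[3 * N_HIDDEN: 4 * N_HIDDEN]
    b2 = genome[-1]
    hidden = np.tanh(X @ w1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))
    return -np.sum((out - Y) ** 2)


def metaperformance(population, fitness):
    """Higher-level 'metaperformance' process: selection proportional to
    fitness (softmax over scores) plus random variation (Gaussian mutation)."""
    probs = np.exp(fitness - fitness.max())
    probs /= probs.sum()
    parents = rng.choice(len(population), size=len(population), p=probs)
    return population[parents] + rng.normal(0.0, MUTATION_SCALE, population.shape)


population = rng.normal(0.0, 1.0, size=(POP_SIZE, N_PARAMS))
for gen in range(GENERATIONS):
    fitness = np.array([performance(g) for g in population])
    if gen % 50 == 0:
        print(f"generation {gen:3d}  best fitness {fitness.max():.3f}")
    population = metaperformance(population, fitness)

print("final best fitness", max(performance(g) for g in population))

Under these assumptions, the selection-plus-variation loop should drive the population's best fitness upward over the generations, in the spirit of the convergence under selective pressure that the abstract describes.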

Original language: English (US)
Pages (from-to): 75
Number of pages: 1
Journal: Neural Networks
Volume: 1
Issue number: 1 SUPPL
State: Published - 1988
Externally published: Yes

Fingerprint

Biological Evolution
Learning
Neural networks
Population

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)

Cite this

Variation and selection: An evolutionary model of learning in neural networks. / Bergman, Aviv.

In: Neural Networks, Vol. 1, No. 1 SUPPL, 1988, p. 75.

Research output: Contribution to journal › Article

@article{1e255ba24e864679aea043277e669410,
title = "Variation and selection: An evolutionary model of learning in neural networks",
abstract = "In the present work, we study the emergence of nontrivial computational capabilities in networks competing against each other in an environment where possession of such capabilities is an advantage. Our approach is to simulate a variable population of network automata. In a way directly analogous to biological evolution, the population will converge, under the influence of selective pressure, to a group of automata that will be optimally suited for solving the task at hand. We visualize the approach as comprising two separate kinds of processes - a low-level 'performance' process and a higher level 'metaperformance' process.",
author = "Aviv Bergman",
year = "1988",
language = "English (US)",
volume = "1",
pages = "75",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier Limited",
number = "1 SUPPL",
}

TY - JOUR
T1 - Variation and selection
T2 - An evolutionary model of learning in neural networks
AU - Bergman, Aviv
PY - 1988
Y1 - 1988
N2 - In the present work, we study the emergence of nontrivial computational capabilities in networks competing against each other in an environment where possession of such capabilities is an advantage. Our approach is to simulate a variable population of network automata. In a way directly analogous to biological evolution, the population will converge, under the influence of selective pressure, to a group of automata that will be optimally suited for solving the task at hand. We visualize the approach as comprising two separate kinds of processes - a low-level 'performance' process and a higher level 'metaperformance' process.
AB - In the present work, we study the emergence of nontrivial computational capabilities in networks competing against each other in an environment where possession of such capabilities is an advantage. Our approach is to simulate a variable population of network automata. In a way directly analogous to biological evolution, the population will converge, under the influence of selective pressure, to a group of automata that will be optimally suited for solving the task at hand. We visualize the approach as comprising two separate kinds of processes - a low-level 'performance' process and a higher level 'metaperformance' process.
UR - http://www.scopus.com/inward/record.url?scp=0024171911&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0024171911&partnerID=8YFLogxK
M3 - Article
VL - 1
SP - 75
JO - Neural Networks
JF - Neural Networks
SN - 0893-6080
IS - 1 SUPPL
ER -