Computers come from apes.
In celebration of Darwin’s 200th birthday, I’ll be participating in a panel discussion with members of my university’s departments of anthropology and biology. If you are both a Brain Hammer-head and a WillyPee-head, or whatever you call ‘em, come see “Evolution: Truth or Myth” at 6pm in the Student Center Multi-purpose Room (near the food court).
In preparation for the event I was thinking about some of my research on artificial life and evolving simple synthetic intelligences. A little auto-googling popped up this summary of a paper I co-authored with some former students, “Evolving Artificial Minds and Brains”. The following is an excerpt from an introductory essay for the volume in which the paper appears. The editors of the volume and authors of the essay are Andrea C. Schalley and Drew Khlentzos. They do a pretty good job, except they miss the point that, since mere responsiveness to stimuli is insufficient for mindedness, we are looking at nematode chemotaxis precisely because, in involving a memory, it crosses a threshold marking a difference in kind between mere reactivity and intelligence.
In “Evolving artificial minds and brains” Pete Mandik, Mike Collins and Alex Vereschagin argue for the need to posit mental representations in order to explain intelligent behaviour in very simple creatures. The creature they choose is the nematode worm and the behaviour in question is chemotaxis. Many philosophers think that a creature’s brain state or neural state cannot count as genuinely mental if the creature lacks any awareness of it. Relatedly, they think that only behaviour the creature is conscious of can be genuinely intelligent behaviour. When the standards for mentality and intelligence are set so high, very few creatures turn out to be capable of enjoying mental states or exhibiting intelligent behaviour. Yet the more we learn about sophisticated cognitive behaviour in apparently simple organisms the more tenuous the connection between mentality and consciousness looks.
If there is a danger in setting the standards for mentality and intelligence too high, there is equally a danger in setting them too low. Many cognitive scientists would baulk at the suggestion that an organism as simple as a nematode worm could harbour mental representations or behave intelligently. Yet Mandik, Collins and Vereschagin argue that the worm’s directed movement in response to chemical stimuli does demand explanation in terms of certain mental representations. By “mental representations” they mean reliable forms of information about the creature’s (chemical) environment that are encoded and used by the organism in systematic ways to direct its behaviour.
To test the need for mental representations they construct neural networks that simulate positive chemotaxis in the nematode worm, comparing a variety of networks. Thus networks that incorporate both sensory input and a rudimentary form of memory in the form of recurrent connections between nodes are tested against networks without such memory and networks with no sensory input. The results are then compared with the observed behaviour of the nematode. Their finding is that the networks with both sensory input and the rudimentary form of memory have a distinct selectional advantage over those without both attributes.
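To see why that rudimentary memory makes such a difference, here is a minimal illustrative sketch (my own toy example, not the evolved networks from the paper): an agent climbing a chemical gradient can only tell whether it is moving toward the nutrient source by comparing its current sensor reading with a remembered previous reading. A purely reactive agent, with sensory input but no stored past state, has no basis for correcting its heading. The gradient function and parameter values below are made up for illustration.

```python
# Toy illustration of why a one-step memory matters for chemotaxis.
# This is NOT the authors' model; it just isolates the role that
# recurrent connections play as a rudimentary memory.

def concentration(x):
    """Hypothetical 1-D chemical gradient with a nutrient source at x = 0."""
    return 1.0 / (1.0 + abs(x))

def run_agent(has_memory, start=10.0, steps=200, step_size=0.5):
    """Return the chemical concentration at the agent's final position."""
    x = start
    direction = 1.0            # current heading: +1 or -1
    prev = concentration(x)    # the one-step "memory" of the last reading
    for _ in range(steps):
        x += direction * step_size
        now = concentration(x)
        if has_memory:
            # Concentration fell, so we are heading away: turn around.
            if now < prev:
                direction = -direction
            prev = now
        # A memoryless agent senses `now` but cannot compare it with
        # anything, so it has no reason to turn and keeps going straight.
    return concentration(x)
```

Running both agents from the same start, the remembering agent ends up oscillating near the source while the reactive one wanders ever further away, which is the qualitative asymmetry the simulations exploit.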
Even if it is too much to require mental states to be conscious, there is still the sense that there is more to mentality than tracking and responding to environmental states. One worry is that there is simply not enough plasticity in the nematode worm’s behaviour to justify the attribution of a mind. A more important worry is that the nematode does not plan - it is purely at the mercy of external forces pushing and pulling it in the direction of nutrients. In this regard, it is instructive to compare the behaviour of the nematode worm with the foresighted behaviour of the jumping spider, Portia labiata. Portia is able to perform some quite astonishing feats of tracking, deception and surprise attack in order to hunt and kill its (often larger) spider prey. Its ability to plot a path to its victim would tax the computational powers of a chimpanzee let alone a rat. It has the ability to plan a future attack even when the intended victim has long disappeared from its sight. Portia appears to experiment and recall information about the peculiar habits of different species of spiders, plucking their webs in ways designed to arouse their interest by simulating the movements of prey without provoking a full attack. Yet where the human brain has 100 billion brain cells and a honeybee’s one million, Portia is estimated to have no more than 600,000 neurons!