If one has an inappropriate vision in the imagination, one generates an inappropriate “phase-portrait for the geometry of behavior” of the self. Our culture, lacking a vision of a multidimensional model of consciousness, simply oscillates back and forth between an excessively reified materialism and a compensatorily hysterical nihilism. This Nietzschean nihilism, in all its deconstructionist variants, has pretty much taken over the way literature is studied in the universities, and it also rules the cognitive science of Marvin Minsky, Dan Dennett, and Patricia and Paul Churchland, in which the self is looked upon as a superstition that arose from a naive folk psychology that existed before the age of enlightenment brought about by computers and artificial intelligence. This materialist/nihilist mind-set controls the universities.
When cognitive science turned its back on behaviourism more than 50 years ago and began dealing with signals and internal maps, goals and expectations, beliefs and desires, biologists were torn. All right, they conceded, people and some animals have minds … processing information and guiding purposeful behaviour … They resisted introducing intentional idioms into their theoretical work, except as useful metaphors when teaching or explaining to lay audiences. Genes weren’t really selfish, antibodies weren’t really seeking, cells weren’t really figuring out where they were. These little biological mechanisms weren’t really agents with agendas, even though thinking of them as if they were often led to insights.
We think that this commendable scientific caution has gone too far.
We reject a simplistic essentialism where humans have ‘real’ goals, and everything else has only metaphorical ‘as if’ goals. [We now can] move past this kind of all-or-nothing thinking about the human animal – naturalising human capacities and swapping a naive binary distinction for a continuum of how much agency any system has.
Agents, in this carefully limited perspective, need not be conscious, need not understand, need not have minds, but they do need to be structured to exploit physical regularities that enable them to use information (following the laws of computation) to perform tasks.
The key dynamic that evolution discovered is a special kind of communication that gives agents privileged access to the same information pool, which in turn made it possible to scale selves. This kickstarted the continuum of increasing agency.
If you agree that there is some mechanism by which electrically active cells can represent past memories, future counterfactuals and large-scale goals, there is no reason why non-neural electric networks wouldn’t be doing a simplified version of the same thing to accomplish anatomical homeostasis.
As Dennett puts it, “Although the new [cognitivist] theories abound with deliberately fanciful homunculus metaphors—subsystems like little people in the brain sending messages back and forth, asking for help, obeying and volunteering—the actual sub-systems are deemed to be unproblematic nonconscious bits of organic machinery, as utterly lacking in point of view or inner life as a kidney or kneecap.” In other words, the characterization of these “sub-personal” systems in “fanciful homunculus metaphors” is only provisional, for eventually all such metaphors are “discharged”—they are traded in for the storm of activity among such selfless processes as neural networks or AI data structures.
Many of those same theorists [who support Fodor modules] have been lukewarm-to-hostile about Marvin Minsky's Agents, who form The Society of Mind (1985). Minsky's Agents are homunculi that come in all sizes, from giant specialists with talents about as elaborate as those of Fodorian modules, down to meme-sized agents (polynemes, micronemes, censor-agents, suppressor-agents, and many others). It all looks too easy, the skeptics think. Wherever there is a task, posit a gang of task-sized agents to perform it—a theoretical move with all the virtues of theft over honest toil....