• AMMDI is an open-notebook hypertext writing experiment, authored by Mike Travers aka @mtraven. It's a work in progress and some parts are more polished than others. Comments welcome! More.
Incoming links
from goddinpotty/TODOs
  • DONE bug On Dennett, link to The Society of Mind not rendering
    • also the italics in that quote come out with the stars rather than rendered properly. Might be a thing with quotes?
from Kenan Malik
  • I'm not sure he does Dennett et al justice.
from The Embodied Mind
  • As Dennett puts it, “Although the new [cognitivist] theories abound with deliberately fanciful homunculus metaphors—subsystems like little people in the brain sending messages back and forth, asking for help, obeying and volunteering—the actual sub-systems are deemed to be unproblematic nonconscious bits of organic machinery, as utterly lacking in point of view or inner life as a kidney or kneecap.” In other words, the characterization of these “sub-personal” systems in “fanciful homunculus metaphors” is only provisional, for eventually all such metaphors are “discharged”—they are traded in for the storm of activity among such selfless processes as neural networks or AI data structures.
from Stances, a Catalog
  • Distinguish from Dennett Design Stance.
from William Irwin Thompson
  • If one has an inappropriate vision in the imagination, one generates an inappropriate “phase-portrait for the geometry of behavior” of the self. Our culture, lacking a vision of a multidimensional model of consciousness, simply oscillates back and forth between an excessively reified materialism and a compensatorily hysterical nihilism. This Nietzschean nihilism, in all its deconstructionist variants, has pretty much taken over the way literature is studied in the universities, and it also rules the cognitive science of Marvin Minsky, Dan Dennett, and Patricia and Paul Churchland, in which the self is looked upon as a superstition that arose from a naive folk psychology that existed before the age of enlightenment brought about by computers and artificial intelligence. This materialist/nihilist mind-set controls the universities.
    • Well that's a pretty standard take, can't say that I'm interested. More interesting is that he talks about The Embodied Mind.
from Weird Studies/William James on Consciousness
  • Some intro stuff on trying to define consciousness. Dennett appears as the enemy, they take his Consciousness Explained to be explaining it away, which I'm not sure is accurate.
from Materialism, Terry Eagleton
  • A synoptic overview of a bunch of quite different strains in philosophy that all in various ways share the name "materialism". It explicitly doesn't pay much attention to the kind of materialism I am familiar with, the scientific naturalism of Dennett, cognitive science, and general tech-atheist-rationalist discourse. It's focused more on people like Marx, Nietzsche, and the various ways they have thought about the body and the corporeal nature of thought and existence.
from designer stance
  • Distinguish from Dennett Design Stance.
from antiphilosophy
  • Most philosophy strikes me as amazingly wrongheaded and I can't bear it. OTOH, there are exceptions, philosophical writing that is clarifying (Dennett, Andy Clark, that sort, those that are basically theoretical cognitive scientists) or bracing/dizzying (Nietzsche, Deleuze, Sloterdijk). These don't feel like they should be the same field, to be honest, and I certainly read them with completely different sets of motivations and expectations.
from Some books on writing
  • (When I read that one, I was taken aback by how closely it matched my own thinking. Then I looked at the back where the acknowledgements are grudgingly placed, and found it was written by Daniel Dennett, which would explain that.)
from Agency: notes and references
  • Levin & Dennett

    • when cognitive science turned its back on behaviourism more than 50 years ago and began dealing with signals and internal maps, goals and expectations, beliefs and desires, biologists were torn. All right, they conceded, people and some animals have minds…processing information and guiding purposeful behaviour…They resisted introducing intentional idioms into their theoretical work, except as useful metaphors when teaching or explaining to lay audiences. Genes weren’t really selfish, antibodies weren’t really seeking, cells weren’t really figuring out where they were. These little biological mechanisms weren’t really agents with agendas, even though thinking of them as if they were often led to insights.
    • We think that this commendable scientific caution has gone too far,
    • We reject a simplistic essentialism where humans have ‘real’ goals, and everything else has only metaphorical ‘as if’ goals. [we now can] move past this kind of all-or-nothing thinking about the human animal – naturalising human capacities and swapping a naive binary distinction for a continuum of how much agency any system has.
    • Teleophobia
    • Huh I must say I am surprised to see Dennett on my side in this battle (is it a battle?).
    • Agents, in this carefully limited perspective, need not be conscious, need not understand, need not have minds, but they do need to be structured to exploit physical regularities that enable them to use information (following the laws of computation) to perform tasks
    • Hm I would say purposive, regardless of their information-processing capacities, but maybe that is quibbling.
    • There's a just-so story about the evolution of cooperation between cells using the Prisoner's Dilemma, but it ignores genetics, which is probably wrong. It implies cells are self-interested, when really they are communists.
    • Weird digression into morphogenesis? I guess there is an analogy there?
    • Confusing adaptivity with agency?
    • The key dynamic that evolution discovered is a special kind of communication allowing privileged access of agents to the same information pool, which in turn made it possible to scale selves. This kickstarted the continuum of increasing agency.
    • If you agree that there is some mechanism by which electrically active cells can represent past memories, future counterfactuals and large-scale goals, there is no reason why non-neural electric networks wouldn’t be doing a simplified version of the same thing to accomplish anatomical homeostasis.
    • tulpas kind of encapsulate agency in a very practical form.
from @Against Narrativity
  • Dennett and Charles Taylor quoted as in support of narrativity in general. Taylor calls it an ‘inescapable structural requirement of human agency’.
Twin Pages

07 Sep 2022 03:38 - 07 Sep 2022 03:49

    • The one good philosopher. Well, not quite, but the one whose viewpoint is most congenial to MIT-flavored AI, which is kind of my default belief system.
    • Many of those same theorists [who support Fodor modules] have been lukewarm-to-hostile about Marvin Minsky's Agents, who form The Society of Mind (1985). Minsky's Agents are homunculi that come in all sizes, from giant specialists with talents about as elaborate as those of Fodorian modules, down to meme-sized agents (polynemes, micronemes, censor-agents, suppressor-agents, and many others). It all looks too easy, the skeptics think. Wherever there is a task, posit a gang of task-sized agents to perform it—a theoretical move with all the virtues of theft over honest toil....
      • – Consciousness Explained, p. 261