• AMMDI is an open-notebook hypertext writing experiment, authored by Mike Travers aka @mtraven. It's a work in progress and some parts are more polished than others. Comments welcome!
Incoming links
from Computer Power and Human Reason
  • From The Empathy Diaries
    • Programs, argued Weizenbaum, can be written to help machines make decisions. But only people have the capacity for moral choice. This is ultimately what makes us human. And human choice is the product of judgment, not calculation. It includes nonmathematical factors, such as emotions.
    • Then Weizenbaum made a leap that I hadn’t: It was therefore immoral to teach children to program. When he looked at the children Seymour [Papert] was teaching, Weizenbaum didn’t see the wonder of “powerful ideas.” He saw the spread of a dangerous way of thinking to the young and vulnerable. Programming served as a primer in instrumental reason, a form of rationality that focused on the most efficient means to achieve an end but did not reflect on the values of that end. The act of programming, said Weizenbaum, encouraged programmers to live in the closed world of the machine. Where Seymour saw computers as a privileged place to learn, Weizenbaum saw a place where you could forget other people and your emotional and moral responsibilities to them.
from Sherry Turkle
  • author of The Empathy Diaries and The Second Self (among others). She has had a long career at MIT as a professor of science and technology studies (STS) and is a long-time student of the cultures and practices surrounding technology. As an ethnographer of those cultures she plays the role of a professional outsider, with a reputation as a "killjoy" (her term, from the epilogue of The Empathy Diaries); the Skyler White to the bad-boy Heisenbergs of high technology.
Twin Pages

The Empathy Diaries

14 Jul 2021 05:55 - 22 Feb 2022 11:41

    • One of my great allies at MIT, from my earliest days, was Professor Joseph Weizenbaum, an early critic of instrumental reasoning and where it led, an early critic of where Artificial Intelligence unexamined in its premises could lead us. I write about our relationship in my memoir, The Empathy Diaries. He felt betrayed when I married an AI scientist. He thought it would influence my thinking, assuming that my interest in studying children and the Logo language was not born of intellectual curiosity but due to my love for Seymour Papert. He was wrong. I was both in love with Seymour and in love with the question: “How does programming change the way we think?”
    • As soon as children began to consider how the computer worked, they were led to consider whether or not it was alive. Since the computer’s programming made it seem “sort of alive,” they wondered if people were programmed as well. How exactly, asked children, were people different from machines? The question of free will, I thought, was what sex had been to the Victorians: threat and obsession, taboo and fascination. I thought these conversations were positive. I couldn’t agree with Weizenbaum, who wanted me to say, prior to investigation, that putting children and computers together was always a bad thing. I invited Weizenbaum to work with me; we could investigate children’s responses to computers together. But where I saw empirical questions, he saw a philosophical absolute. On this, we agreed to disagree. Yet, at MIT, we were solid allies in a larger critique of the engineering culture. (p???)