About
    • AMMDI is an open-notebook hypertext writing experiment, authored by Mike Travers aka @mtraven. It's a work in progress and some parts are more polished than others. Comments welcome! More.
Incoming links
from orthogonality thesis
  • Rationalism eschews specific goals and goods. They are relegated to the word "values" or "utility"; the focus of attention is on the powerful and fully general machinery of goal-satisfaction, for both human and computational agents.
from Heidegger on the Connection between Nihilism, Art, Technology and Politics
  • Well that is – unintuitive. That sounds like an even harsher critique of Rationalism than I typically make.
from eigenselves
from Making of a Counter-Culture
  • The quotes below really highlight for me how much Rationalism is a reactionary movement against sixties romanticism. That doesn't make it wrong – there were plenty of reasons to turn against that stuff – but explains a bit its cultural and political penumbra.
from libertarianism
  • Rationalism tends toward libertarianism, although it's not universal. And I really do think that their libertarianism is motivated more by a fondness for elegant distributed mechanisms than by a desire to slaughter leftists. Whatever the motivation, the ideas are deeply intertwined, and probably my objections to them are intertwined as well.
from autistic
  • Rationalism is sort of autism turned into a movement. It fascinates me but I hate it, and I guess that's a function of my personal relationship to my own autism.
from SlateStarCodex
  • Slate Star Codex is the former blog of Scott Alexander (aka Scott Siskind aka SSC), the most widely-read person in the Rationalism sphere these days (2020 or so).
from LWMap/Being a Robust Agent
  • Because Rationalism is about an idealized version of thinking, it doesn't have much interest in the ways that humans (so far, the only examples we have of intelligent agents) actually work. It aims to make humans more closely approximate the ideal, even though the ideal is monstrous when taken to its logical extremes.
from torture
  • Rationalism seems to have a certain unhealthy interest in the subject; Roko's Basilisk involves an AI who tortures simulated humans, and Robin Hanson, whose whole schtick is tossing up horrible things without displaying horror, regularly coughs up suggestions like Torture Kids Instead. Extremely creepy.
from antipolitics
  • Rationalism features a prominent disdain for politics. There are many good reasons of course to hate politics, but disliking something does not make it thereby unimportant. And it doesn't excuse you from participation in the actual politics of the present day.
from Meaningness
  • David Chapman (aka @meaningness) has been a major influence on my own thinking. His work at the MIT AI lab with Phil Agre made a deep impression on me when I was trying to figure out my own academic path. This included a critical take on the standard cognitive science view of the mind, which is pretty much Rationalism minus the more cultish and cartoonish aspects.
from About
from Agency Made Me Do It
  • A caution: my goal is not to write a self-help book or a manual on how to acquire more agency. I guess this is something people might be looking for, given how it is basically the promise of a whole subindustry of productivity and self-help gurus, and a concern of Rationalism (see LWMap/Being a Robust Agent).
from Technic and Magic
  • Technic is the force behind our present world, and so responsible for its well-known flaws, but its exact nature is a bit unclear. It seems closely tied to technology, rationalism, modernism, abstraction, and capitalism, but is not quite any of those. It is relentlessly instrumental, purposeful, and totalizing in its use of language. It makes up our world and is also intent on destroying even the possibility of a world. It's too powerful to defeat, but there are other worlds available that Technic does not rule, and they offer the possibility of escape.
from LWMap/Agency: Introduction
  • Each volume of A Map That Reflects the Territory has a short introduction to its theme. I'm going to dissect a few quotes from the introduction to Agency, because they seem to compactly and precisely embody my issues with Rationalism in general:
from postrationalism
  • A difficult to define ideology, but literally means people who have moved beyond Rationalism. Meaningness and Ribbonfarm are usually taken to be postrationalist, and I'm close enough to those precincts that it probably means I am too, although in truth I was probably never Rationalist enough to qualify.
from paperclip maximizer
  • This seems entirely implausible to me. Part of this exercise is to investigate and defend that intuition and related doubts about the ironclad mathematical certainties that Rationalism produces so effortlessly.
from neoreaction
  • An extreme right-wing political ideology that for some reason has a serious following in a subset of the technology world, with considerable overlap with the Rationalism community. Also called NRx by those in the know. Neoreactionaries don't like to be called fascists or white nationalists, but their writings contain astounding levels of toxic racism and calls for violence, just the sort of thing you would expect from fascists or white nationalists.
    • Scott Alexander has put a lot of effort into distancing himself from neoreaction, which is good, but he's basically in trouble for being in a social position where he had a need to do that. Which might not be quite fair, but this is how things work.
from anti-purpose
from LWMap/Meta-honesty
  • To say this is wrong is kind of an understatement; it strikes me as aggressively wrong, deliberately retro, an attempt to stick one's head in the sand to evade the postmodern condition. And it's foundational to Rationalism.
from "rationality" vs "rationalism"
  • Well that's kind of embarrassing – I wrote tons of stuff about Rationalism without realizing that it is a label eschewed by rationalists themselves:
    • To call something an “ism” suggests that it is a matter of ideology or faith, like Trotskyism or creationism.... So, my suggestion is to use "rationality" consistently and to avoid using "rationalism". Via similarity to "scientist" and "physicist", "rationalist" doesn't seem to have the same problem.
from LWMap/Agency: Introduction
  • There's a whole lot of this that I disagree with (see Rationalism), but here I just want to point out how it leads to a distorted and arguably harmful view of human agency as somehow deficient because it is not pitiless and single-minded.
from gradgrindism
  • Rationalists are not really Gradgrinds; they are in fact a pretty playful and imaginative bunch in their way. But their ideology is grim, and their nightmares of the paperclip maximizer have a Gradgrindian aspect to them.
from nihilism
  • The need to find some kind of value or purpose in a meaningless universe is kind of an unacknowledged note throughout Rationalism discourse, emerging in its nightmares like the paperclip maximizer or Roko's Basilisk, or in its attempts to wax poetic about utilitarianism. This is not really meant as a criticism. I see Rationalism as a sincere attempt to build something necessary – a religion, a shared way of making meaning – on top of the unpromising nihilist foundations of the materialist worldview. I'm sympathetic to their goals and efforts but kind of dubious about their solution.
from LWMap/The Rocket Alignment Problem
  • Yudkowsky's long list of objections and refutations makes me realize that my own quarrels with Rationalism probably aren't that interesting; they've already heard my objections a hundred times and have already dealt with and dismissed them. (They remain interesting to me though, if only because they help me clarify my own ideas).
from optimizing
  • One of my gripes with Rationalism is the unquestioned assumption that intelligence is about optimizing some quantity. Closely related to the similar gripe about winning. I find this an impoverished way to think about the mind.
from haecceity
  • Isn't this about the same as that other advanced vocabulary term, ipsissimosity? (Note: this has some bearing on my quarrels with Rationalism and the "objective spirit")
from 2021 Year-end review
  • What you see before you. I started this just before the last new year, polished it off in the first few months, and have been adding and tweaking it since. Haven't got the response I hoped for, but then I'm not pushing it very hard and it isn't clear what it's for. I had thought it might spur some dialog with Rationalism, but that didn't happen, and it's not clear it would be interesting in any case. I might call it a failure if it had any actual goals.
    • Very little feedback, but here is one: [What are the best blogs that is "opposed" to LW? : SneerClub]
    • Also got some appreciation from the guys on the Weird Studies Discord.
from YMCYL Kindle Notes
  • I thought this was the greatest description of Rationalism ever and have been dropping it a lot.
from The Enigma of Reason
  • Book by Hugo Mercier and Dan Sperber that has an interesting, non-Rationalist view of reason. Rather than striving to attain objectivity, reason is inherently purposeful and interested. This seems pretty common-sensical I suppose, but in this context it's kind of radical.
from SneerClub
  • SneerClub is the only anti-Rationalism hangout I know of. It's a bit too sneery for me, but I comment there on occasion. I have mixed feelings about Rationalism, and while I think their core beliefs are wack their practices are occasionally worthy and interesting, and I don't really want to take a sneering stance towards them. But sometimes I can't resist.
from atheism
  • The modern experiment has been to see if you could get by without it – whether you can drop god-talk and still retain good-talk, that is, any coherent notion of value. My sense is that this is an utterly failed project, and you can see the church of Rationalism as a well-intentioned but quite hopeless attempt to conjure up a full culture based on atheist utilitarianism.
from bird's eye view vs. frog's eye view
  • It seems to me that subjectivity and objectivity need to be balanced and integrated. Too much emphasis on the objective, and you get eliminativism or gradgrindism or Rationalism. Too much emphasis on the subjective and you get the rancid aspects of postmodernism and whatever it is that seems to afflict the younger generation, a kind of toxic emotional entitlement.
from LWMap/Being a Robust Agent
  • I think I've arrived at a compact understanding of what Rationalism is:
    • start with the natural goal-seeking and problem-solving abilities of actual humans
    • abstract this out so you have a model of goal-seeking in general.
    • assume that computational technology is going to make hyperaccelerated and hypercapable versions of this process (ignoring or confusing the relationship between abstract and actual goal-seekers)
    • notice that this is dangerous and produces monsters.
from AI Risk
  • The long-term superintelligence risk that is an obsession of Rationalism.
from illegibility
  • Rationalism seems too oriented towards legibility, for my taste at least. As I've said elsewhere, they seem intellectually retro, and haven't gotten the news about the limits of reason and representation:
    • The whole movement is kind of retro in a way that is sometimes appealing but just as often appalling. Peter Sloterdijk labelled rationalists as "the Amish of postmodernism" and it often does seem like an effort to be staunchly and cluelessly devoted to ideas that nobody really takes seriously any more.
from LWMap/A Map That Reflects the Territory
  • The Rationalism community has packaged up some of the best of LessWrong into book form, and when I saw that one of the five focus topics was agency I could not resist asking for a review copy, that being something of a pet subject of mine. Now I have to follow through with a review, and I'm taking the opportunity to also completely rebuild my writing and publishing stack.
from LWMap/Coherence Arguments Do Not Imply Goal Directed Behavior
  • The Rationalism counter to this, I think, is to say that humans are imperfectly rational due to the accidents of evolution, but AIs, being designed and untroubled by the complexity of biology, will be able to achieve something closer to theoretical rationality. Since this is provably better than what humans do, humans are potentially in deep trouble. Hence they have taken on the dual task of making humans more rational, and figuring out how to constrain AIs so they won't kill us.
from Technic and Magic
  • Technic is not exactly technology, but it's close enough to make me slightly defensive. As a software guy I have a professional interest in untangling technology from the bad ideas it is associated with, including Rationalism and the like.
from SlateStarCodex
  • My general opinion: he's an amazingly prolific and clever writer but there's something off about his viewpoint. This is part of what has gotten him into trouble with the mainstream, and it's quite related to my general objections to Rationalism. I've written quite a bit trying to pick apart some of his posts, and I freely admit that has generally been a very rewarding intellectual experience even if I don't end up vibing with him.
from winning
  • The constant references to "winning" in Rationalism discourse really grate on my nerves. I get what work it is doing – it's suggesting that life is a kind of competitive game, in which there is some kind of scoring metric, and you are able to compare your score with others. The best, most rational ideas are those that produce the most winning.
from About
Twin Pages

Rationalism

21 Dec 2020 03:59 - 01 Jan 2022 07:48

    • Rationalism is a movement of nerdy types (in both the best and worst senses), centered around the LessWrong website. Should be distinguished from small-r rationalism, which is just a philosophical position. Rationalism goes beyond the small-r version in that it is also a self-help movement that tries to promote what it considers better ways of thinking and being.
    • It has a very particular theory of what that means, comprising a theory of knowledge (representational objectivism) and of action (optimizing aka winning). Both of these theories seem extremely weak to me, in that they don't adequately describe the natural phenomena they are supposed to be about (human intelligence) and they don't serve as an adequate guide for building artificial versions of the same.
    • Nevertheless they manage to do a lot of interesting thinking based on this inadequate framework, and they attract smart and interestingly weird people, so I find myself paying them attention despite my disdain for their beliefs. A lot of this text is about me trying to work out this contradiction.
    • The other component of Rationalism is a belief that superintelligent AI is just around the corner and poses a grave ("existential") threat to humanity, and it is their duty to try to prevent this.
    • Rationalists have founded MIRI (the Machine Intelligence Research Institute) to deal with this problem, and CFAR (the Center for Applied Rationality) to promulgate rationalist self-improvement techniques. They are also tightly connected to the Effective Altruism movement. They've attracted funding from shady Silicon Valley billionaires and allies from within the respectable parts of academia. And they constitute a significant subculture within the world of technology and science, which makes them important. They are starting to penetrate the mainstream, as evidenced by this New Yorker article about some drama on the most popular rationalist blog, SlateStarCodex.
    • [update 2/13/2021: the mentioned New York Times article finally dropped and it seems pretty fair.]
      • SlateStarCodex was a window into the Silicon Valley psyche. There are good reasons to try and understand that psyche, because the decisions made by tech companies and the people who run them eventually affect millions.
    • Rationalists occasionally refer to their movement as a "cult" in a half-ironic way. It has a lot of the aspects of a cult: an odd belief system, charismatic founders, apocalyptic prophecies, standard texts, and a certain closed-world aspect that both draws people in and repels outsiders. But it's a cult for mathematicians, and hence its belief system is a lot stronger and more appealing than, say, that of Scientology.
    • The NYT article has a quote by Scott Aaronson (a Rationalist-adjacent mathematician):
      • They are basically just hippies who talk a lot more about Bayes’ theorem than the original hippies.
    • Now, this is quite true in that Rationalists constitute a subculture and have established a network of group houses, have a lot of promiscuous sex (aka "polyamory"), and are into psychedelics. On the other hand in Meditations on Meditations on Moloch I find that they've inverted some key hippie attitudes, for better or worse. They embrace what the hippies rejected and want to build a world on different principles.
    • Some admirable things about Rationalists

      • They are super-smart of course. They seem to attract mathematicians who are too weird for academia, and we sure need more people like that.
      • Their ideas tend to be simple, precise, and stated with extreme clarity.
      • They want to save the world and otherwise do good.
      • They are serious and committed about putting their ideas into practice.
      • They are very reflective about their own thinking, and seek to continually improve it.
    • My major gripes

      • Assuming that the overarching goal of life is "winning"
      • Overly mathematical (confusing map with territory)
      • Occasional extreme arrogance
      • A sort of impenetrable closed-world style of self-justification.
      • Retro taste in ideas and sometimes esthetics.
        • Sloterdijk made a good crack (in You Must Change Your Life) about small-r rationalists; he called them "the Amish of postmodernism". Of course if that metaphor holds, then I should leave them alone to their quaint and deeply held beliefs, which might end up being superior to the mainstream for long-term survivability.
      • Connections (socially and intellectually) to unpleasant political movements like libertarianism, objectivism, and neoreaction, fueled by an antipolitics stance that is ultimately shallow.
      • Taking as axiomatic things that are extremely questionable at best (orthogonality thesis, Bayesianism as a theory of mind).
    • A Rationalist 2x2

      • OK, this was really just an experiment to see if I could make a 2x2 table in Roam, and yes, I could and it was pretty easy!
      • The upper-left and bottom-right quadrants are pretty self-explanatory.
      • The top-right is a bit contradictory because the claim of importance is central to Rationalism. They believe they are literally saving the world from likely destruction by superintelligent AIs, and what could be more important than that? But they could be wrong about their importance while still producing intellectual value, so this represents that possibility.
      • The bottom-left represents the possibility that Rationalism is not only wrong, but harmful, in that it distracts smart people from working on real problems, and to the extent it becomes a dominant ideology in the tech world it becomes that much more harmful. Also to the extent that Rationalism is an ally of bad political ideas (considerable), it's not just a harmless nerd social club.