optimizing

30 Oct 2021 - 28 Jun 2025
    • One of my gripes with Rationalism is the unquestioned assumption that intelligence is about optimizing some quantity. This is closely related to my similar gripe about winning. I find this an impoverished way to think about the mind.
    • That equation of rationality with maximizing some quantity is often grounded out in the Von Neumann–Morgenstern utility theorem, which shows that an agent whose preferences satisfy certain axioms of rationality will behave as if it is maximizing the expected value of some utility function. This is unconvincing because the axioms involved do not correspond to how real minds work.
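      • For reference, a minimal statement of the theorem (my paraphrase, using standard textbook notation): if an agent's preferences $\succeq$ over lotteries satisfy the four axioms (completeness, transitivity, continuity, independence), then there exists a utility function $u$ on outcomes $x_i$ such that, for any lotteries $L$ and $M$ assigning probabilities $p_i$ and $q_i$ to those outcomes,
        $$L \succeq M \iff \sum_i p_i\,u(x_i) \ge \sum_i q_i\,u(x_i)$$
        with $u$ unique up to positive affine transformation. Note that the theorem only licenses "behaves as if maximizing"; the objection here is to the axioms, not the derivation.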
    • The informal intuition is that the logics of evolution and economics seem to inexorably impose competition on us; so there is always going to be something corresponding to winning and losing, and thus to scoring high or low, even when that is not an explicit way of thinking.
    • I'm a conflict theory type of guy, so I would not want to go against the idea that competition is fundamental. Something about the optimization framework doesn't sit right with me, though. Humans have the unique ability to redefine their own competitive landscapes, and reducing all intelligent human action to an optimization problem seems to miss what thinking is really about.
    • So it's not that optimization isn't real; it's that it's not a good way for humans to think about themselves. I guess this is most evident in the area of reproduction, the most basic form of optimization, implemented by evolution. With humans, reproductive success is obviously important, but people who pursue it directly via some kind of rational process are considered kind of creepy, and often don't do very well at it. It's the kind of goal that can only be achieved indirectly (see anti-purpose).
    • Even evolution isn't quite the grim Molochian engine it is sometimes portrayed as. If it were just optimizing something, you'd think it would produce oceans of E. coli or something similar, rather than the wild profusion of forms it actually comes up with. Which, again, is not to say that optimization theory doesn't have a place, but it doesn't seem to have quite as fundamental a place as Rationalists assume.
    • ‘[Definitions] of intelligence used throughout the cognitive sciences converge towards the idea that “Intelligence measures an agent’s ability to achieve goals in a wide range of environments”,’ write Anna Salamon and Luke Muehlhauser in Intelligence Explosion: Evidence and Import, a research paper published by the Machine Intelligence Research Institute. ‘We might call this the “optimisation power” concept of intelligence, for it measures an agent’s power to optimise the world according to its preferences across many domains.’
      • Tom Chivers. The AI Does Not Hate You: Superintelligence, Rationality and the Race to Save the World
    • I Prefer World Optimisation
    • Chaos Magic Is A Permission Field Amplifier - by Gordon
      • If I have a personal Abyss, it is optimisation. And I might still be stuck in it. To wit: optimisation should not be the core pursuit of your life. That is, as Bayo Akomolafe says, “how capitalism dreams.” So your life would be better (it would improve) if you cease to make optimisation a core pursuit. Which is itself an optimisation of your life. See?
      • Here, there be dragons: The Man and his Apple Vision Pro • Writings – Bayo Akomolafe
        • It needs to be repeated: Whiteness is optimized an-exposure.
        • Optimization is a ruling out. A fencing in. A managerial logistics concerned with interventions that reduce the raw, growling surfaces of the material to ergonomic interfaces. Screens, screens everywhere! A representational grid that locks out the surprising indeterminacy of things and proliferates a stable, safe archive of encounters. A taming of experience.
        • An optimized world is finished, complete, and defragged. In such a world, exposure is mere reflection – a duplication of images, a masturbatory reinforcement of our own selves. Ironically, with the Apple Vision Pro strapped to our heads, we risk losing vision. We risk slicing off the tentacularity of vision for its more optimized, an-exposed counterpart. The ‘vision’ that springs out of this corporate, megalithic, archetypal matrix of agencies and algorithms is a political distillation of what white modernity seeks to do. What the Man wants to do: to reduce the world to representation.