"IABIED Book Review: Core Arguments and Counterarguments" by Stephen McAleese

LessWrong (Curated & Popular) • February 05, 2026 • Solo Episode


Description

The recent book “If Anyone Builds It, Everyone Dies” (September 2025) by Eliezer Yudkowsky and Nate Soares argues that creating superintelligent AI in the near future would almost certainly cause human extinction: “If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die.” The goal of this post is to sum...
