This episode includes an Overtime segment that’s available to paid subscribers. If you’re not a paid subscriber, we of course encourage you to become one. If you are a paid subscriber, you can access the complete conversation either via the audio and video on this post or via the paid-subscriber podcast feed, which includes all exclusive audio content. To set that up, grab the RSS feed from this page (by first clicking the “...” icon on the audio player) and paste it into your favorite podcast app.
If you have trouble completing the process, check out this super-simple how-to guide.
0:00 Holly’s newsletter and her work with PauseAI
3:21 The state of AI safety
10:30 Why Holly started PauseAI US
14:31 Is AI acceleration… accelerating?
23:23 What rationalists got wrong about the singularity
29:19 Mechanize Inc. and the danger of AI safety sellouts
39:46 Holly: Here’s why AI should alarm you
48:01 Heading to Overtime
Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Holly Elmore (PauseAI US). Recorded April 29, 2025.
Twitter: https://twitter.com/NonzeroPods
Holly’s newsletter:
Overtime titles:
0:00 GPT-4o sycophancy, e/acc zealotry
6:05 Why are the cool kids sneering at international cooperation?
28:33 The road(s) to AI takeover
36:37 What’s really behind big tech’s China hawkism?
41:15 The Dwarkesh question