This episode includes an Overtime segment that’s available to paid subscribers. If you’re not a paid subscriber, we of course encourage you to become one. If you are a paid subscriber, you can access the complete conversation either via the audio and video on this post or by setting up the paid-subscriber podcast feed, which includes all exclusive audio content. To set that up, simply grab the RSS feed from this page (by first clicking the “...” icon on the audio player) and paste it into your favorite podcast app.
If you have trouble completing the process, check out this super-simple how-to guide.
0:00 Steven’s years at OpenAI
1:59 An insider's view of the AI boom
11:33 The real meaning of “Feel the AGI”
16:21 AI safety’s rationalist kernel
24:13 What’s really driving the AI arms race(s)?
29:27 Is stifling China’s AI development smart?
44:05 Dario Amodei’s geopolitical naiveté
56:10 Heading to Overtime
Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Steven Adler (https://stevenadler.substack.com). Recorded May 21, 2025.
Twitter: https://twitter.com/NonzeroPods
Steven’s newsletter: https://stevenadler.substack.com
Overtime titles:
0:00 Unpacking OpenAI’s restructuring
5:07 OpenAI's mission—then and now
9:33 The stubborn logic of sycophantic AI
12:42 What a near-term AI catastrophe could look like
15:43 Is the agentic AI era here? Or even near?
Overtime video: