Presented by
AI zeitgeist
An online book club where we’ll read books on AI politics, economics, history, science, biology, philosophy, concerns, and the future (see website)
Hosted By
52 Went

Book 4 → If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (Eliezer Yudkowsky & Nate Soares)

Google Meet
Past Event
About Event

Theme: AI alignment / control

The book presents a stark, argument-by-parable case that creating superhuman AI without absolute safety guarantees will lead to human extinction, a warning the authors frame as non-negotiable.

The authors, OG figures in AI safety, urgently argue that this may be the most important book of our time, painting a world that is speeding toward catastrophe unless we change course.

Who's it for?

Anyone, anywhere can join! You’re welcome to join one, a few, or all seven book events. Ideally, you’ll have read and thought through the book, but the minimum is a few hours of “vibe reading” — enough to get a feel for it, with the intention to finish later.

See more at this page.
