Presented by AI zeitgeist
43 Going

Book 4 → If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (Eliezer Yudkowsky & Nate Soares)

Google Meet
Registration
Welcome! To join the event, please register below.
About Event

Theme: AI alignment / control

The book presents a stark, argument-by-parable case that creating a superhuman AI without absolute safety guarantees will lead to human extinction, a warning the authors frame as non-negotiable.

The authors, the OG figures in AI safety, urgently argue that this may be the most important book of our time, painting a world that is speeding toward catastrophe unless we change course.

Who's it for?

Anyone, anywhere can join! You’re welcome to join one, a few, or all seven book events. Ideally, you’ll have read and thought through the book, but the minimum is a few hours of “vibe reading” — enough to get a feel for it, with the intention to finish later.

See more at this page.
