Presented by
AI zeitgeist
Hosted By
8 Going

Book 4 → If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (Eliezer Yudkowsky & Nate Soares)

Google Meet
Registration
Welcome! To join the event, please register below.
About Event

Theme: AI alignment / control

The book presents a stark, parable-driven case that creating superhuman AI without absolute safety guarantees will lead to human extinction, a warning the authors frame as non-negotiable.

The authors, original figures in the AI safety field, argue urgently that this may be the most important book of our time, painting a world in which we are speeding toward catastrophe unless we change course.

See more at this page.
