![Cover Image for Teraflops, Gigawatts, and You [public]](https://images.lumacdn.com/cdn-cgi/image/format=auto,fit=cover,dpr=2,background=white,quality=75,width=400,height=400/event-covers/nu/657dff84-188b-4b6c-8615-efcca7f823f4.jpg)
Teraflops, Gigawatts, and You [public]
AI companies are expected to spend over $500 billion on chips and data centers in 2026—roughly what the US spent in 35 years building the entire interstate highway system, adjusted for inflation.
Justin Lebar (core contributor to Triton, LLVM member, CppCon speaker) will start from the specs of a single GPU and work his way up to the worldwide buildout—connecting dollars, teraflops, liters of water, and gigawatts along the way.
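
To preview the kind of bottom-up arithmetic the talk is built around, here is a minimal back-of-envelope sketch in Python. Every input is an illustrative assumption of ours (an H100-class GPU at roughly $30,000 and 700 W, a 1.5x facility overhead), not a figure from the talk:

```python
# Back-of-envelope: from one GPU's spec sheet to the worldwide buildout.
# Every input here is an illustrative assumption, not a figure from the talk.

SPEND_USD = 500e9        # the headline 2026 spend estimate
GPU_PRICE_USD = 30_000   # assumed price of one H100-class accelerator
GPU_POWER_W = 700        # assumed board power of one H100-class accelerator
OVERHEAD = 1.5           # assumed multiplier for cooling, networking, facility losses

# Pretend every dollar buys GPUs (it doesn't -- real capex also covers buildings,
# land, and networking -- so treat this as a loose upper bound on GPU count).
gpus = SPEND_USD / GPU_PRICE_USD
power_gw = gpus * GPU_POWER_W * OVERHEAD / 1e9

print(f"~{gpus / 1e6:.1f} million GPUs")  # ~16.7 million GPUs
print(f"~{power_gw:.1f} GW of demand")    # ~17.5 GW, more than a dozen large power plants
```

Even this crude estimate lands in the tens of gigawatts, which is why electricity shows up alongside dollars in the talk's framing.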
After this talk you'll be able to:
- Contextualize "$500 billion in AI spending" and know how big that actually is
- Develop an informed opinion on data center electricity, water, and climate impact
- Name the supply chain players that matter (TSMC, Samsung, ASML) and why
- Understand why chip geopolitics keeps making the news
No AI background needed. 45–60 min + Q&A.