

How Do We Make AI Compute Efficient Without New Chips?
As AI and cloud workloads race ahead of data-center power budgets, computing is hitting energy and carbon ceilings. The opportunity is software-first efficiency: smarter algorithms, compilers, schedulers, and runtime orchestration that cut joules per task on today's fleets.
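To ground the pitch in something concrete: joules per task is already measurable with stock tooling, no new silicon required. Here is a minimal sketch, assuming an NVIDIA GPU with the pynvml bindings installed; the function name measure_joules and the 0.1 s sampling interval are illustrative choices, not a standard API.

```python
# Minimal sketch: estimate joules-per-task for a GPU workload by sampling
# NVML power draw in a background thread and integrating over wall-clock
# time. Assumes an NVIDIA GPU and the `pynvml` package (pip install pynvml).
import time
import threading
import pynvml

def measure_joules(task, interval_s=0.1, gpu_index=0):
    """Run task() and return (result, estimated energy in joules)."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    samples = []              # (timestamp, watts) pairs
    stop = threading.Event()

    def sampler():
        while not stop.is_set():
            milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
            samples.append((time.monotonic(), milliwatts / 1000.0))
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler, daemon=True)
    thread.start()
    result = task()           # the workload being measured
    stop.set()
    thread.join()
    pynvml.nvmlShutdown()

    # Trapezoidal integration of the power samples -> energy in joules.
    joules = sum(
        (t2 - t1) * (w1 + w2) / 2.0
        for (t1, w1), (t2, w2) in zip(samples, samples[1:])
    )
    return result, joules
```

Divide that figure by requests served, tokens generated, or batches trained, and you have a joules-per-task baseline to optimize against.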
In this 45-minute deep dive, we invite founders, investors, and infrastructure leaders to explore how to design and operate low-energy AI systems without waiting for new hardware. Together, we will:
Unpack the compute stack (algorithms, compilers, schedulers, and runtimes) and identify where software changes deliver the biggest energy wins; a carbon-aware scheduling sketch follows this list.
Compare architecture choices (serverless vs. containers vs. bare-metal) and how hybrid mixes optimize cost, latency, and carbon.
Debate the edge: renewable assets as “the new data centers.” Can batteries, EVs, grid controllers, smart meters, and robots reliably host AI workloads?
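As a taste of the scheduling discussion above: one widely applicable software-only win is carbon-aware scheduling, shifting deferrable jobs into low-carbon hours. The sketch below is illustrative; the forecast numbers are placeholders and pick_greenest_start is a hypothetical helper, assuming an hourly grid carbon-intensity forecast (e.g., from a grid operator or a service such as Electricity Maps) is available.

```python
# Minimal sketch of carbon-aware scheduling: given an hourly forecast of
# grid carbon intensity (gCO2 per kWh), pick the start hour for a
# deferrable job that minimizes average intensity within its deadline.

def pick_greenest_start(forecast_g_per_kwh, duration_h, deadline_h):
    """Return (start_hour, avg_gCO2_per_kWh) for the lowest-carbon window."""
    best_start, best_avg = None, float("inf")
    # Slide the job's duration window across every feasible start hour.
    for start in range(deadline_h - duration_h + 1):
        window = forecast_g_per_kwh[start:start + duration_h]
        avg = sum(window) / duration_h
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Example: a 3-hour training job that must finish within 12 hours.
# Placeholder forecast values, one per hour, in gCO2/kWh.
forecast = [420, 390, 350, 300, 220, 180, 170, 210, 320, 400, 450, 430]
start, avg = pick_greenest_start(forecast, duration_h=3, deadline_h=12)
print(f"Start at hour {start}; average intensity {avg:.0f} gCO2/kWh")
```

The same window-picking logic extends to placement: swap hours for sites (or edge assets with local renewables) and minimize over both dimensions.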
This session is designed as an open, interactive dialogue. We'll pressure-test assumptions, share insights, and collaborate across the founder–operator–investor stack on what it takes to make compute efficiency, and its carbon and energy impact, a first-class product and business outcome.