Presented by
BlueDot Impact
We’re building the workforce needed to safely navigate AGI. Contact: [email protected]

Inside Goodfire: Building safer AI systems with interpretability

Zoom
About Event

How can we build safer AI systems with interpretability?

Join Goodfire and BlueDot Impact for a look at how Goodfire is working to make AI models something you can understand, debug, and deliberately design.

Most of the AI industry treats models as opaque systems — training them at scale and evaluating them by their outputs. Goodfire is taking a different approach. By studying how models organise knowledge internally, they're building tools that let developers inspect and steer what AI models learn.

Founded in mid-2024, Goodfire has brought together researchers from interpretability teams at Google DeepMind and OpenAI, raised $207M from Menlo Ventures, Lightspeed, and Anthropic, and shipped Ember — a hosted API that lets developers work with a model's internal features directly.
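To give a rough sense of what "working with a model's internal features" means, here is a toy Python sketch in the spirit of sparse-autoencoder feature steering, the research direction Goodfire works in. It is purely illustrative: the dictionary is random, all names and shapes are made up for the example, and none of this reflects the actual Ember API.

    import numpy as np

    # Toy illustration of feature-based inspection and steering.
    # All shapes and names are hypothetical; this is NOT the Ember API.

    rng = np.random.default_rng(0)
    d_model, n_features = 16, 64

    # A random "feature dictionary": each row is a direction in activation
    # space that, in a real system, would correspond to a learned concept.
    decoder = rng.standard_normal((n_features, d_model))
    decoder /= np.linalg.norm(decoder, axis=1, keepdims=True)

    def encode(activation):
        """Project an activation onto the dictionary, keeping only
        positive (active) features, mimicking a ReLU sparse code."""
        return np.maximum(decoder @ activation, 0.0)

    def decode(features):
        """Reconstruct an activation from feature coefficients."""
        return decoder.T @ features

    # Pretend this activation came from a forward pass of the model.
    activation = rng.standard_normal(d_model)

    # Inspect: which features fire most strongly here?
    codes = encode(activation)
    top = np.argsort(codes)[::-1][:3]
    print("top features:", top, "strengths:", codes[top].round(2))

    # Steer: amplify one feature, then reconstruct the edited activation,
    # which would be fed back into the rest of the model.
    codes[top[0]] *= 3.0
    steered = decode(codes)
    print("activation shifted by:", np.linalg.norm(steered - activation).round(2))

In a real interpretability tool, the dictionary would be learned from a model's activations rather than sampled at random, so that amplifying or suppressing a feature changes the model's downstream behaviour in an interpretable way.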

We'll be having a conversation with Dan Balsam, co-founder & CTO of Goodfire.

He leads Goodfire's engineering and technical direction and is also a graduate of BlueDot's AI Alignment course.

Whether you’re an aspiring researcher, engineer, or policy professional, you’ll gain a deeper understanding of:

  • What Goodfire works on and why it matters

  • How Goodfire bridges its research and commercial products

  • Upcoming career opportunities and what Goodfire looks for in candidates


Part of our "Inside X" series: giving you exclusive access to the organisations building the future of beneficial AI.

Goodfire is also hiring: https://www.goodfire.ai/careers
