Presented by
Pathway
Private Event

Inside a Neolab: BDH and the Future of AI beyond the Transformer

Past Event
About Event

The Wall Street Journal declared 2026 the year post-Transformer architectures will dominate AI, naming Pathway alongside Yann LeCun, Ilya Sutskever, and Fei-Fei Li as the ones who will make it happen.

Stanford ACM is offering a rare chance to hear directly from the neolab building what comes next.

Join us for an intimate seminar with Jan Chorowski, CTO & Co-founder of the Palo Alto-based neolab Pathway, as he presents BDH (Dragon Hatchling), a new large post-transformer architecture inspired by scale-free biological networks.

Memory and the ability to learn on the fly are the biggest limitations facing current transformer-based AI models. Pathway is a post-transformer neolab that has solved that problem, delivering a faster path to AGI through true continuous learning and long-horizon reasoning.

In this talk, Jan will walk through:

  • Why Transformers hit a wall, and what post-transformer architectures unlock

  • How BDH works: attention emerging from local, graph-based neuron interactions rather than centralized matrix multiplications

  • Mechanistic interpretability made real: large, sparse, non-negative latent spaces that map directly to neuron populations and show emergent monosemanticity

  • What continuous learning and long-horizon reasoning look like when built from the ground up
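Jan will cover the real mechanism in the talk; as a rough intuition only, the "attention from local, graph-based neuron interactions" idea can be caricatured in a few lines of Python. Everything below (the ring graph, the scoring rule, the random weights) is a made-up toy for illustration, not Pathway's BDH code: each neuron scores and mixes only its graph neighbors' activity, and a ReLU keeps the latent state sparse and non-negative.

```python
import random

random.seed(0)

def relu(x):
    return x if x > 0.0 else 0.0

def local_attention_step(state, adjacency, weights):
    """One update in which neuron i mixes only its neighbors' activity.

    There is no global attention matrix: scores exist only along the
    edges of the graph, so the computation is local by construction.
    """
    new_state = []
    for i, neighbors in enumerate(adjacency):
        # Attention scores restricted to the local neighborhood.
        scores = [state[i] * state[j] for j in neighbors]
        total = sum(scores) or 1.0  # avoid division by zero for silent neurons
        mixed = sum((s / total) * state[j] for s, j in zip(scores, neighbors))
        # ReLU keeps the latent state non-negative, and zeros out
        # negatively-driven neurons, giving a sparse activation pattern.
        new_state.append(relu(weights[i] * mixed))
    return new_state

# A small ring graph: neuron i only sees neurons i-1 and i+1.
n = 6
adjacency = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
weights = [random.uniform(-1.0, 1.0) for _ in range(n)]
state = [random.uniform(0.0, 1.0) for _ in range(n)]

state = local_attention_step(state, adjacency, weights)
print(state)
```

The non-negative, sparse latent state is what the interpretability bullet refers to: when activations are constrained this way, individual neuron populations become easier to read off directly.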

About the Speaker

Jan Chorowski is one of the most quietly influential figures in modern AI. A pioneer in neural sequence modeling, he was the first researcher to apply attention mechanisms to speech recognition. His work, including direct collaborations with Geoffrey Hinton and Yoshua Bengio, has accumulated 13,000+ citations and helped define the field of sequence-to-sequence learning.

Before co-founding Pathway, Jan held research and engineering roles at Google Brain and MILA.

Pathway is backed by leading investors and advisors, including Lukasz Kaiser, co-author of the original Transformer paper.

Format

🎤 Talk: 30-45 minutes
💬 Live Q&A: bring your hardest questions
🍕 Food included, because, well... we love food.

This talk is open to all Stanford Faculty Members, Postdoctoral Fellows, students, Faculty Affiliates, and researchers at affiliated institutions.

📄 Associated paper: arxiv.org/abs/2509.26507

Seats are limited. RSVP now.

Location
Please register to see the exact location of this event.