

vLLM Inference Meetup · Knoxville & Oak Ridge
Deep technical sessions. Live demos. Real conversations.
Join Red Hat AI, the UT College of Emerging and Collaborative Studies, and the AI community of Knoxville and Oak Ridge for an evening of technical depth:
Hear directly from vLLM maintainers and committers
See live demos of real inference workflows
Connect with students, professionals, researchers, and practitioners pushing the state of the art
Meetup Agenda
4:00–4:30 PM — Doors Open, Check-In
4:30–4:40 PM — Welcome and Opening Remarks
Dr. Aysegul Cuhadar, Associate Professor, UT CECS, and Dr. Monica Ihli, Principal AI Engineer
4:40–5:10 PM — Intro to Open Source AI, vLLM, and vLLM Project Update
Michael Goin, vLLM Maintainer and Principal Engineer, Red Hat AI
5:10–6:10 PM — Additional Talks from the East TN Community
More Details to Come!
6:10–6:30 PM — Community Discussion and Q&A
All Speakers and Community
6:30–7:30 PM — Meet the Speakers and Networking (Pizza and Drinks Provided)
Who Should Come
vLLM users and contributors
ML and infra engineers working on inference and serving
Platform teams running GenAI in production
Students, practitioners, and anyone curious about efficient inference
Before You Arrive
Registration closes 24 hours before the event
Unregistered attendees cannot be admitted, so please register in advance
See you in Knoxville and Oak Ridge! The inference conversation starts here.