Registration
Approval required: your registration is subject to host approval.
About Event

vLLM Inference Meetup in Kochi

We are excited to invite you to the vLLM Inference Meetup in Kochi, hosted by Red Hat, NxtGen, Libreminds, and Jain University.

This meetup brings together vLLM users, developers, maintainers, and engineers to explore the latest in optimised inference. Expect in-depth technical talks, practical demonstrations, and ample time to connect with the community.

What to Expect

  • Technical insights

  • Networking with industry experts

  • Hands-on learning & demos (GPUs provided by NxtGen)

What to Bring

  • Your laptop with an SSH client installed (GPU instances provided by the organizers)

  • A government‑issued photo ID for venue security

  • Curiosity for tech insights and demos!

Agenda

  • Opening & Strategic Context: Why inference matters

  • vLLM Platform Overview: High-Performance Inference as a Competitive Advantage

  • llm-d at Scale: Distributed Inference for Cost, Throughput, and Resilience

  • Intelligent Routing: Optimizing Workload Efficiency with Semantic Decisioning

  • Codex integration with vLLM

Break

  • Hands-on Lab: vLLM Inference with GPU

Agenda is subject to change; we may add extra demos or lightning updates.

Registration closes 24 hours before the event. We cannot admit unregistered attendees.

Please bring a photo ID to verify your registration on arrival.

If you are building, deploying, or scaling inference, this is the room to be in. See you in Kochi!
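Want a head start on the hands-on lab? vLLM's server exposes an OpenAI-compatible HTTP API, so you can query it with just the Python standard library. The sketch below is illustrative only: the endpoint URL and model name are placeholders, and the actual host, port, and model for the lab instances will be shared by the organizers on the day.

```python
import json
import urllib.request

# Placeholder endpoint; the lab's real host/port will be provided at the event.
VLLM_URL = "http://localhost:8000/v1/chat/completions"


def build_chat_request(model, prompt, max_tokens=64):
    """Build an OpenAI-compatible chat payload, as accepted by `vllm serve`."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def ask(model, prompt):
    """POST a chat request to the vLLM server and return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        VLLM_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a running server, e.g.: vllm serve Qwen/Qwen2.5-0.5B-Instruct
    # (model name here is an example, not the lab's actual model)
    print(ask("Qwen/Qwen2.5-0.5B-Instruct", "What is PagedAttention?"))
```

The network call is guarded behind `__main__`, so you can import and inspect the payload builder without a server; once your GPU instance is up, point `VLLM_URL` at it and run the script directly.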

Location
Jain University Kochi
Knowledge Park, Nirmal Infopark, Infopark P.O., Infopark Campus, Kakkanad, Kochi, Kerala 682042, India