166 Went

AI Hallucination

Hosted by Pramod Pawar & Mukta Aphale
Pune, Maharashtra
Registration
Registration Closed
This event is not currently taking registrations. You may contact the host or subscribe to receive updates.
About Event

LLMs generate factually incorrect outputs (hallucinations) that look perfectly plausible. Detecting them requires analysing semantic uncertainty at the level of meaning rather than at the token level, a capability most observability platforms lack.
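To make the meaning-level idea concrete, here is a minimal sketch of one common approach to semantic-uncertainty scoring: sample several answers to the same prompt, cluster them by meaning, and compute the entropy over those meaning clusters (low entropy = consistent meaning, high entropy = likely hallucination). The `same_meaning` check below is a naive exact-match stand-in; in practice a bidirectional entailment (NLI) model plays that role. All function names here are illustrative, not from any particular platform.

```python
import math

def semantic_entropy(answers, same_meaning):
    """Entropy over meaning clusters of sampled answers."""
    # Greedily cluster answers into meaning-equivalence classes.
    clusters = []
    for a in answers:
        for cluster in clusters:
            if same_meaning(a, cluster[0]):
                cluster.append(a)
                break
        else:
            clusters.append([a])
    # Shannon entropy of the cluster distribution (natural log).
    # Many distinct meanings -> high entropy -> likely hallucination.
    n = len(answers)
    return -sum((len(c) / n) * math.log(len(c) / n) for c in clusters)

# Naive stand-in for an entailment model: case-insensitive match.
def same_meaning(a, b):
    return a.strip().lower() == b.strip().lower()

consistent = ["Paris", "paris", "Paris", "Paris"]
scattered = ["Paris", "Lyon", "Marseille", "Nice"]
print(semantic_entropy(consistent, same_meaning))  # 0.0
print(semantic_entropy(scattered, same_meaning))   # ln(4) ≈ 1.386
```

Note that this operates on whole answers after generation, which is why token-level log-probabilities alone cannot capture it: two token sequences with very different probabilities can still carry the same meaning.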

*Kindly note: This event has limited entry for students. If you are a student, you will be admitted on a first-come, first-served basis only. (Apologies in advance!)

10:00-10:30: Introduction

10:30-11:00: Karan Shingde, ML Engineer @AiHello

11:00-11:30: Jagminder Sehrawat, Co-Founder @Famli

11:30-12:00: Shubham Mhaske, AI Engineer @Genzeon

12:00-12:30: Round-table discussion, with all attendees sharing their knowledge, challenges, and solutions around hallucination detection

12:30 onwards: Open Networking & Refreshments

Location
Please register to see the exact location of this event.
Pune, Maharashtra