![Cover Image for [Virtual Event] Production Monitoring for Agents](https://images.lumacdn.com/cdn-cgi/image/format=auto,fit=cover,dpr=2,background=white,quality=75,width=400,height=400/event-covers/gj/0ede8a0e-45bd-4677-ae94-9786c9014c80.png)
# [Virtual Event] Production Monitoring for Agents
Join Harrison Chase, Co-founder and CEO at LangChain, for a technical deep dive into why production monitoring for AI agents requires a new approach to observability.
When you ship traditional software to production, you usually have a good sense of what to expect. Users click buttons, fill out forms, and navigate predictable paths. Your test suite can cover most code paths, and monitoring tools track the familiar signals: error rates, response times, and database queries. When something breaks, you check the logs and stack traces.
Agents operate differently. They accept natural language input, where the space of possible queries is unbounded. They are powered by LLMs that are sensitive to subtle changes in prompts and can produce different outputs for the same input. They also make decisions across multi-step workflows, tool calls, and retrieval operations that are difficult to fully anticipate during development.
In this session, we’ll explore:
- Why production monitoring for agents is fundamentally different from traditional observability, and why understanding what your agent is actually doing requires visibility into the interactions themselves.
- What teams need to monitor in production, including prompt-response pairs, multi-turn context, and the full trajectory of an agent across tool calls, intermediate steps, and outputs.
- Why traces become the foundation for debugging, identifying failure patterns, and understanding how agents behave once they are exposed to real users.
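To make the idea of a trace concrete: the sketch below shows, using only the Python standard library, what a trace of one agent run might capture. The `Span` and `run_agent` names are hypothetical illustrations for this event description, not the LangSmith API; each span records one step (an LLM call, a tool call) with its inputs, outputs, and timing, nested under the run.

```python
# Minimal, hypothetical sketch of an agent trace: nested spans recording
# each step's inputs, outputs, and timing. Not a real monitoring API.
import time
from dataclasses import dataclass, field

@dataclass
class Span:
    name: str                 # e.g. "llm_call" or "tool:search"
    inputs: dict              # prompt or tool arguments
    outputs: dict = field(default_factory=dict)
    children: list = field(default_factory=list)
    start: float = field(default_factory=time.time)
    end: float = 0.0

    def finish(self, **outputs):
        """Record the step's outputs and when it completed."""
        self.outputs = outputs
        self.end = time.time()

def run_agent(question: str) -> Span:
    """Run a stubbed agent and return the root span of its trace."""
    root = Span("agent_run", {"question": question})

    llm = Span("llm_call", {"prompt": question})
    llm.finish(decision="use_search_tool")      # stubbed model decision
    root.children.append(llm)

    tool = Span("tool:search", {"query": question})
    tool.finish(result="LangChain is an agent framework")  # stubbed tool result
    root.children.append(tool)

    root.finish(answer=tool.outputs["result"])
    return root

trace = run_agent("What is LangChain?")
print([child.name for child in trace.children])  # the run's full trajectory
```

Because every intermediate step is captured rather than just the final answer, a trace like this lets you see where in a multi-step workflow an agent went wrong, which is the visibility the session argues traditional logs don't provide.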
Agenda:
- 11:00am: Presentation by Harrison
- 11:30am: Q&A session