

LLM Labs: Model Behavior
Past Event
About Event
Hands-on experimentation with system instructions and model behavior
Join us for an interactive session where we'll put system prompts to the test. We'll compare the same base model with and without different system instructions, exploring how prompts shape tone, style, and responses across languages.
Bring your native language (any language welcome!) and help us discover how models behave differently when guided vs. unguided. We'll experiment with English, Japanese, German, Hindi, and whatever languages participants bring.
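The guided-vs-unguided comparison above boils down to sending the same user prompt twice, once with a system instruction and once without. A minimal sketch of that setup (the message format follows the common chat-API convention with "system" and "user" roles; the prompts are illustrative placeholders):

```python
def build_comparison(user_prompt, system_prompt):
    """Return (unguided, guided) message lists for one comparison run."""
    # Unguided: the user prompt alone, no system instruction.
    unguided = [{"role": "user", "content": user_prompt}]
    # Guided: identical user prompt, preceded by a system instruction.
    guided = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]
    return unguided, guided

# Example: probing tone in another language (any language welcome).
unguided, guided = build_comparison(
    user_prompt="Beschreibe das Wetter heute.",  # German: "Describe today's weather."
    system_prompt="Answer formally, in exactly two sentences.",
)
# Send each list to the same base model and compare tone, style, and length.
```

In the lab we'll run both variants against the same model and inspect the outputs side by side.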
No prerequisites needed.
Format: Interactive lab exercises with live comparisons
Bring: Your laptop
RSVP to receive detailed lab instructions and setup guide.