

Istanbul | Claude + Local LLMs: Building Hybrid AI Workflows for Regulated Industries
One of the biggest barriers to adopting powerful AI models in the enterprise is compliance. GDPR, Türkiye's KVKK, and sector-specific data regulations make it hard to simply hand sensitive data to a cloud API.
This meetup is about a practical answer to that problem: hybrid AI pipelines, where local LLMs handle sensitive data on-device, and Claude handles the heavy reasoning on anonymized output. You get compliance and state-of-the-art intelligence, without choosing between them.
This time we are meeting at Istanbul Esports and Gaming Center, which gives us access to high-end GPUs and NVIDIA DGX infrastructure. A big thank you to them for making that possible. We will be running live demos on the actual hardware, so you will see these workflows in action, not just on slides.
What we'll cover:
The legal perspective. Attorney Elif Eryılmaz on what KVKK actually requires, where the lines are, and how local LLM architectures hold up under scrutiny.
PII masking in practice. Said Sürücü walks through his workflow: using Claude Code to fine-tune small local models that anonymize sensitive data, with real results from the field.
Real-world local LLM deployments on NVIDIA DGX infrastructure. Alican Kiraz walks through projects he's built and what actually works in production.
Fine-tuning local models for Turkish. Mehmet Ali Bayram on performance gains and the practical tradeoffs of running your own models.
Who this is for:
This is a small, focused session. Priority registration goes to professionals at organizations actively evaluating local LLM deployment due to regulatory constraints, including legal, healthcare, finance, and the public sector. If you've been sitting on the fence because of compliance concerns, this one is for you.
Capacity is limited. Register early.