Presented by
BGA Events
Welcome to BGA Events Page! Join us for our upcoming events:

BGA Workshop : Agentic AI - Protect your keys from agents

Google Meet
About Event

As AI coding agents become the new standard for Web3 development, a massive, unaddressed security blind spot has opened up on our own laptops. Join us for a 30–45 minute workshop with Daniel Tamas, founder of the agentic gaming company WAM, to learn how to secure your local environment and protect your critical crypto keys from AI exposure using the open-source tool Cloak.

The Problem

The Context Window Leak: Every coding agent (Cursor, Claude Code, Copilot) scans your working directory to build context. This means your .env files, containing live private keys, wallet mnemonics, and payment secrets, are read and sent over the wire to AI providers on almost every prompt. No exploit required; it's just how these tools work.

The .gitignore Illusion: Developers falsely believe .gitignore keeps their secrets safe. It keeps .env out of your git repository, but it does absolutely nothing to stop a local AI process from running cat .env and reading your real credentials off the disk.

The Permanent Damage: In Web3, one leaked private key means irreversible loss of funds. Once a secret hits an external cache, a log file, or a fine-tuning pipeline on someone else's infrastructure, the window for damage is permanently open. Existing enterprise secret managers do not solve this local threat.
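The ".gitignore illusion" above can be demonstrated in a few lines. This is an illustrative sketch (filenames and the key value are made up): git ignoring a file only affects version control, while any local process with filesystem access, an AI agent included, still reads the real secret.

```python
# Sketch: .gitignore protects the repo, not the disk. A local process
# (an AI agent included) can still read .env directly.
import pathlib
import tempfile

workdir = pathlib.Path(tempfile.mkdtemp())
(workdir / ".gitignore").write_text(".env\n")             # "protected" by git
(workdir / ".env").write_text("PRIVATE_KEY=0xdeadbeef\n")  # illustrative value

# Any process with filesystem access reads the real secret anyway:
leaked = (workdir / ".env").read_text()
print("PRIVATE_KEY" in leaked)  # True — ignoring a file does not encrypt it
```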

Why this workshop matters for BGA Members

Secure Your Agentic Workflow: Learn how to safely accelerate your Web3 game development with AI assistants without exposing your real-money payment rails, smart contract admin keys, or database URLs.

Implement Zero-Trust Local Dev: Discover how to intercept, encrypt, and substitute your real credentials. You'll learn how to keep real secrets in a local, biometric-gated AES-256-GCM vault while feeding structurally valid "sandbox" fakes to the AI agents on your disk.

Maintain Zero-Friction Building: See how seamlessly this integrates into VS Code, Cursor, and Windsurf. You see and edit real credentials in your buffer, but the moment you save, the AI only sees fakes.

Make Your Agent the Defender: Learn how to drop a simple SKILL.md file into your project to make your AI actively security-aware, so it uses secure runtime injection (cloak run) instead of reading your .env directly.
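The "substitute" step described above can be sketched in a few lines. This is a minimal illustration, not Cloak's actual implementation: the vault here is a plain dict standing in for the encrypted store, and the fake is simply a random hex string of the same length as the real value, so it stays structurally valid for the agent while being worthless to an attacker.

```python
# Sketch of intercept-and-substitute: real KEY=value credentials go into
# a local "vault" (a dict here, standing in for Cloak's encrypted store)
# and same-shaped random fakes are emitted for AI agents to read instead.
import re
import secrets
import string

def substitute(env_text: str, vault: dict) -> str:
    out = []
    for line in env_text.splitlines():
        m = re.match(r"(\w+)=(.+)", line)
        if not m:
            out.append(line)          # comments / blanks pass through
            continue
        key, real = m.groups()
        vault[key] = real             # real credential stays local
        fake = "".join(secrets.choice(string.hexdigits.lower()) for _ in real)
        out.append(f"{key}={fake}")   # structurally valid stand-in
    return "\n".join(out)

vault = {}
sandboxed = substitute("RPC_KEY=a1b2c3d4e5f6", vault)
print(vault["RPC_KEY"])  # real value, never written back to disk
print(sandboxed)         # same shape, safe to expose to an agent
```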

Agenda

The Silent Threat: How AI Coding Agents Actually Read Your Files
Why Traditional Secret Managers and .gitignore Fail Local Devs
Live Demo: Intercept, Encrypt, and Substitute with Cloak
Agent Skills: Writing a SKILL.md to Make Your AI Security-Aware
Case Study: Securing WAM's Crypto Payment Rails and Wallets
And more...
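The runtime-injection pattern behind cloak run, as described above, can be sketched as follows. This is an assumed illustration of the general technique, not Cloak's code: secrets are resolved only at launch time and passed through the child process's environment, so no plaintext .env needs to exist on disk for agents to scan.

```python
# Sketch of secure runtime injection: secrets are added to the child
# process's environment at launch instead of living in a .env file.
# The `vault` dict stands in for a decrypted, locally gated secret store.
import os
import subprocess
import sys

def run_with_secrets(cmd: list, vault: dict) -> int:
    env = dict(os.environ)
    env.update(vault)  # injected for this one process only
    return subprocess.run(cmd, env=env).returncode

vault = {"PRIVATE_KEY": "0xfeedface"}  # illustrative; would come from the vault
code = run_with_secrets(
    [sys.executable, "-c", "import os; print(os.environ['PRIVATE_KEY'])"],
    vault,
)
```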

Open Q&A
