

From APIs to Warehouses: AI-Assisted Data Ingestion with dlt
This hands-on workshop focuses on building reliable data ingestion pipelines to data warehouses (for example, Snowflake) using dlt (data load tool), enhanced with LLMs, the dlt dashboard, and dlt MCP.
You’ll work through the key building blocks of a production-ready ingestion setup, including:
Extracting data from APIs, files, and databases
Normalizing data into consistent schemas
Writing data to a data warehouse (e.g., Snowflake)
Using LLMs to accelerate dlt pipeline development
Validating data and schema changes using the dlt dashboard and dlt MCP
The session is fully practical and code-driven. By the end of the workshop, you’ll understand how to design maintainable, scalable ingestion pipelines and use AI and validation tools to build them faster and with confidence.
About the Speaker
Aashish Nair is a Data Engineer at dltHub and the creator of the popular dlt deployment course, where he teaches best practices for running dlt pipelines in production.
DataTalks.Club is the place to talk about data. Join our Slack community!
This event is sponsored by dltHub