

From Internal Metrics to Customer-Facing Analytics: Building a Scalable Usage Insights Platform with Delta Sharing
For SaaS platforms, understanding how users interact with the product is essential for driving engagement, improving features, and delivering value. But what if those same usage metrics could also be shared back to the customers themselves, helping them understand and drive adoption across their own teams?
At DataGalaxy, we designed a scalable, self-service analytics experience that gives our customers real-time visibility into how their teams are using our platform. The twist? Instead of embedding dashboards powered by yet another internal data store, we expose a Delta Sharing feed, enriched and aggregated from our centralized usage pipeline, and connect it directly to embedded Superset dashboards.
By turning usage data into a shareable, secure product layer, we're redefining what embedded analytics can look like for SaaS — without compromising control, performance, or maintainability.
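To give a feel for how lightweight the consumer side of this pattern is, here is a minimal sketch of reading a shared usage table with the open-source delta-sharing Python client. The profile path, share, schema, and table names are hypothetical placeholders, not DataGalaxy's actual feed.

```python
# A Delta Sharing consumer addresses a table as:
#   <profile-file>#<share>.<schema>.<table>
# The profile file (issued by the provider) holds the endpoint and token.
profile = "config.share"  # hypothetical profile file path
table_url = f"{profile}#usage_share.analytics.team_activity"  # hypothetical names

# With the `delta-sharing` package installed and a valid profile,
# a recipient could pull the enriched usage metrics into pandas:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(table_url)

print(table_url)
```

The same table URL can be pointed at by a BI tool such as Superset, so the provider maintains one governed feed rather than a separate data store per consumer.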
Speaker
Alexandre Bergère, Head of Data & AI Engineering and Partnerships at @DataGalaxy, and freelance Data Architect at @Datalex
Bio: Alexandre Bergère is a data and cloud engineering leader specializing in modern data platforms, AI-driven SaaS architectures, and data governance. Based in Paris, he combines deep technical expertise with strategic and entrepreneurial leadership to help organizations navigate the complexity of large-scale data and cloud ecosystems.
He leads large-scale cloud and data initiatives as Head of Data & AI Engineering at DataGalaxy, where he drives platform architecture, AI integration, and strategic partnerships across the data ecosystem. Co-Founder and CTO of DataRedKite (now part of DataGalaxy), he previously built and scaled an observability platform for the modern data stack. Deeply committed to education, Alexandre regularly teaches modern data architectures, cloud computing, and data engineering at leading institutions.
Host
Robert Pack, Staff Developer Advocate, Databricks
Bio: Robert Pack is a data and AI systems expert with foundations in process systems engineering and numerical optimization, holding degrees from RWTH Aachen University and Imperial College London. He began his career in process R&D within a multinational chemical company, where he designed and led transformative cloud and data initiatives that shaped the organization's digital landscape. As Chief Technology Lead, he was responsible for global data and AI platform engineering.
Robert’s open-source journey began with deep contributions to Delta Lake and Delta-RS and later expanded into the broader Rust and Apache Arrow data ecosystems. Today, as part of Databricks, he serves as a Developer Advocate dedicated to fostering a modern, high-performance data ecosystem for all. Through his work on projects such as Delta Kernel, Delta Lake, Unity Catalog, and Apache Iceberg, Robert remains committed to advancing the Open Lakehouse vision and facilitating the next generation of data infrastructure.