

AI/ML Workloads On Kubernetes
AI and machine learning workloads are increasingly becoming part of modern application platforms, raising important questions about how they are built, deployed, and operated. This session explores the intersection of AI/ML workloads and Kubernetes, focusing on where Kubernetes fits within the broader landscape of ML platforms and infrastructure choices.
Participants will gain a conceptual understanding of how containerization and orchestration relate to AI/ML workflows, of common architectural patterns, and of the challenges teams weigh when evaluating Kubernetes for training, inference, and data-intensive workloads. The session also touches on resource management, GPUs, scalability, and the operational considerations that influence platform decisions.
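As one concrete illustration of the resource-management theme, Kubernetes exposes GPUs as extended resources that containers must request explicitly. A minimal sketch, assuming the NVIDIA device plugin is installed so the cluster advertises the `nvidia.com/gpu` resource (the Pod name and image here are illustrative, not part of the session materials):

```yaml
# Sketch: a Pod requesting one NVIDIA GPU for an inference workload.
# Requires the NVIDIA device plugin; image and names are hypothetical.
apiVersion: v1
kind: Pod
metadata:
  name: inference-server
spec:
  containers:
    - name: model-server
      image: example.com/model-server:latest  # hypothetical image
      resources:
        limits:
          nvidia.com/gpu: 1  # GPUs are requested as whole units via limits
```

Because GPUs cannot be fractionally shared by default, scheduling and bin-packing decisions like this one are a recurring theme when running training and inference on Kubernetes.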