

MoE: Mixture of Experts explained with Daria Soboleva, Head Research Scientist @ Cerebras
South Park Commons is delighted to host Daria Soboleva as she explains MoE: Mixture of Experts.
Event Details
4:30pm: Doors Open
5:00-6:00pm: Lecture + Discussion
Daria Soboleva is a Head Research Scientist at Cerebras with nearly a decade of AI/ML experience. She designed MoE training recipes from the ground up for Cerebras hardware and now leads training runs at unprecedented scale across hardware, software, and ML teams. She distills this hands-on experience into The MoE 101 Guide.
Her focus is building efficient, scalable AI systems end to end, spanning data, models, and infrastructure. Her work includes SlimPajama, a 627B-token dataset with over 1M downloads, and BTLM, a 3B-parameter model that achieves 7B-class quality with 3× less inference compute.
Previously, Daria built ML systems used by millions. At Yandex, she invented the YATI model that powers Yandex Search. At Google, she focused on improving Google Assistant's ASR and built models deployed in Google Captions and Gboard.
About South Park Commons (SPC)
SPC is a technical community and venture fund dedicated to helping founders, researchers, and technologists figure out what to work on next—what we call the -1 to 0 stage of your career. We believe this is best accomplished in the most talent-dense community possible.
If you are exploring what's next, we encourage you to apply to SPC.
*Please note doors close at event start time and we are unable to accommodate late arrivals.