Unlocking Mixture of Experts: From One Know-It-All to a Group of Jedi Masters
07-10, 14:35–15:05 (Europe/Prague), Terrace 2B

Answer this: in critical domains like healthcare, would you prefer a jack-of-all-trades or one Yoda, the master?

Join me on an exhilarating journey as we delve deep into the Mixture of Experts (MoE) technique, a practical and intuitive next step for elevating the predictive power of generalised know-it-all models.

A powerful approach to a variety of ML tasks, MoE operates on the principle of divide and conquer, with some less obvious trade-offs and limitations. You'll get a captivating exploration of insights, intuitive reasoning, solid mathematical underpinnings, and a treasure trove of interesting examples!
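To make the divide-and-conquer idea concrete before the talk: in its simplest form, a gating network assigns each input a probability over a handful of expert models, and the prediction is the gate-weighted mixture of the experts' outputs. The NumPy sketch below is illustrative only (all names, shapes, and the linear experts are my assumptions, not the talk's material):

    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(z):
        # Numerically stable softmax over the last axis.
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    # Three toy "expert" linear models, each meant to specialise
    # on a different region of the input space (hypothetical setup).
    n_experts, d_in, d_out = 3, 4, 1
    expert_weights = rng.normal(size=(n_experts, d_in, d_out))

    # The gating network scores how relevant each expert is per input.
    gate_weights = rng.normal(size=(d_in, n_experts))

    def moe_forward(x):
        """Soft mixture: weight every expert's output by the gate's probabilities."""
        gate_probs = softmax(x @ gate_weights)                     # (batch, n_experts)
        expert_outs = np.einsum("bi,eio->beo", x, expert_weights)  # (batch, n_experts, d_out)
        return np.einsum("be,beo->bo", gate_probs, expert_outs)   # (batch, d_out)

    x = rng.normal(size=(5, d_in))
    print(moe_forward(x).shape)  # (5, 1)

Large MoE models typically replace this soft mixture with sparse top-k routing, activating only the best-scoring experts per input, which is where the cost, latency, and throughput discussion later in the talk comes in.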

We'll kick off by surveying the landscape, from ensemble models to stacked estimators, gradually ascending towards the pinnacle of MoE. Along the way, we'll explore challenges, alternative routes, and the crucial art of knowing when to wield the MoE magic, AND when to hold back. Brace yourselves for a business-oriented finale, where we discuss metrics around cost, latency, and throughput for MoE models. And fear not! We'll wrap up with an array of resources equipping you to dive headfirst into pre-trained MoE models, fine-tune them, or even forge your own from scratch. May the Force of Experts be with you!


Expected audience expertise

Intermediate

I am on a quest to solve problems with data using ML and AI. Over the years I have built multiple industry-first solutions for diverse domains like Marketing, Supply Chain, and Polymers and Chemicals. I have consulted for numerous multinational companies, helping them build in-house capabilities (via corporate training), providing project-based support, and developing state-of-the-art solutions for their customers. I love sharing my experiences and giving back to the community I have learned so much from. This will be my 3rd talk at EuroPython!