Mixture of Experts (MoE)

Mixture of Experts is a way to build very large AI models without making them proportionally slower. Instead of one dense network that uses all of its parameters for every input, an MoE layer contains many specialist "expert" sub-networks plus a small router (a gating network) that picks which experts to use for each input, typically just the top one or two by routing score. This means the model can hold a lot of knowledge (because total capacity grows with the number of experts) while staying fast (because only a few experts actually run per input).
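The routing idea can be sketched in a few lines of NumPy. This is a minimal illustration, not any particular framework's implementation: the experts are toy linear maps, the router is a single weight matrix, and all names and sizes are made up for the example. The key point is that only the top-k scored experts are evaluated for a given token; the rest are skipped entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_model, top_k = 4, 8, 2

# Toy "experts": each is just a linear map for illustration.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
# Router: a linear layer that scores each expert for a given token.
router_w = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route one token vector x through its top-k experts, combined by softmax weight."""
    scores = x @ router_w                       # one routing score per expert
    top = np.argsort(scores)[-top_k:]           # indices of the k best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                    # softmax over the selected experts only
    # Only the selected experts compute anything; the others cost nothing.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=d_model)
out = moe_forward(token)
```

With 4 experts and top_k = 2, each token pays the compute cost of 2 experts while the model as a whole stores the parameters of all 4; scaling up the expert count grows capacity without growing per-token compute.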