Nemotron 3: NVIDIA goes wide open on big-context, mixture-of-experts model

The chipmaker opened its Transformer-Mamba MoE model, plus a trove of training data, in hopes it will "help strengthen an open ecosystem" for AI.

Phillip de Wet | Dec 15, 2025 | 2 min read

Image credit: https://unsplash.com/@jogara_0