



Nomic Embed Text V2 is an open-source, multilingual embedding model that uses a Mixture-of-Experts (MoE) architecture, achieving strong semantic performance with efficient inference and full offline support.
Free and open-source for self-hosting; API access is also available at competitive pricing.