
    Nomic Embed Text v2

Open-source multilingual embedding model built on a Mixture-of-Experts architecture, combining strong semantic performance with efficient inference and full offline support.

    About this tool

    Overview

Nomic Embed Text v2 is an open-source, multilingual embedding model built on a Mixture-of-Experts (MoE) architecture. Because only a subset of expert parameters is active for each token, it delivers strong semantic performance while keeping inference cost below that of a comparably sized dense model.
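
To make that efficiency claim concrete, the sketch below shows generic top-k expert routing, the mechanism at the heart of MoE layers. It is a toy illustration, not Nomic's actual implementation: a small gating network scores the experts for each token, and only the k highest-scoring experts do any work.

```python
# Toy top-k Mixture-of-Experts routing (generic illustration, not Nomic's code).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, k = 8, 4, 2

experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]  # expert weights
gate = rng.normal(size=(d_model, n_experts))                               # router weights

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector through its top-k experts only."""
    logits = x @ gate
    topk = np.argsort(logits)[-k:]                                # k best experts
    weights = np.exp(logits[topk]) / np.exp(logits[topk]).sum()   # softmax over chosen experts
    return sum(w * (x @ experts[i]) for w, i in zip(weights, topk))

token = rng.normal(size=d_model)
print(moe_forward(token))  # only 2 of the 4 experts ran for this token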

    Features

    • Open-source and fully auditable
    • Mixture-of-Experts (MoE) architecture
    • Multilingual support
    • Can run completely offline via Ollama (see the sketch after this list)
    • Efficient inference on consumer hardware
    • No external API dependencies required
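
As a concrete illustration of the offline path, here is a minimal sketch that requests an embedding from a locally running Ollama server over its HTTP API. The model tag "nomic-embed-text-v2" is an assumption; run "ollama list" to check what is actually installed (Ollama publishes the earlier model as "nomic-embed-text").

```python
# Minimal sketch: embedding text with a locally running Ollama server.
# Assumes Ollama is installed and the model has been pulled, e.g.:
#   ollama pull nomic-embed-text-v2   # hypothetical tag; verify with `ollama list`
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # Ollama's default local endpoint
MODEL = "nomic-embed-text-v2"                         # assumption: the real tag may differ

def embed(text: str) -> list[float]:
    """Return an embedding vector for `text` from the local Ollama server."""
    resp = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": text})
    resp.raise_for_status()
    return resp.json()["embedding"]

if __name__ == "__main__":
    vec = embed("MoE embedding models route tokens to expert subnetworks.")
    print(len(vec), vec[:5])  # dimensionality and first few components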

    Deployment Options

    • Cloud API access through Nomic
    • Self-hosted deployment (sketched after this list)
    • Offline execution with Ollama
    • Integration with popular frameworks
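
For self-hosting, a minimal sketch using the sentence-transformers library follows. The Hugging Face model id "nomic-ai/nomic-embed-text-v2-moe" and the "search_query: " / "search_document: " task prefixes are assumptions carried over from earlier Nomic models; check the model card for the exact id and prompt convention.

```python
# Minimal self-hosted sketch using sentence-transformers (pip install sentence-transformers).
# Assumptions: the model id and the task prefixes below; verify both on the model card.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer(
    "nomic-ai/nomic-embed-text-v2-moe",  # assumed Hugging Face id
    trust_remote_code=True,              # the MoE architecture ships as custom code
)

docs = [
    "search_document: Ollama runs models entirely on local hardware.",
    "search_document: MoE layers route each token to a few expert subnetworks.",
]
query = "search_query: how do I run embeddings offline?"

doc_vecs = model.encode(docs, normalize_embeddings=True)
query_vec = model.encode(query, normalize_embeddings=True)
print(doc_vecs @ query_vec)  # cosine similarities, since vectors are unit-normalized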

    Use Cases

    • Privacy-sensitive applications
    • Offline semantic search (sketched after this list)
    • Edge deployment scenarios
    • Research and development
    • Cost-conscious production deployments
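
Offline semantic search, referenced above, reduces to embedding locally and ranking by cosine similarity; no text or vectors leave the machine, which is what makes the privacy-sensitive and edge scenarios work. A minimal sketch, again assuming the hypothetical Ollama tag from the earlier example:

```python
# Offline semantic search sketch: embed locally via Ollama, rank by cosine similarity.
# Assumption: the Ollama model tag "nomic-embed-text-v2" (see the earlier sketch).
import numpy as np
import requests

def embed(text: str) -> np.ndarray:
    """Embed `text` with the local Ollama server; nothing leaves the machine."""
    resp = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text-v2", "prompt": text},
    )
    resp.raise_for_status()
    return np.array(resp.json()["embedding"])

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

documents = [
    "Edge devices run inference without a network connection.",
    "The pricing page lists hosted API tiers.",
    "Multilingual retrieval matches queries across languages.",
]
doc_vecs = [embed(d) for d in documents]
query_vec = embed("search without internet access")

# Rank documents by similarity to the query, highest first.
for doc, vec in sorted(zip(documents, doc_vecs),
                       key=lambda dv: cosine(query_vec, dv[1]),
                       reverse=True):
    print(f"{cosine(query_vec, vec):.3f}  {doc}")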

    Pricing

    Free and open source for self-hosting; hosted API access through Nomic is available at competitive pricing.

    Information

    Website: www.nomic.ai
    Published: Mar 10, 2026

    Categories

    • Machine Learning Models

    Tags

    • #Embeddings
    • #Multilingual
    • #Open Source

    Similar Products

    Qwen3 Embedding

    Multilingual embedding model supporting over 100 languages and ranking #1 on the MTEB multilingual leaderboard. Offers flexible model sizes from 0.6B to 8B parameters with user-defined instructions.

    jina-embeddings-v5

    Jina AI's latest embedding model, achieving the highest multilingual performance among models under 1B parameters, with a 71.7 average MTEB score and a 67.7 MMTEB score.

    GTE Embeddings

    General Text Embeddings from Alibaba DAMO Academy, trained on large-scale relevance pairs. Available in three sizes (large, base, small), with GTE-v1.5 supporting an 8,192-token context length.

    voyage-3-large

    State-of-the-art general-purpose and multilingual embedding model from Voyage AI that ranks first across eight domains spanning 100 datasets, outperforming OpenAI and Cohere models by significant margins.

    Nomic Embed Text

    The first fully reproducible open-source text embedding model, with an 8,192-token context length; v2 introduces a Mixture-of-Experts architecture for multilingual embeddings. Outperforms OpenAI models on benchmarks and is released under the Apache 2.0 license.

    BGE-reranker-v2-m3

    Multilingual cross-encoder reranking model from BAAI with under 600M parameters, achieving excellent performance in reranking retrieved documents for improved RAG accuracy.
