
    jina-embeddings-v5

Jina AI's latest embedding model, achieving the highest multilingual performance among models under 1B parameters: 71.7 average on MTEB English v2 and 67.7 on MMTEB.


    About this tool

    Overview

jina-embeddings-v5-text-small was released on February 18, 2026. With 677M parameters, it achieves state-of-the-art multilingual performance, the highest of any embedding model under 1B parameters.

    Features

    • 71.7 average score on MTEB English v2
    • 67.7 score on MMTEB (Multilingual)
    • 677M parameters for efficient deployment
    • API compatibility with OpenAI's text-embedding-3-large
    • Drop-in replacement for OpenAI embeddings
    • Open-source under Apache 2.0 license

    Performance

    • Highest multilingual performance under 1B parameters
    • Excellent semantic understanding across languages
    • Efficient inference on consumer hardware
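Semantic understanding across languages is typically measured by comparing embedding vectors with cosine similarity: related texts, even in different languages, should map to nearby vectors. A minimal sketch of that comparison step, using toy three-dimensional vectors in place of real model output (an actual call to the model would return much higher-dimensional embeddings):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for model output.
emb_cat = [0.9, 0.1, 0.0]       # "cat"
emb_chat = [0.85, 0.15, 0.05]   # French "chat" -- semantically close
emb_invoice = [0.0, 0.2, 0.95]  # unrelated concept

# Cross-lingual neighbors should score higher than unrelated pairs.
assert cosine_similarity(emb_cat, emb_chat) > cosine_similarity(emb_cat, emb_invoice)
```

In retrieval pipelines this same comparison ranks documents against a query embedding; the benchmark scores above aggregate performance over many such tasks.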

    API Compatibility

    The API endpoint matches OpenAI's JSON schemas, allowing users to easily replace OpenAI models with Jina embeddings without code changes.
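Because the schemas match, switching providers is mostly a configuration change. A minimal sketch, assuming an `https://api.jina.ai/v1` base URL and `jina-embeddings-v5` model name for illustration (verify the exact values in the official docs); the request body follows OpenAI's `/v1/embeddings` schema:

```python
import json

# Assumed endpoint and model name for illustration; check Jina AI's docs.
JINA_BASE_URL = "https://api.jina.ai/v1"
JINA_MODEL = "jina-embeddings-v5"

def build_embedding_request(model, texts):
    """Serialize an embeddings request using OpenAI's /v1/embeddings schema."""
    return json.dumps({"model": model, "input": texts})

body = build_embedding_request(
    JINA_MODEL, ["The quick brown fox", "Der schnelle braune Fuchs"]
)

# With the official OpenAI Python client, the swap is just a constructor change:
#   client = OpenAI(base_url=JINA_BASE_URL, api_key="<JINA_API_KEY>")
#   client.embeddings.create(model=JINA_MODEL, input=[...])
```

Since only the base URL and model name differ, existing code that parses OpenAI embedding responses continues to work unchanged.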

    Pricing

Generous free tier with millions of tokens included; competitive paid tiers for production use.


    Information

Website: huggingface.co
Published: Mar 10, 2026

    Categories

    Machine Learning Models

    Tags

#Embeddings #Multilingual #Open Source

    Similar Products

6 results
    Qwen3 Embedding
    Featured

    Multilingual embedding model supporting over 100 languages and ranking #1 on MTEB multilingual leaderboard. Offers flexible model sizes from 0.6B to 8B parameters with user-defined instructions.

    Nomic Embed Text v2

    Open-source multilingual embedding model using Mixture-of-Experts architecture, achieving excellent semantic performance with efficient inference and full offline support.

    GTE Embeddings

    General Text Embeddings from Alibaba DAMO Academy trained on large-scale relevance pairs. Available in three sizes (large, base, small) with GTE-v1.5 supporting 8192 context length.

    voyage-3-large
    Featured

    State-of-the-art general-purpose and multilingual embedding model from Voyage AI that ranks first across eight domains spanning 100 datasets, outperforming OpenAI and Cohere models by significant margins.

    Nomic Embed Text
    Featured

First fully reproducible open-source text embedding model with 8,192 context length. v2 introduces a Mixture-of-Experts architecture for multilingual embeddings. Outperforms OpenAI models on benchmarks. Open source under the Apache 2.0 license.

    BGE-reranker-v2-m3

    Multilingual cross-encoder reranking model from BAAI with under 600M parameters, achieving excellent performance in reranking retrieved documents for improved RAG accuracy.

    All product names, logos, and brands are the property of their respective owners. All company, product, and service names used in this repository, related repositories, and associated websites are for identification purposes only. The use of these names, logos, and brands does not imply endorsement, affiliation, or sponsorship. This directory may include content generated by artificial intelligence.
Copyright © 2025 Awesome Vector Databases. All rights reserved.