
    BGE-reranker-v2-m3

    A multilingual cross-encoder reranking model from BAAI with under 600 million parameters, delivering strong accuracy when reranking retrieved documents in RAG pipelines.


    About this tool

    Overview

    BGE-reranker-v2-m3 is an open-source multilingual reranking model from the Beijing Academy of Artificial Intelligence (BAAI). It uses a transformer-based cross-encoder architecture designed specifically for reranking tasks.

    Features

    • Under 600 million parameters for efficient deployment
    • Cross-encoder architecture for accurate relevance scoring
    • Broad multilingual support
    • Open-source under Apache 2.0 license
    • Can run on consumer GPUs
    • Direct similarity output without requiring embeddings

    Technical Details

    • Full attention over the query–document pair for maximum accuracy
    • Takes a question and a candidate document together as input
    • Outputs a relevance score directly
    • More accurate than bi-encoders (embedding models)
    • Slower than embedding models (a deliberate trade-off for accuracy)

    Use Cases

    • Re-ranking top-k documents from embedding models
    • Improving RAG system accuracy
    • Two-stage retrieval pipelines
    • Question answering systems
    • Document relevance scoring
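The two-stage pattern above can be sketched in plain Python. This is a toy illustration only: the scoring functions below are simple bag-of-words stand-ins invented for this example, not the actual models. In practice, stage 1 would use an embedding model and stage 2 a cross-encoder such as BGE-reranker-v2-m3.

```python
from math import sqrt

def embed(text):
    """Stage-1 'embedding': a bag-of-words term-count vector (toy stand-in)."""
    vec = {}
    for tok in text.lower().split():
        vec[tok] = vec.get(tok, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cross_score(query, doc):
    """Stage-2 'cross-encoder': scores the pair jointly (toy overlap ratio)."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def two_stage_search(query, docs, k=3):
    # Stage 1: cheap retrieval of top-k candidates by embedding similarity.
    qv = embed(query)
    candidates = sorted(docs, key=lambda d: cosine(qv, embed(d)), reverse=True)[:k]
    # Stage 2: rerank only the k candidates with the (expensive) pair scorer.
    return sorted(candidates, key=lambda d: cross_score(query, d), reverse=True)

docs = [
    "rerankers score query document pairs directly",
    "embedding models map text to vectors",
    "cats sleep most of the day",
    "cross encoders rerank retrieved documents",
]
print(two_stage_search("rerank retrieved documents", docs, k=2)[0])
# → "cross encoders rerank retrieved documents"
```

The key design point is that the expensive pair scorer only ever sees k candidates, not the whole corpus, which is why cross-encoder rerankers remain practical despite being slower per comparison than embedding lookups.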

    Performance

    The model achieves strong Mean Reciprocal Rank (MRR) scores, often approaching the results of top commercial rerankers while remaining open-source and self-hostable.
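For reference, MRR is computed as follows: for each query, take the reciprocal of the rank at which the first relevant document appears, then average across all queries. A minimal implementation:

```python
def mrr(ranked_lists, relevant):
    """Mean Reciprocal Rank.

    ranked_lists: one ranked list of doc ids per query.
    relevant: one set of relevant doc ids per query, aligned by index.
    """
    total = 0.0
    for ranking, rel in zip(ranked_lists, relevant):
        for rank, doc_id in enumerate(ranking, start=1):
            if doc_id in rel:
                total += 1.0 / rank
                break  # only the first relevant hit counts
    return total / len(ranked_lists)

# Two queries: first relevant doc found at rank 1 and rank 2 respectively.
score = mrr([["d1", "d2"], ["d3", "d1"]], [{"d1"}, {"d1"}])
print(score)  # → (1/1 + 1/2) / 2 = 0.75
```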


    Information

    Website: huggingface.co
    Published: Mar 10, 2026

    Categories

    Machine Learning Models

    Tags

    #Re-Ranking #Multilingual #Open-Source

    Similar Products

    Qwen3 Embedding

    Multilingual embedding model supporting over 100 languages and ranking #1 on MTEB multilingual leaderboard. Offers flexible model sizes from 0.6B to 8B parameters with user-defined instructions.

    Nomic Embed Text

    First fully reproducible open-source text embedding model with an 8,192-token context length. v2 introduces a Mixture-of-Experts architecture for multilingual embeddings and outperforms OpenAI models on benchmarks. Released as open source under the Apache 2.0 license.

    jina-embeddings-v5

    Jina AI's latest embedding model achieving the highest multilingual performance among models under 1B parameters with 71.7 average MTEB score and 67.7 MMTEB score.

    Nomic Embed Text v2

    Open-source multilingual embedding model using Mixture-of-Experts architecture, achieving excellent semantic performance with efficient inference and full offline support.

    GTE Embeddings

    General Text Embeddings from Alibaba DAMO Academy trained on large-scale relevance pairs. Available in three sizes (large, base, small) with GTE-v1.5 supporting 8192 context length.

    E5 Embeddings

    Open-source text embedding models from Microsoft supporting 100+ languages. Features small, base, and large variants with weakly-supervised contrastive pre-training. This is an OSS model family released by Microsoft Research.

    All product names, logos, and brands are the property of their respective owners and are used for identification purposes only; their use does not imply endorsement, affiliation, or sponsorship. This directory may include content generated by artificial intelligence.
    Copyright © 2025 Awesome Vector Databases. All rights reserved.