
    MTEB Leaderboard

    Massive Text Embedding Benchmark leaderboard covering 58 datasets across 112 languages and 8 embedding tasks. Industry-standard benchmark for comparing text embedding models.


    About this tool

    Overview

    MTEB (Massive Text Embedding Benchmark) measures the performance of text embedding models on diverse embedding tasks. It spans 8 embedding tasks covering a total of 58 datasets and 112 languages.
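All of these tasks ultimately compare embedding vectors, most commonly via cosine similarity. As a minimal illustration of the core operation, here is a pure-Python sketch; the three-dimensional vectors are toy stand-ins for real model output (real embeddings typically have hundreds or thousands of dimensions), and the variable names are illustrative, not part of any MTEB API:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for real model output.
emb_query = [0.1, 0.9, 0.2]
emb_close = [0.2, 0.8, 0.3]  # semantically similar text
emb_far   = [0.9, 0.1, 0.0]  # unrelated text

print(cosine_similarity(emb_query, emb_close) > cosine_similarity(emb_query, emb_far))  # True
```

A good embedding model is one whose vectors make similar texts score high and dissimilar texts score low under exactly this kind of comparison, which is what the benchmark's tasks measure at scale.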

    Expansion: MMTEB

    MMTEB (Massive Multilingual Text Embedding Benchmark) expands MTEB to over 500 quality-controlled evaluation tasks across 1,000+ languages, making it the most comprehensive multilingual embedding benchmark.

    Current Top Models (2025-2026)

    gte-Qwen3-8B

    • The largest model in a family of new embedding models built on Qwen3
    • Outperforms the previous generation of Qwen embedding models
    • Ranks high on both the multilingual and English-only MTEB leaderboards

    NV-Embed-v2 (NVIDIA)

    • Released October 2025
    • Fine-tuned from Llama-3.1-8B
    • Particularly powerful at understanding multilingual text
    • Previous version (NV-Embed) achieved a score of 69.32 on MTEB (56 embedding tasks)

    Benchmark Tasks

    MTEB covers 8 main embedding tasks:

    • Classification
    • Clustering
    • Pair classification
    • Reranking
    • Retrieval
    • Semantic Textual Similarity (STS)
    • Summarization
    • Bitext mining
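To make the retrieval task concrete: documents are ranked by embedding similarity to a query, and ranking metrics such as reciprocal rank score the result. The sketch below uses toy vectors in place of real model output; the function names and data are illustrative assumptions, not part of the MTEB codebase:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def reciprocal_rank(query_emb, doc_embs, relevant_idx):
    """Rank documents by similarity to the query and return the
    reciprocal rank of the relevant document (1.0 = ranked first)."""
    ranking = sorted(range(len(doc_embs)),
                     key=lambda i: cosine(query_emb, doc_embs[i]),
                     reverse=True)
    return 1.0 / (ranking.index(relevant_idx) + 1)

# Toy embeddings standing in for real model output.
query = [0.1, 0.9, 0.2]
docs = [
    [0.9, 0.1, 0.0],  # off-topic document
    [0.2, 0.8, 0.3],  # relevant document
    [0.5, 0.5, 0.5],  # neutral document
]
print(reciprocal_rank(query, docs, relevant_idx=1))  # → 1.0
```

Retrieval benchmarks average such per-query scores (typically nDCG@10 rather than reciprocal rank) over large corpora; the other tasks follow the same pattern of scoring embedding-based predictions against gold labels.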

    How to Access

    The official MTEB leaderboard is available at: https://huggingface.co/spaces/mteb/leaderboard

    Users can select different benchmarks:

    • Multilingual leaderboard
    • English-only leaderboard
    • Domain-specific benchmarks

    Research Impact

    MTEB has become the de facto standard for:

    • Comparing embedding model performance
    • Selecting appropriate models for specific tasks
    • Tracking progress in text embedding research
    • Validating new embedding approaches

    Information

    Website: huggingface.co
    Published: Mar 8, 2026

    Categories

    Benchmarks & Evaluation

    Tags

    #Benchmark #Embeddings #Evaluation

    Similar Products

    MTEB: Massive Text Embedding Benchmark

    A massive text embedding benchmark for evaluating the quality of text embedding models, crucial for vector database applications.

    SISAP Indexing Challenge

    An annual competition focused on similarity search and indexing algorithms, including approximate nearest neighbor methods and high-dimensional vector indexing, providing benchmarks and results relevant to vector database research.

    VectorDBBench

    The open‑source repository containing the implementation, configuration, and scripts of VectorDBBench, enabling users to run standardized benchmarks across multiple vector database systems locally or in CI.

    BEIR

    BEIR (Benchmarking IR) is a benchmark suite for evaluating information retrieval and vector search systems across multiple tasks and datasets. Useful for comparing vector database performance.

    ANN-Benchmarks

    ANN-Benchmarks is a benchmarking platform specifically for evaluating the performance of approximate nearest neighbor (ANN) search algorithms, which are foundational to vector database evaluation and comparison.

    IntelLabs's Vector Search Datasets

    A collection of datasets curated by Intel Labs specifically for evaluating and benchmarking vector search algorithms and databases.

    Copyright © 2025 Awesome Vector Databases. All rights reserved.