
Qwen3 Embedding
Multilingual embedding model supporting over 100 languages and ranking #1 on the MTEB multilingual leaderboard. Offers flexible model sizes from 0.6B to 8B parameters and user-defined task instructions.
About this tool
Overview
Built on the Qwen3 foundation models, the Qwen3 Embedding series is a significant advance over its predecessor, the GTE-Qwen series, in text embedding and reranking capabilities.
Key Features
- Support for over 100 languages including various programming languages
- Robust multilingual, cross-lingual, and code retrieval capabilities
- Full spectrum of sizes from 0.6B to 8B for both embedding and reranking models
- Flexible, user-selectable embedding dimensions
- User-defined instructions to enhance performance for specific tasks, languages, or scenarios
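The instruction feature above works by prepending a task description to the query before embedding; documents are typically embedded without an instruction. A minimal sketch of that formatting convention, with a plain cosine-similarity helper (the actual embedding call through the model is assumed to happen elsewhere, e.g. via sentence-transformers):

```python
import numpy as np

def format_query(task: str, query: str) -> str:
    # Instruction-aware query format used by instruction-tuned embedding
    # models such as Qwen3 Embedding: task description + raw query.
    return f"Instruct: {task}\nQuery: {query}"

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative task string; pick one matching your retrieval scenario.
task = "Given a web search query, retrieve relevant passages that answer the query"
print(format_query(task, "What is the capital of China?"))
```

Tailoring the task string to the language or scenario is what the "user-defined instructions" feature refers to: the same model can serve search, classification, or code retrieval by swapping the instruction.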
Performance
- The 8B embedding model ranks #1 on the MTEB multilingual leaderboard (as of June 5, 2025; score 70.58)
- Reranking model excels in various text retrieval scenarios
- Superior performance in multilingual and cross-lingual tasks
Model Variants
- Qwen3-Embedding-8B (text)
- Qwen3-VL-Embedding (multimodal - supports text, images, screenshots, and video)
- Multiple size options for different efficiency/effectiveness trade-offs
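The flexible-dimension feature mentioned above means the model's output vectors can be used at a reduced dimension to trade accuracy for storage and speed. A common client-side pattern for such models is truncate-and-renormalize; a sketch with illustrative dimension numbers (the specific supported dimensions depend on the model size):

```python
import numpy as np

def truncate_embedding(vec: np.ndarray, dim: int) -> np.ndarray:
    # Keep only the first `dim` components, then re-normalize to unit
    # length so cosine similarity stays meaningful at the smaller size.
    v = vec[:dim]
    return v / np.linalg.norm(v)

# Illustrative: a full-size embedding truncated to 512 dimensions.
full = np.random.default_rng(0).normal(size=4096)
small = truncate_embedding(full, 512)
```

Smaller vectors cut index size and search latency roughly in proportion to the dimension, at some cost in retrieval quality.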
License
Open-sourced under Apache 2.0 license
Availability
Available on Hugging Face, ModelScope, and GitHub with published technical report and code.
Information
Website: github.com
Published: Mar 8, 2026