



A multilingual embedding model supporting over 100 languages, ranked #1 on the MTEB multilingual leaderboard. It offers flexible model sizes from 0.6B to 8B parameters and supports user-defined instructions.
Built on the Qwen3 foundation models, the Qwen3 Embedding series marks a significant advance in text embedding and reranking over its predecessor, the GTE-Qwen series.
Open-sourced under the Apache 2.0 license.
Available on Hugging Face, ModelScope, and GitHub, with a published technical report and code.
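In retrieval use, an embedding model like this maps queries and documents to vectors that are then compared, typically by cosine similarity. A minimal sketch with toy vectors; the commented-out model call is an assumption about the sentence-transformers API and Hugging Face model id, which may differ:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice the vectors would come from the model, e.g. (assumed API):
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")
#   query_vec, doc_vec = model.encode(["a query", "a document"])
# Toy vectors stand in for real embeddings here:
query_vec = [0.2, 0.1, 0.9]
doc_vec = [0.25, 0.05, 0.85]
print(round(cosine_similarity(query_vec, doc_vec), 3))
```

A score near 1.0 indicates the two texts are semantically close; rankers sort candidate documents by this score against the query embedding.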