BAAI

BGE Reranker v2 M3

Multilingual reranker. 100+ languages. 1.1GB.

0.568B parameters · xlm-roberta · MIT license · 8K context · 1.58 GB VRAM

About This Model

The BGE Reranker v2 M3, developed by BAAI, is a model designed for text reranking, implemented as a cross-encoder with a text-classification head. With 568 million parameters, it builds on the xlm-roberta architecture to score query-document pairs efficiently, making it well suited to improving the relevance and ordering of search results and document rankings. Its context length of 8,192 tokens lets it handle longer documents and more complex queries, a significant advantage in scenarios where context is crucial.

Despite its modest size, the BGE Reranker v2 M3 punches above its weight, offering reranking quality that rivals larger models while remaining efficient. This makes it an excellent choice for users who need a powerful yet lightweight solution for text reranking. The model is distributed in FP16, requiring only about 1.6 GB of VRAM, so it can be deployed on a wide range of hardware, including laptops and mid-range desktops. Ideal use cases include enhancing search engines, improving document retrieval systems, and refining content recommendation pipelines. Users looking for a balance between quality and resource efficiency will find this model particularly useful.
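As a rough sketch of what reranking with this model looks like in practice, the snippet below scores and reorders candidate passages using BAAI's FlagEmbedding package. The model ID, example texts, and the use_fp16 flag are illustrative assumptions based on the public package and hub listing, not a prescribed setup.

from FlagEmbedding import FlagReranker

# Load the reranker in half precision (roughly the 1.6 GB VRAM figure quoted above).
reranker = FlagReranker("BAAI/bge-reranker-v2-m3", use_fp16=True)

query = "how do rerankers improve search results?"
candidates = [
    "A reranker scores each query-passage pair and reorders the initial retrieval results.",
    "The giant panda is a bear species endemic to China.",
]

# compute_score returns one relevance logit per (query, passage) pair; higher means more relevant.
scores = reranker.compute_score([[query, doc] for doc in candidates])
for doc, score in sorted(zip(candidates, scores), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {doc}")

The FlagEmbedding documentation also describes a normalize=True option for compute_score that maps the raw logits through a sigmoid into 0-1 relevance scores, which is convenient when thresholding results.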

Quantization Options

Quantization   Bits   File Size   VRAM Needed   RAM Needed   Quality
FP16           16     1.08 GB     1.58 GB       2.08 GB      98%

Frequently Asked Questions

How much VRAM do I need to run BGE Reranker v2 M3?

BGE Reranker v2 M3 requires 1.58 GB of VRAM with FP16, which is both the smallest and the highest-precision variant listed here, so the minimum and full-precision requirements are the same.
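For readers loading the model directly with Hugging Face transformers rather than a dedicated serving stack, the sketch below shows what the FP16 figure corresponds to. The torch_dtype argument, pair format, and CUDA placement are assumptions for illustration, following the style of the public model card.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Loading the weights in float16 keeps the footprint near the ~1.1 GB file size;
# total usage lands around the 1.58 GB VRAM figure once activations are included.
tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-reranker-v2-m3")
model = AutoModelForSequenceClassification.from_pretrained(
    "BAAI/bge-reranker-v2-m3", torch_dtype=torch.float16
).to("cuda").eval()

pairs = [["what is a reranker?", "A reranker scores query-passage pairs for relevance."]]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True,
                       max_length=8192, return_tensors="pt").to("cuda")
    scores = model(**inputs).logits.view(-1).float()
print(scores)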

What is the best quantization for BGE Reranker v2 M3?

FP16 is the only quantization listed for BGE Reranker v2 M3. At roughly 1.6 GB of VRAM it already fits on most discrete GPUs and many laptops, so lower-bit variants are not needed for typical deployments.