BAAI/bge-en-icl

An LLM-based embedding model with in-context learning capabilities that achieves SOTA performance on BEIR and AIR-Bench. It leverages few-shot examples to enhance task performance.

Public · $0.010 / Mtoken · 8,192 context
Project · Paper · License

Input

service_tier
The service tier used for processing the request. When set to 'priority', the request will be processed with higher priority.

inputs
The list of input texts to embed; multiple items may be provided per request.

normalize
Whether to normalize the computed embeddings.

dimensions
The number of dimensions in the embedding. If not provided, the model's default is used; if a value larger than the model's default is provided, the embedding is padded with zeros. (Default: empty, 32 ≤ dimensions ≤ 8192)
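
Taken together, these parameters map onto a single embedding request. The sketch below assumes an OpenAI-style HTTPS endpoint; the URL, auth header, response shape, and the `service_tier` field name are assumptions, so consult the provider's API reference for the authoritative details.

```python
import requests

API_URL = "https://api.example.com/v1/embeddings"  # hypothetical endpoint; substitute the provider's real URL
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "BAAI/bge-en-icl",
    "inputs": [                      # list of texts to embed; add more items as needed
        "what is in-context learning?",
        "BGE-EN-ICL is an LLM-based embedding model.",
    ],
    "normalize": True,               # return L2-normalized vectors
    "dimensions": 4096,              # optional; padded with zeros if larger than the model default
    "service_tier": "priority",      # assumed field name; 'priority' requests higher-priority processing
}

resp = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=60,
)
resp.raise_for_status()
embeddings = resp.json()["embeddings"]  # assumed response shape: one vector per input text
print(len(embeddings), len(embeddings[0]))
```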

Output

An array of embedding vectors, one per input text.

BGE-EN-ICL

A large language model-based embedding model that supports in-context learning for enhanced task adaptation. Key features:

  • In-context learning with few-shot examples
  • SOTA performance on BEIR and AIR-Bench benchmarks
  • Flexible usage through FlagEmbedding or HuggingFace Transformers (see the sketch below)
  • Supports both zero-shot and few-shot scenarios
  • 7.11B parameters with F32 precision

For implementation details and usage examples, visit our GitHub repository.
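
As a quick orientation, here is a minimal sketch of few-shot usage through the FlagEmbedding library's FlagICLModel, following the pattern in the project's documentation; argument names and defaults may vary across library versions, so treat it as illustrative rather than definitive.

```python
from FlagEmbedding import FlagICLModel

# Few-shot demonstrations of the retrieval task. Pass examples_for_task=None
# below to run the same model in a zero-shot configuration.
examples = [
    {
        "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
        "query": "what is a virtual interface",
        "response": "A virtual interface is a software-defined abstraction that "
                    "mimics the behavior of a physical network interface.",
    },
]

model = FlagICLModel(
    "BAAI/bge-en-icl",
    query_instruction_for_retrieval="Given a web search query, retrieve relevant passages that answer the query.",
    examples_for_task=examples,
    use_fp16=True,  # halves memory use at a small cost in precision
)

queries = ["how do virtual network interfaces work?"]
documents = ["A virtual NIC lets one physical adapter appear as several logical devices."]

q_emb = model.encode_queries(queries)   # instruction and examples are prepended to each query
d_emb = model.encode_corpus(documents)  # corpus texts are encoded without the examples

# With normalized embeddings (the library default), the dot product is cosine similarity.
scores = q_emb @ d_emb.T
print(scores)
```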
