
BAAI/bge-en-icl

An LLM-based embedding model with in-context learning capabilities that achieves state-of-the-art (SOTA) performance on the BEIR and AIR-Bench benchmarks. It leverages few-shot examples to enhance task performance.

Public · $0.010 / Mtoken · 8,192-token context length
Project · Paper · License

Input

  • inputs — one or more texts to embed
  • normalize — whether to normalize the computed embeddings

Output

One embedding vector per input text.
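The request body carries these parameters as JSON. The snippet below is a hypothetical sketch: the endpoint URL and auth token are placeholders that depend on the hosting provider, while `inputs` and `normalize` mirror the parameters above.

```python
# Hypothetical sketch: endpoint URL and token are placeholders; only the
# "inputs" and "normalize" fields mirror the documented parameters above.
import requests

API_URL = "https://example.com/v1/inference/BAAI/bge-en-icl"  # placeholder

resp = requests.post(
    API_URL,
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},  # placeholder token
    json={
        "inputs": ["how much protein should a female eat"],  # texts to embed
        "normalize": True,  # return unit-length vectors
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```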

BGE-EN-ICL

A large language model-based embedding model that supports in-context learning for enhanced task adaptation. Key features:

  • In-context learning with few-shot examples
  • SOTA performance on BEIR and AIR-Bench benchmarks
  • Flexible usage through FlagEmbedding or HuggingFace Transformers (both sketched below)
  • Supports both zero-shot and few-shot scenarios
  • 7.11B parameters in FP32 precision
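
To make the few-shot workflow concrete, here is a sketch using FlagEmbedding's FlagICLModel, following the usage pattern documented for this model; argument names reflect the FlagEmbedding library and may shift between versions.

```python
# Few-shot (in-context) retrieval sketch via FlagEmbedding's FlagICLModel,
# following the documented usage pattern; argument names may vary by version.
from FlagEmbedding import FlagICLModel

# Task demonstrations the model conditions on while encoding queries.
examples = [{
    "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
    "query": "what is a virtual interface",
    "response": "A virtual interface is a software-defined abstraction that "
                "mimics the behavior of a physical network interface.",
}]

model = FlagICLModel(
    "BAAI/bge-en-icl",
    query_instruction_for_retrieval="Given a web search query, retrieve relevant passages that answer the query.",
    examples_for_task=examples,  # set to None for zero-shot usage
    use_fp16=True,
)

query_embeddings = model.encode_queries(["how much protein should a female eat"])
corpus_embeddings = model.encode_corpus([
    "As a general guideline, the CDC's average protein requirement for women "
    "ages 19 to 70 is 46 grams per day.",
])
print(query_embeddings @ corpus_embeddings.T)  # similarity scores
```

Setting `examples_for_task=None` covers the zero-shot scenario with the same code path.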

For implementation details and usage examples, visit our GitHub repository.
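
For raw HuggingFace Transformers usage, the minimal zero-shot sketch below assumes the last-token pooling commonly used by LLM-based embedders; consult the repository for the exact prompt and few-shot example format the model expects.

```python
# Zero-shot embedding sketch with HuggingFace Transformers. Assumes last-token
# pooling (typical for LLM-based embedders); check the repo for the exact
# prompt format. Add torch_dtype/device placement as your hardware allows.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-en-icl")
model = AutoModel.from_pretrained("BAAI/bge-en-icl")
model.eval()

def last_token_pool(hidden_states, attention_mask):
    # Return the hidden state of each sequence's final non-padding token,
    # handling both left- and right-padded batches.
    if attention_mask[:, -1].all():  # left padding: last column is real tokens
        return hidden_states[:, -1]
    lengths = attention_mask.sum(dim=1) - 1
    return hidden_states[torch.arange(hidden_states.size(0)), lengths]

texts = [
    "how much protein should a female eat",
    "The CDC's average protein requirement for women ages 19 to 70 is 46 g/day.",
]
batch = tokenizer(texts, padding=True, truncation=True,
                  max_length=8192, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state
embeddings = F.normalize(last_token_pool(hidden, batch["attention_mask"]), dim=1)
print(embeddings[0] @ embeddings[1])  # cosine similarity
```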