An LLM-based embedding model with in-context learning capabilities that achieves SOTA performance on BEIR and AIR-Bench. It leverages few-shot examples to enhance task performance.
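As a rough sketch of how the few-shot mechanism can be used, the Python below prepends a task instruction and a couple of demonstrations to the query before embedding it. The model id, prompt template, and the use of sentence-transformers here are assumptions for illustration, not the model's documented interface.

```python
from sentence_transformers import SentenceTransformer

# Hypothetical checkpoint id; substitute the model you are actually serving.
MODEL_NAME = "your-org/icl-embedding-model"

task = "Given a web search query, retrieve relevant passages that answer the query."

# Few-shot demonstrations the model conditions on (in-context learning).
examples = [
    ("what is the capital of france", "Paris is the capital and largest city of France."),
    ("who wrote hamlet", "Hamlet is a tragedy written by William Shakespeare."),
]

def build_query_prompt(query: str) -> str:
    """Prepend the task instruction and few-shot examples to the query."""
    demo_block = "\n".join(f"Query: {q}\nPassage: {p}" for q, p in examples)
    return f"Instruct: {task}\n{demo_block}\nQuery: {query}"

model = SentenceTransformer(MODEL_NAME)
query_emb = model.encode(build_query_prompt("how do vaccines work"), normalize_embeddings=True)
doc_emb = model.encode("Vaccines train the immune system to recognize pathogens.", normalize_embeddings=True)
print(float(query_emb @ doc_emb))  # cosine similarity, since both vectors are normalized
```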
Service tier: The service tier used for processing the request. When set to 'priority', the request is processed with higher priority.
Normalize: Whether to normalize the computed embeddings.
Dimensions: The number of dimensions in the embedding. If not provided, the model's default is used. If the requested value is larger than the model's default, the embedding is padded with zeros. (Default: empty; 32 ≤ dimensions ≤ 8192)
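A minimal request sketch exercising these parameters is shown below. The endpoint URL, field names, and response shape are assumptions; consult the provider's API reference for the exact schema.

```python
import requests

API_URL = "https://api.example.com/v1/embeddings"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "your-org/icl-embedding-model",  # hypothetical model id
    "input": ["how do vaccines work"],
    "dimensions": 1024,          # must satisfy 32 <= dimensions <= 8192; zero-padded if above the model default
    "normalize": True,           # return unit-length vectors
    "service_tier": "priority",  # process the request with higher priority
}

resp = requests.post(API_URL, json=payload, headers={"Authorization": f"Bearer {API_KEY}"})
resp.raise_for_status()
embedding = resp.json()["data"][0]["embedding"]
print(len(embedding))  # -> 1024
```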
A large language model-based embedding model that supports in-context learning for enhanced task adaptation. Key features: few-shot in-context learning for task adaptation and SOTA retrieval performance on BEIR and AIR-Bench.
For implementation details and usage examples, visit our GitHub repository.