BGE-large-en-v1.5
Model Overview¶
The BAAI/bge-large-en-v1.5 model is an English text embedding model developed by the Beijing Academy of Artificial Intelligence (BAAI). It maps any text to a dense vector embedding for tasks such as retrieval, classification, and semantic search, and it can be fine-tuned on your own data to improve performance on domain-specific tasks.
Model Inputs and Outputs¶
Inputs

- Text sequences of up to 512 tokens

Outputs

- 1024-dimensional dense vector embeddings
- Model Source: BAAI/bge-large-en-v1.5
- License: MIT License
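The input/output contract above can be exercised with a short encoding sketch. This is a minimal example assuming the Hugging Face `transformers` library and PyTorch are installed; the `embed` helper name is illustrative, not part of the model. BGE models take the CLS-token vector as the sentence embedding and L2-normalize it before computing cosine similarity:

```python
import torch
from transformers import AutoModel, AutoTokenizer

def embed(texts, tokenizer, model):
    """Return L2-normalized CLS-token embeddings for a list of texts."""
    enc = tokenizer(
        texts,
        padding=True,
        truncation=True,
        max_length=512,  # the model's maximum input length
        return_tensors="pt",
    )
    with torch.no_grad():
        out = model(**enc)
    cls = out.last_hidden_state[:, 0]  # CLS pooling, as BGE models use
    return torch.nn.functional.normalize(cls, p=2, dim=1)

if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("BAAI/bge-large-en-v1.5")
    mdl = AutoModel.from_pretrained("BAAI/bge-large-en-v1.5")
    vecs = embed(["what is semantic search?"], tok, mdl)
    print(vecs.shape)  # expected: torch.Size([1, 1024])
```

Because the embeddings are unit-normalized, a dot product between two of them directly gives their cosine similarity.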
QPC Configurations¶
| Batch Size | Sequence Length | Cores | OLS | Generated URL | Download |
|---|---|---|---|---|---|
| 1 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/BAAI/bge-large-en-v1.5/compiled-bin-fp16-B1-C2-A7-best-throughput.tar.gz | Download |
| 4 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/BAAI/bge-large-en-v1.5/compiled-bin-fp16-B4-C2-A7-best-throughput.tar.gz | Download |
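The precompiled archives in the table can be fetched and unpacked with standard-library tools. This is a sketch only: the `fetch_qpc` name is illustrative, and the layout of the extracted QPC directory (and how it is loaded for inference) is defined by the Qualcomm Cloud AI SDK, not shown here.

```python
import tarfile
import urllib.request
from pathlib import Path

def fetch_qpc(url: str, dest: Path) -> Path:
    """Download a compiled QPC tarball from `url` and extract it into `dest`.

    `url` would be one of the Generated URLs from the table above.
    """
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / Path(url).name
    urllib.request.urlretrieve(url, archive)  # download the .tar.gz
    with tarfile.open(archive, "r:gz") as tf:
        tf.extractall(dest)  # unpack the compiled binaries
    return dest
```

For example, `fetch_qpc("<Generated URL>", Path("qpc/bge-large-b1"))` would leave the batch-size-1 binaries under `qpc/bge-large-b1/`.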