Multilingual-E5-large
Model Overview¶
The Multilingual-E5-large model is initialized from xlm-roberta-large and continually trained on a mixture of multilingual datasets. It supports the 100 languages of xlm-roberta, but low-resource languages may see performance degradation.
- Model Architecture: This model has 24 layers and an embedding size of 1024.
- Model Source: intfloat/multilingual-e5-large
- License: MIT License
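As a minimal usage sketch, the snippet below embeds texts with this checkpoint via Hugging Face `transformers` (assumed installed). Per the E5 convention, inputs are prefixed with `"query: "` or `"passage: "`, token embeddings are mean-pooled over non-padding positions, and the resulting 1024-dimensional vectors are L2-normalized; the `embed` helper and its parameters are illustrative, not part of this page.

```python
import torch
import torch.nn.functional as F


def average_pool(last_hidden_states: torch.Tensor,
                 attention_mask: torch.Tensor) -> torch.Tensor:
    """Mean-pool token embeddings, ignoring padded positions."""
    last_hidden = last_hidden_states.masked_fill(
        ~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


def embed(texts):
    # Downloads the checkpoint from the Hugging Face Hub (network required).
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large")
    model = AutoModel.from_pretrained("intfloat/multilingual-e5-large")
    batch = tokenizer(texts, max_length=128, padding=True,
                      truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    emb = average_pool(out.last_hidden_state, batch["attention_mask"])
    return F.normalize(emb, p=2, dim=1)  # unit-length 1024-dim vectors


if __name__ == "__main__":
    vecs = embed(["query: how are you?", "passage: Hello, world."])
    print(vecs.shape)
```

Cosine similarity between two such normalized vectors reduces to a dot product, which is the typical way these embeddings are compared for retrieval.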
QPC Configurations¶
| Batch Size | Sequence Length | Cores | OLS | Generated URL | Download |
|---|---|---|---|---|---|
| 1 | 128 | 4 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.20.4/intfloat/multilingual-e5-large/compiled-bin-fp16-B1-C4-A1-best-throughput.tar.gz | Download |
| 1 | 128 | 8 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.20.4/intfloat/multilingual-e5-large/compiled-bin-fp16-B1-C8-A1-best-throughput.tar.gz | Download |
| 1 | 128 | 16 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.20.4/intfloat/multilingual-e5-large/compiled-bin-fp16-B1-C16-A1-best-throughput.tar.gz | Download |