Multilingual-E5-Large
Model Overview
The Multilingual-E5-large model is initialized from xlm-roberta-large and continually trained on a mixture of multilingual datasets. It supports the 100 languages covered by xlm-roberta, though low-resource languages may see degraded performance. A usage sketch follows the list below.
- Model Architecture: 24 layers with an embedding size of 1024.
- Model Source: intfloat/multilingual-e5-large
- License: MIT License
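The following is a minimal Python sketch of computing embeddings with this model via the Hugging Face transformers API. Per the upstream model card, inputs must carry a "query: " or "passage: " prefix, and embeddings are obtained by mean pooling over non-padding tokens followed by L2 normalization; the example texts and variable names here are illustrative.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Model ID from the Model Source above.
tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large")
model = AutoModel.from_pretrained("intfloat/multilingual-e5-large")

# E5 models expect "query: " / "passage: " prefixes on input text.
texts = [
    "query: how much protein should a female eat",
    "passage: The recommended daily protein intake for women is about 46 grams.",
]

batch = tokenizer(texts, max_length=512, padding=True,
                  truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# Mean-pool the last hidden states over non-padding tokens.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
embeddings = F.normalize(embeddings, p=2, dim=1)  # unit-length vectors

# Cosine similarity between the query and the passage embeddings.
print((embeddings[0] @ embeddings[1]).item())
```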
QPC Configurations
| Batch Size | Sequence Length | Cores | OLS | Generated URL |
|---|---|---|---|---|
| 1 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/multilingual-e5-large/compiled-bin-fp16-B1-C2-A7-best-throughput.tar.gz |
| 4 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/multilingual-e5-large/compiled-bin-fp16-B4-C2-A7-best-throughput.tar.gz |
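Each row links a precompiled QPC archive for the listed batch size, sequence length, and core count. Below is a minimal Python sketch for fetching and unpacking the batch-size-1 archive from the table; the local file and directory names are arbitrary choices, not part of the SDK.

```python
import tarfile
import urllib.request

# Batch-size-1 URL from the table above.
url = (
    "https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/"
    "multilingual-e5-large/compiled-bin-fp16-B1-C2-A7-best-throughput.tar.gz"
)
archive_path = "multilingual-e5-large-qpc.tar.gz"  # local filename is arbitrary

# Download the archive, then unpack the QPC binaries into a local directory.
urllib.request.urlretrieve(url, archive_path)
with tarfile.open(archive_path, "r:gz") as tar:
    tar.extractall(path="multilingual-e5-large-qpc")
```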