Multilingual-E5-small
Model Overview
Multilingual-E5-small is initialized from microsoft/Multilingual-MiniLM-L12-H384 and continually trained on a mixture of multilingual datasets. It supports the 100 languages of xlm-roberta, but performance may degrade for low-resource languages.
- Model Architecture: 12 layers with an embedding size of 384.
- Model Source: intfloat/multilingual-e5-small
- License: MIT License
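The standard usage pattern for E5-family models is average pooling over the last hidden state followed by L2 normalization, with `"query: "` / `"passage: "` prefixes on the input text. A minimal sketch using the Hugging Face Transformers API (the example sentences are illustrative, not from this document):

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def average_pool(last_hidden_states, attention_mask):
    # Zero out padding positions, then mean-pool over the sequence dimension.
    masked = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return masked.sum(dim=1) / attention_mask.sum(dim=1)[..., None]

# E5 models expect "query: " / "passage: " prefixes on input text.
texts = [
    "query: how much protein should a female eat",
    "passage: The CDC recommends about 46 grams of protein per day for women.",
]

tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-small")
model = AutoModel.from_pretrained("intfloat/multilingual-e5-small")

batch = tokenizer(texts, max_length=512, padding=True,
                  truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**batch)

embeddings = average_pool(outputs.last_hidden_state, batch["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)  # unit-length 384-dim vectors
score = (embeddings[0] @ embeddings[1]).item()    # cosine similarity
```

The `max_length=512` here matches the sequence length used in the compiled configurations below.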
QPC Configurations
| Batch Size | Sequence Length | Cores | OLS | Generated URL |
|---|---|---|---|---|
| 1 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/multilingual-e5-small/compiled-bin-fp16-B1-C2-A7-best-throughput.tar.gz |
| 4 | 512 | 2 | default | https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/multilingual-e5-small/compiled-bin-fp16-B4-C2-A7-best-throughput.tar.gz |
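The archive filenames above follow a visible pattern: `B<batch>` for batch size and `C<cores>` for core count. A small shell helper, assuming that naming scheme holds (combinations beyond the two listed rows may not exist on the server), can build the URL and fetch an archive:

```shell
# Build the QPC download URL from batch size and core count.
# The B<batch>-C<cores> naming is inferred from the two table rows;
# only B1/B4 with C2 are confirmed to exist.
qpc_url() {
  local batch=$1 cores=$2
  echo "https://dc00tk1pxen80.cloudfront.net/SDK1.19.6/intfloat/multilingual-e5-small/compiled-bin-fp16-B${batch}-C${cores}-A7-best-throughput.tar.gz"
}

url=$(qpc_url 1 2)
# curl -LO "$url" && tar -xzf "$(basename "$url")"   # download and unpack
echo "$url"
```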