Jais 7B
Model Overview
The Jais family is a comprehensive series of bilingual English-Arabic large language models (LLMs), optimized to excel in Arabic while retaining strong English capabilities. We release two variants of foundation models:
- Models pre-trained from scratch (jais-family-*).
- Models pre-trained adaptively from Llama-2 (jais-adapted-*).
- Developed by: Inception, Cerebras Systems.
- Model Architecture: All Jais models are auto-regressive language models built on a transformer-based, decoder-only (GPT-3-style) architecture. The Jais family is trained on up to 1.6 trillion tokens of diverse English, Arabic, and code data. The models take text-only input and produce model-generated text as output.
- Model Source: inceptionai/jais-adapted-7b
- License: Apache 2.0
- Language(s) (NLP): Arabic (MSA) and English.
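The model can be loaded from the model source above with the Hugging Face `transformers` library. This is a minimal sketch, not an official usage snippet: it assumes `transformers`, `torch`, and `accelerate` are installed and that the machine has enough memory for a 7B model; the prompt string is illustrative only.

```python
# Minimal sketch of loading jais-adapted-7b with Hugging Face transformers.
# Assumptions: transformers + torch + accelerate installed, sufficient memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "inceptionai/jais-adapted-7b"  # model source from this card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model, run greedy generation, and return the decoded text."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" (via accelerate) places weights on available devices.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt; the model is bilingual, so Arabic prompts work equally well.
    print(generate("The capital of the UAE is"))
```

Since jais-adapted-7b is a foundation (base) model rather than a chat model, prompts should be phrased as text to be continued, not as instructions.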
QPC Configurations
| Full Batch Size | Chunking Prompt Length | Context Length (CL) | Devices | Generated URL | Download |
|---|---|---|---|---|---|
| 16 | 1024 | 4096 | 4 | https://dc00tk1pxen80.cloudfront.net/SDK1.18.2/jais-adapted-7b/qpc_16cores_1bs_1024pl_4096cl_-1mos_16fbs_4devices_mxfp6_mxint8.tar.gz | Download |
| 16 | 1024 | 4096 | 8 | https://dc00tk1pxen80.cloudfront.net/SDK1.18.2/jais-adapted-7b/qpc_16cores_1bs_1024pl_4096cl_-1mos_16fbs_8devices_mxfp6_mxint8.tar.gz | Download |
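The QPC archive names in the table encode their full compilation configuration (cores, batch size, prompt length, context length, max output sequences, full batch size, device count, and precision). The helper below is a sketch, not an official tool; the field names are my labels inferred from the filename segments shown above.

```python
# Sketch: decode a QPC archive filename from the table into its config fields.
# Field names (cores, bs, pl, cl, mos, fbs, devices, precision) are inferred
# from the filename segments, not from official documentation.
import re

QPC_PATTERN = re.compile(
    r"qpc_(?P<cores>\d+)cores_(?P<bs>\d+)bs_(?P<pl>\d+)pl_(?P<cl>\d+)cl_"
    r"(?P<mos>-?\d+)mos_(?P<fbs>\d+)fbs_(?P<devices>\d+)devices_"
    r"(?P<precision>.+)\.tar\.gz"
)


def parse_qpc_name(filename: str) -> dict:
    """Return the configuration encoded in a QPC archive filename."""
    match = QPC_PATTERN.fullmatch(filename)
    if match is None:
        raise ValueError(f"not a recognized QPC filename: {filename}")
    fields = match.groupdict()
    # All fields except the precision suffix are integers.
    return {k: (v if k == "precision" else int(v)) for k, v in fields.items()}
```

For example, the first archive in the table parses to full batch size 16, prompt length 1024, context length 4096, 4 devices, and MXFP6/MXINT8 precision.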