Stable LM 2 1.6B is a state-of-the-art 1.6 billion parameter small language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
This model's compact size and speed lower hardware barriers, allowing more developers to participate in the generative AI ecosystem.
In addition to the pre-trained and instruction-tuned versions, we release the last checkpoint before the pre-training cooldown, including optimizer states, to make fine-tuning and further experimentation easier for developers. Data details will be provided in the upcoming technical report.
Stable LM 2 1.6B can now be used both commercially and non-commercially with a Stability AI Membership, and you can test the model on Hugging Face.