Open source models will continue to grow in popularity. Small Language Models (SLMs) are smaller than their large counterparts and can be trained faster with less compute. They are well suited to tackling specific use cases at a lower cost.
SLMs can be more efficient
SLMs offer faster inference, and they also require less memory and storage than larger models.
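To make the memory claim concrete, here is a rough back-of-the-envelope sketch. It assumes weights stored in fp16 (2 bytes per parameter) and ignores activations and the KV cache, so the real footprint at inference time is somewhat higher:

```python
# Rough memory estimate for model weights alone.
# Assumption: fp16 storage (2 bytes per parameter); activations,
# optimizer state, and KV cache are not counted.
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    return num_params * bytes_per_param / 1e9

print(f"{weight_memory_gb(1.1e9):.1f} GB")  # a 1.1B-parameter SLM -> ~2.2 GB
print(f"{weight_memory_gb(70e9):.1f} GB")   # a 70B-parameter LLM  -> ~140 GB
```

A 1.1B-parameter model fits comfortably in the RAM of an ordinary laptop, while a 70B-parameter model needs server-class hardware just to hold its weights.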
SLMs and cost
Small Language Models can run on less powerful machines, making them more affordable. This makes them ideal for experimentation and for startups or small companies.
Here is a short list of notable SLMs:
- TinyLlama: a 1.1B-parameter model trained on 3T tokens.
- Microsoft's Phi-2: a 2.7B-parameter model trained on 1.4T tokens.
- Google's Gemini Nano: an on-device model shipped in 1.8B and 3.25B parameter sizes.
- DeepSeek Coder: a family of code-focused models ranging from 1.3B to 33B parameters.
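As an illustration, a small model like TinyLlama from the list above can be loaded and prompted on a laptop in a few lines. This is a minimal sketch using the Hugging Face transformers library; it assumes `transformers` and `torch` are installed, and uses the published TinyLlama chat checkpoint:

```python
# Minimal sketch: running a small model locally with Hugging Face transformers.
# Assumes `pip install transformers torch`; runs on CPU, no GPU required.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Small language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the weights are only a couple of gigabytes, the same script runs on commodity hardware where a 70B-parameter model would not even fit in memory.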