
Small Language Models


Open-source models will continue to grow in popularity. Small Language Models (SLMs) are smaller and faster to train, requiring less compute. They can tackle specific use cases at a lower cost.

Photo by Tobias Bjørkli via Pexels

 SLMs can be more efficient

SLMs offer faster inference, and they also require less memory and storage.
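To see why the memory requirements differ so sharply, here is a rough back-of-envelope sketch (an illustration, not a benchmark: it assumes fp16/bf16 weights at 2 bytes per parameter and ignores activations, optimizer state, and KV cache):

```python
def weight_memory_gib(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the model weights, in GiB.

    Assumes fp16/bf16 weights (2 bytes per parameter) by default.
    Ignores activation memory, optimizer state, and KV cache.
    """
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 1.1B-parameter SLM vs. a hypothetical 70B-parameter LLM, both in fp16:
print(f"1.1B model: {weight_memory_gib(1.1):.1f} GiB")  # ~2.0 GiB
print(f"70B model:  {weight_memory_gib(70):.1f} GiB")   # ~130.4 GiB
```

A ~2 GiB model fits comfortably on a laptop GPU or even a phone, while a 70B model needs multiple datacenter GPUs just to load, before any serving overhead.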

 SLMs and cost

Small Language Models can run on less powerful hardware, making them more affordable. This makes them ideal for experimentation, startups, and small companies.

Here is a short list of notable SLMs:

  • TinyLlama: a 1.1B-parameter model trained on 3T tokens.
  • Microsoft’s Phi-2: a 2.7B-parameter model trained on 1.4T tokens.
  • Gemini Nano: available in 1.8B (Nano-1) and 3.25B (Nano-2) parameter versions.
  • DeepSeek Coder: a family of code models ranging from 1.3B to 33B parameters.
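One striking pattern in this list: SLMs are often trained on far more data per parameter than classic scaling guidelines suggest. Using only the parameter and token counts quoted above, a quick illustration:

```python
# Training tokens seen per parameter, for the SLMs above
# where both figures are quoted (parameters, training tokens).
models = {
    "TinyLlama": (1.1e9, 3e12),   # 1.1B params, 3T tokens
    "Phi-2":     (2.7e9, 1.4e12), # 2.7B params, 1.4T tokens
}

for name, (params, tokens) in models.items():
    ratio = tokens / params
    print(f"{name}: ~{ratio:,.0f} training tokens per parameter")
```

Both ratios are far above the ~20 tokens per parameter associated with compute-optimal ("Chinchilla") training: small models are deliberately over-trained so that the resulting checkpoint is cheap to serve.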

