
Demystifying OKR Scoring


You have probably read that one of the many good things about OKRs is that they provide structure and clarity for working towards common goals. They help connect company, team and individual objectives to measurable results.

 

Photo by Garreth Brown via Pexels

In a previous Beolle article, Herak wrote about HOSKR and OKRs.

In this iteration, we will focus on OKR scoring.

Measuring the “How”

The KRs in OKRs are the Key Results. With them we measure progress towards the Objectives we have set. So how do we score them in a way that makes sense and measures success?

A few “gotchas” before we start

  1. Grades are an indication of where you're going.
  2. In OKRs, scoring between 0.6 and 0.7 is your target. Scores between 0.8 and 1.0 should be rare. If you find yourself completing all your OKRs within that range, then something is not right: for example, your Objectives are not ambitious enough, meaning you always knew you (or your company, or your team) were going to achieve them without much effort.
  3. Low grades are not to be punished.
  4. Scores matter less than the process. If gathering the scores takes so long that it prevents you, or the team, from “doing”, then reconsider and correct the approach.
  5. Company-wide scoring reinforces the commitment.
  6. The scoring is a guide to what to keep doing, what to change, and what to stop doing.
  7. The sweet spot for the number of KRs within the same Objective is three (3). Having said that, you could have more if that makes sense for your OKR.
  8. Focus on the results when scoring, meaning on fulfilling the outcome. It needs to be clear why you are giving that grade, so be ready to explain it.
  9. Always welcome your team members’ opinions.
  10. Publicly grade organizational OKRs. 

How to score the OKRs?

  1. Have regular meetings to check how the OKRs are doing.
  2. Assign the grades, where 0 means no progress and 1.0 means complete.
  3. Calculate a score for the Objective, which is the average of the KR grades (see the sketch after this list). Tip: my personal preference is keeping it simple by avoiding different weights. Keeping an equal weight for all KRs will simplify the use of the tool and the reporting.
  4. Once calculated, the team needs to review the scoring together. Chat about the challenges and blockers getting in the way of meeting the Objectives. This allows the team to identify areas of improvement and determine whether to stay the course or whether there are opportunities to pivot.
  5. Assign the work to teams and individuals to execute, and the cycle begins again.
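To make step 3 concrete, here is a minimal Python sketch of the equal-weight averaging. The function name and the sample grades are illustrative only, not part of any specific OKR tool.

```python
def objective_score(kr_grades):
    """Average the KR grades (each between 0 and 1.0) into an Objective score.

    Every KR gets an equal weight, matching the "keep it simple" tip above.
    """
    if not kr_grades:
        raise ValueError("An Objective needs at least one graded KR")
    return sum(kr_grades) / len(kr_grades)

# Hypothetical grades for an Objective with three KRs
print(round(objective_score([0.8, 0.5, 0.6]), 2))  # 0.63
```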

There are a few ways to work on the scoring. I will stick with the traditional grading scale approach.

You want to review your OKRs frequently. With proper tools, you can log your progress as close to real time as possible. Tracking often will help with accuracy.

In the traditional grading:

  1. You score your KRs on a scale of 0 to 1.0.
  2. Leverage colours to represent the rate of success (see the sketch after this list).
    1. Green is from 0.7 to 1.0. This is for a status of completed/delivered/DONE.
    2. Yellow is from 0.4 to 0.6. This is for “progress has been made, but we fell short”.
    3. Red is from 0 to 0.3. This is for “we have failed to make significant progress”.
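A minimal sketch of that colour banding, assuming grades are logged in 0.1 increments as in the scale below; the function name is illustrative.

```python
def kr_colour(grade):
    """Map a KR grade (0 to 1.0, in 0.1 steps) to the traditional traffic-light colour."""
    if not 0 <= grade <= 1.0:
        raise ValueError("KR grades fall between 0 and 1.0")
    if grade >= 0.7:
        return "green"   # completed / delivered / DONE
    if grade >= 0.4:
        return "yellow"  # progress has been made, but we fell short
    return "red"         # we have failed to make significant progress

print(kr_colour(0.5))  # yellow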

[Scale: KR grades from 0 to 1.0 in 0.1 increments, colour-coded red, yellow and green as described above]


Ideally, OKR scores fall between 0.6 and 0.7 (60 to 70 percent). If you are scoring below 0.6, your team is underperforming. If you are consistently scoring above 0.7 or completing 100 percent of the results, your goals might be insufficiently ambitious.

- Photo by Andrea Piacquadio


As an example, let us say you want to determine the monthly Objective score for Obj1, which has 3 KRs.
Obj1 score = (KR1 + KR2 + … + KRn-1 + KRn) / number of KRs
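With hypothetical grades of 0.8 for KR1, 0.5 for KR2 and 0.6 for KR3, Obj1 score = (0.8 + 0.5 + 0.6) / 3 ≈ 0.63, which lands in the ideal 0.6 to 0.7 range.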


Excel - OKR example.



What about tools?

Here are some examples:

Microsoft Excel Sheet

Smartsheet has good templates for OKRs.

Trello. Leveraging cards and the waterfall chart.

Hive.

Note from the author

As a SAFe certified professional, and a person who enjoys the Scaled Agile Framework, I find it is always great to revisit the SAFe material on Strategic Themes, which connect the enterprise strategy to the portfolio vision. There you will also find the relationship between Value Stream KPIs and the Strategic Themes.
This is relevant to this article because the two (2) frameworks are linked and complement each other, as OKRs are directly or indirectly related to the KPIs (see the image below).
If you want to read more about this connection, follow the value stream KPIs link.

Value stream KPIs are derived from strategic themes and local concerns

