Strategy & Planning

Measuring Skills & Goals

Measure skills and goals through assessments and reviews to create clarity and drive decision making.

Are you measuring skills and goals at your organization to capture the data needed to create clarity and drive performance and development decisions? If not, it's time to start.

In this article, we'll share the road map and best practices that will show you how to get this data using reviews and assessments.  But first, a quick reminder of what the Personalized Skill Development process is and how this step fits into it.

Personalized Skill Development

We've simplified this process into four repeatable steps (shown below) that drive exceptional performance by personalizing your workforce's development in alignment with what the business values.

  1. Define – define which skills are critical to driving exceptional performance
  2. Measure – measure skills and goals to create clarity and drive decision making
  3. Plan – plan development efforts using data to determine the highest pay-off opportunities in alignment with business goals
  4. Develop – develop using a cadence for learners and managers that keeps them focused on the most critical skills / outcomes

This Personalized Skill Development process will enable you to:

  • Define what success looks like
  • Identify critical performance gaps
  • Accelerate skill development
  • Drive exceptional performance

These steps are meant to be sequential and build upon each other. Fight the temptation to jump right into creating development activities for your workforce and instead implement this sustainable process so that your talent development efforts will drive exceptional performance.

For those just starting their research, our Personalized Skill Development Guide will give you an overview of the entire process. In our article, Define Your Organization's Critical Skills, we covered Step 1 by sharing how to first collaborate with business leaders to understand their objectives and then second, determine which skills and behaviors are critical to achieving those goals.

Now that you know which skills and goals are critical to success, this article will go into detail on how to accomplish Step 2: measure skills and goals to create clarity and drive decision making.

We will break this step down into two parts and tackle each in turn.

  • Administer – administer assessments / reviews to collect behavioral and performance data and feedback
  • Track – track data over time to set a baseline and track progress

Administer Reviews & Assessments

Administer reviews and assessments to your learners against the behavior-based skills model (e.g., a competency model) or goal expectations that you created in Step 1. This will give them the self-awareness that they need to develop and you the data that you need to identify performance gaps in your organization and determine where the highest payoff opportunities lie.

Before we discuss best practices related to administering reviews and assessments, however, we need to explain the difference between goals and skills, and between assessments and reviews.

Goals v. Skills

Goals and skills are two different ways to define "success" at your organization. Both are important and play different roles in creating clarity for your workforce.

Goals - goals define the "what": what needs to be accomplished. They are ideally measurable in terms of units, time, score, etc., and are time bound. SMART is a common framework used to structure goals. The Four Disciplines of Execution (co-authored by Sean Covey) states that the best goals fit the formula "from X to Y by when".
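The "from X to Y by when" formula lends itself to a simple data structure. Here is a minimal, illustrative Python sketch (the class, metric name, and numbers are hypothetical, not from any particular tool):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Goal:
    """A goal expressed in the 'from X to Y by when' form."""
    metric: str
    start: float    # X: where we are today
    target: float   # Y: where we want to be
    due: date       # by when

    def progress(self, current: float) -> float:
        """Fraction of the X-to-Y distance covered so far, clamped to [0, 1]."""
        covered = (current - self.start) / (self.target - self.start)
        return max(0.0, min(1.0, covered))

# Hypothetical example: move an on-time shipping rate from 80% to 90% by year end
goal = Goal("on-time shipment rate", start=80.0, target=90.0, due=date(2025, 12, 31))
print(goal.progress(85.0))  # 0.5 -- halfway from X to Y
```

Framing a goal this way makes it measurable by construction: any current value maps to a progress percentage against the baseline.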

Skills - skills define the "how": how we get there. They are ideally measurable in terms of demonstrable behaviors. Many organizations begin with values and then progress to other, more sophisticated behavioral models, such as competency models. Click HERE for a guide showing you how to build your own custom competency model.

A simple example of these two ways of defining success could look like this:

Rubia is in the role of an asphalt plant foreman and has a goal of shipping 90% of production orders when requested. Because this is such an important part of the customer experience at Rubia's company, her leadership has invested in a competency model for the role of plant foreman. Within that competency model, there are 12 skills, but two are particularly relevant to achieving this goal: Planning and Production. In this scenario, Rubia is given both a goal to be achieved ("what") and developable skills ("how") that will help her achieve that outcome.

Assessments v. Reviews

Assessments and reviews are two different processes to collect development and performance "data" at your organization. Both are important and play different roles in measuring success for your workforce.

Assessments – assessments combine feedback from one (self-evaluation) or more perspectives to understand where the subject of the assessment stands relative to a defined model (e.g., a leadership 360).

Reviews – reviews might include multiple points of feedback, but one score is ultimately decided upon, typically by the manager (e.g., a performance review).
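To make the distinction concrete, here is a minimal sketch of how the two techniques arrive at a score (the evaluators, 1-5 scale, and ratings are hypothetical):

```python
from statistics import mean

# Hypothetical ratings for one subject on a 1-5 scale
feedback = {"self": 4, "manager": 3, "peer_a": 3, "peer_b": 4}

# Assessment: feedback from one or more perspectives is combined --
# here, a simple mean across all evaluators
assessment_score = mean(feedback.values())

# Review: multiple inputs may inform it, but a single score is
# ultimately decided upon, typically by the manager
review_score = feedback["manager"]

print(assessment_score, review_score)  # 3.5 3
```

The assessment preserves multiple perspectives; the review collapses them into one decided rating.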

How organizations use these two different measurement techniques often depends on their maturity and what they value. In our experience, organizations typically start with reviews conducted by the subject's manager and loosely tied to performance expectations. This can be a challenging process for many employees, since often there are no clear definitions or metrics tied to these expectations. As an organization matures, however, more consistent performance expectations (goals and skills) get added.

As the organization continues to progress, it might begin leveraging assessments - often brought in by consultants - in support of leadership development. This is because leadership is less about "what my manager thinks" and more about "how do I show up to my stakeholders". Progressive organizations are beginning to realize that this is important not just for senior leaders, but for really any employee in a skilled position.

For this reason, we advocate for using both assessments and reviews in your organization. We recommend using reviews to measure goals for all employees and to measure skills for frontline employees.  Assessments are best used for measuring skills for skilled positions, managers and leaders because of the increased complexity and variety of stakeholders inherent in these roles. 

Using these approaches gives every employee in your organization clarity around the skills and goals required to be successful in their role.

If we continue with our example from above, it could look like this for Rubia:

Our asphalt plant foreman Rubia has a goal of shipping 90% of production orders when requested, but as of her last review, she was only averaging 80%. Instead of her manager simply telling her to "do better" or "work harder", they dug into Rubia's competency-based assessment and learned that while Rubia's team sees her as proficient in the skill of Production, they believe that she has an opportunity with the skill of Planning. She is only a level 2 "Basic" and she needs to be at a level 3 "Applied" to do the job well. Armed with this data-driven insight, Rubia can focus her efforts on developing her skill in Planning and applying what she learns to her workflow.
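The gap analysis in this example can be expressed as a small sketch (the skill names and levels mirror the example above; treating Production's required level as 3 is an assumption):

```python
# Hypothetical proficiency levels on the scale used in the example:
# 2 = "Basic", 3 = "Applied"
required = {"Planning": 3, "Production": 3}
assessed = {"Planning": 2, "Production": 3}

# A gap exists wherever the assessed level falls below the required level
gaps = {skill: required[skill] - assessed[skill]
        for skill in required
        if assessed[skill] < required[skill]}
print(gaps)  # {'Planning': 1}
```

Run across a whole team, the same comparison surfaces which skills to prioritize for development.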

Now that we have a common understanding of goals, skills, assessments and reviews, let's discuss the benefits, standard process and some best practices for administering reviews and assessments.


Reviews

We start with reviews because they are the most common. When used correctly, reviews are a very powerful tool for both the individual and the organization to gain insight into the progress being made against goals and skills.

Benefits of Administering Reviews

There are many benefits to conducting reviews, including:

  • creating clarity around the outcomes that the organization needs to achieve
  • increasing the value of manager-direct report performance conversations
  • capturing data that can be used to prioritize further actions

Review Administration Process

A standard process for administering reviews looks something like this:

  1. Communicate ahead of the review window to remind people that it is coming and exactly what you need them to do in preparation
  2. Conduct any pre-review training on the tool - ideally one designed specifically for this - that you will use to administer the review to your workforce
  3. Unlock the goals so direct reports can conduct their self-evaluations (some organizations skip this step)
  4. Direct reports complete their self-evaluations and "pass" them on to their managers
  5. Managers conduct their own evaluations of their direct reports
  6. Sometimes managers will collaborate at this point to "level-set" ratings among different peer groups / roles
  7. Managers conduct individual performance review sessions with direct reports
  8. Managers make final determination on score and submit

Best Practices for Administering Reviews

Since performance reviews are so common, we often forget how impactful they are on the employee and do not give the process the attention that it deserves. Here are some best practices to help you take it from a "check-the-box" exercise to one that will create alignment and engagement.

Begin the conversation by asking the associate what a successful review conversation looks like to them.  While this will probably catch them by surprise at first, it will help them relax and be more transparent.  It will also give you a measuring stick that they created to refer back to at the end of the conversation.

Ensure that managers use specific examples when sharing why they gave the subject the rating that they did.

Encourage managers to use language related to the organization’s values / beliefs when describing the behaviors they want to see from the associate.

Mid-year reviews can be streamlined compared to year-end reviews, but use the same rating scale so that you can easily compare the two review periods.

Always give the subject the opportunity to point out where they think they should focus developmental energy.  This is an important insight, even if the manager ultimately disagrees.           

Use the right tool for the job. ATLAS was designed to support this process (it’s even optimized for behavior-based models), but there are other options available. Map out your process and then find a software that will power it.

Traditional annual performance reviews carry a negative stigma in many organizations because they are so poorly executed. Since your organization likely already has some version of a performance review in place, we challenge you to review the list of best practices above and determine where you can start making incremental changes to improve it.

Never hesitate to contact us if you think we can help!


Assessments

Assessments are not as common and are often used only for senior leadership. Progressive organizations are beginning to realize, however, that when used correctly, assessments can be a very powerful developmental tool for gaining insight into more complex skills at the individual and group levels, and into the progress being made to close critical gaps and expand capabilities.

Benefits of Using Assessments

There are many benefits to conducting assessments, including:

  • creating a skills inventory at individual, group and organizational levels
  • increasing the value of manager-direct report developmental conversations
  • capturing data that can be used to focus developmental activities

Assessment Administration Process

A standard process for administering assessments looks something like this:

  1. Load your model into whatever tool - ideally one designed specifically for this - that you will use to administer it to your workforce
  2. Make any customizations that will add value or engage the learner (see best practices below)
  3. Run a test assessment to ensure that directions are clear and formatting is correct
  4. Communicate with the assessment subjects and their managers to ensure alignment on the purpose and process
  5. Collect the evaluator data and be sure to check it for accuracy and relevance
  6. Communicate with the evaluators to let them know the assessment request is coming and why it's important that they participate
  7. Launch the assessments at a time that works best for the evaluators 
  8. Leave the assessments open for enough time and leverage reminders to get a reasonably high completion percentage
  9. Close the assessments and publish the results

Best Practices for Administering Assessments

While the process for administering an assessment is relatively simple, it can be tedious and time-consuming without the proper tool, and there are a number of best practices to keep in mind.

Use the right tool for the job. ATLAS was designed to administer custom assessments (it’s even optimized for behavior-based models), but there are other options available. Map out your process and then find a software that will power it.

For a behavior-based model, make sure the subjects have reviewed it before they're being assessed against it.

Add open-ended questions to your model that will give the subject additional context and let the evaluator share something that the model does not address.

It’s best to get multiple points of view, but make sure that you're soliciting feedback from those who interact often with your learner. Don't sacrifice the quality of the feedback to hit a certain number of evaluators.

Organize evaluators into different relationship groups – manager, peers, direct reports, customers, etc. - that are meaningful to the subject and feedback.
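Grouping feedback by relationship can be sketched as follows (the relationships, 1-5 scale, and scores are hypothetical):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical 360 responses: (relationship to the subject, score on a 1-5 scale)
responses = [("manager", 4), ("peer", 3), ("peer", 4),
             ("direct_report", 2), ("direct_report", 3)]

# Group scores by relationship so the subject can see how each
# stakeholder group perceives them
by_group = defaultdict(list)
for relationship, score in responses:
    by_group[relationship].append(score)

summary = {group: mean(scores) for group, scores in by_group.items()}
print(summary)  # {'manager': 4, 'peer': 3.5, 'direct_report': 2.5}
```

A per-group summary like this is more actionable than a single blended average, because gaps often differ by audience (e.g., peers and direct reports may see the same skill very differently).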

Always launch assessments based on business rhythms.  If the evaluators tend to work early (construction), set it to launch very early.  If they tend to like night work (tech), launch midday.  

Set the due dates based on your audience. Give individual contributor roles a week to complete - if they can’t fit it into one week, another week won’t help.  For managers, leave the assessment open for about 2 weeks. 

Send reminders a few days before it is due and the day that it is due.    

If your organization or the role that you are working with is not familiar with using assessments, start with a self-assessment, then move to a 180 degree assessment (self + manager) and then finally to a 360 degree assessment (self + manager + direct reports / peers) if applicable.

180s can be enough for skilled individual contributor roles, but we highly recommend getting to 360s for anyone managing direct reports. As Mark Fernandes, a very successful human performance coach likes to say, “without feedback from others, we will quickly invent our own reality”. 

Assessments will give you direct insight into the human behaviors and perceptions that drive your organization’s skilled roles and people leaders.  Given how important these two groups are to your organization and culture, the multiple points of feedback in a 360 are critical to long-term success.

Key Takeaways

In this section we’ve explained how to use reviews and assessments to capture critical data on the behavior-based, skills models or goal expectations that you created in Step 1: Define Your Organization's Critical Skills. When done correctly, these evaluations will give your workforce the self-awareness that they need to take action and you the data to identify performance gaps in your organization and determine where the highest payoff opportunities lie.

These are not “one-and-done” evaluations, however.  You should implement a cadence where you are collecting at least semi-annual data on development and performance to create clarity and drive decision making. 

Track Data Over Time

Track data over time to set a baseline and track progress. We recommend conducting two rounds of reviews / assessments each year.

This cadence will depend on your organization; some even combine their assessments and reviews into a single process. In general, you want to implement a cadence where you are collecting at least semi-annual data on development and performance.

  • Any less and you will struggle to make data-driven adjustments within the year.  
  • Any more and you will start to get serious push back from your workforce.

Which combination of reviews and assessments you use typically depends on the group of employees in question. To keep things simple, we recommend the following approaches based on three different types of employees.

  • Frontline 
  • Skilled / Managers
  • Leaders / Succession   


Frontline

Frontline employees typically have a defined role with very specific tasks, no direct reports, and work under the direct supervision of a manager.

These roles tend to involve fewer skills (5-8), which can often be clearly defined and learned relatively quickly. At the same time, it can be challenging to set performance goals for a 6-12 month period for these frontline roles, so most organizations set behavior-based expectations that are captured in semi-annual reviews with their manager.

We agree with this approach, but recommend organizing these behavior-based expectations into skills based on roles.  That way you get a semi-annual skills inventory update and the frontline employee gets insight into which skills they should be working on, instead of just a subjective “meets expectations”.     

Finally, we encourage organizations to instill a cadence of very short conversations between the manager and direct report to inspire incremental development.  We recognize that this can be a challenging time commitment for most organizations’ frontline workforce, but with the right framework and tools, a 15 minute conversation can be very valuable.   

An annual cadence could look something like this:

Skilled / Managers  

This group is made up of skilled employees - who do not have direct reports, but do require more specialized skills than the frontline employees above - and managers who have direct responsibility for frontline employees.   

Behavior-based models tend to be more complicated for this group.  For skilled employees these models will include more technical / specialized skills for their roles and for managers they will include more soft / leadership skills.  Managers have an obvious need for soft / leadership skills, but organizations are realizing that their subject matter experts also need a basic level of soft skills in order to be effective.   

On the goal side, most organizations will begin to assign performance goals to these types of roles or expect them to create their own goals, often with the help of their manager.  These goals should adhere to the S.M.A.R.T. goal approach if possible and be set early in the year. 

We recommend tracking a combination of behavior-based skills and performance goals with this group. Here again, we recommend running this process twice per year to create clarity and show progress.

The check-in cadence between this group and their leaders should happen more often and take a bit longer than for frontline employees. We call this check-in a “30x30” since we recommend having a roughly 30-minute conversation every 30 days. These should be highly structured, collaborative and future-oriented.

With these two elements in mind, an annual cadence could look something like this:         

Leaders / Succession 

This group is made up of leaders and the succession pool that is being groomed for those leadership positions. Similar to the Skilled / Managers group, you should be tracking a combination of behavior-based skills and performance goals.

The difference with this group on the skills side is that leadership models tend to be much broader, accounting for the broad nature of senior leadership roles and the much wider variety of stakeholders these leaders interact with, including teams several layers deep.

Similarly, performance goals tend to be broader and longer term in nature, and have more to do with the leader’s ability to influence and engage diverse teams than with hitting highly specific operational targets.

With this in mind, we recommend leveraging 360 assessments and performance goals for this group.  

Once per year, a leader should be assessed against your organization’s full leadership model and evaluators should include representatives from the leader’s wide variety of stakeholders.  During the assessment debrief, the leader should choose - with the help of their coach / manager - a subset of skills to focus on developing for the year.  

At the mid-year, this leader should then be assessed against only this subset of skills.  This will keep the leader focused on the most critical skills that they need to be developing and not overwhelm their stakeholders with long assessments. 

Performance goals and check-ins would function similarly to the Skilled / Managers group, but would likely rely on a wider variety of metrics and require more time to discuss. Organizations will often use professional coaches in these conversations given the impact that these leaders have on the organization.

Key Takeaways

In this section we’ve explained how to use a semi-annual cadence of reviews and assessments to track data over time to set a baseline and track progress.

Since many organizations already do some type of performance review, the extra value comes from using this process to evaluate your employees against defined skills relevant to their role and then helping employees focus on developing those skills.

As you conduct these assessments and reviews over time, you will set a baseline skills inventory and then be able to show business leaders how development efforts are translating into skill level and performance increases.
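A baseline-and-progress comparison like this can be sketched simply (the rounds, skills, and proficiency levels are hypothetical):

```python
# Hypothetical semi-annual assessment rounds for one role's skill inventory
# (average proficiency levels on a 1-4 scale)
rounds = {
    "2024-H1": {"Planning": 2.0, "Production": 3.0},  # first round sets the baseline
    "2024-H2": {"Planning": 2.5, "Production": 3.0},
}

baseline = rounds["2024-H1"]
latest = rounds["2024-H2"]

# Progress per skill since the baseline
delta = {skill: latest[skill] - baseline[skill] for skill in baseline}
print(delta)  # {'Planning': 0.5, 'Production': 0.0}
```

Reporting the deltas, rather than raw scores alone, is what lets you show business leaders whether development efforts are actually moving the needle.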

Again, we highly encourage you to leverage technology to track all of this data so that it is a sustainable, scalable process across your organization.


At the beginning of this article we asked you if you’re measuring skills and goals to create clarity and drive decision making.  

Well, if you begin administering assessments and reviews to collect performance and development data and feedback, and then track this data over time to set a baseline and track progress, then you can answer YES!

You now have clear data that you can use to plan and develop against. Business leaders will begin to appreciate the power in integrating development and performance and will make you a true partner in achieving business outcomes.

Let’s wrap up with our example about Rubia from above:

Lucky for Rubia, her company uses ATLAS Navigator's software and it recommends a couple of e-learning courses that specifically target the behaviors that she needs to improve in the Planning skill. By the time her next review comes around, Rubia has begun to apply what she's learned and is now at a 92% on-time rate. Rubia has learned how to leverage developmental data to solve an individual performance challenge and her company has done the same to elevate the customer experience.

Now that you have the road map and best practices for measuring your efforts through assessments and reviews, we encourage you to check out our Guide to Personalized Skill Development to understand the rest of the process.

Happy developing and know that we're here if you get stuck and need help!

Make Personalized Skill Development Easy

If you’d love to drive exceptional performance at your own organization, but the process sounds tedious, let us help!

ATLAS Navigator is a technology company specializing in Personalized Skill Development. We offer a software tool that powers the entire process, making it easy to manage, and consulting to accelerate your progress.

Contact us directly for a free consultative session packed with insights and use any of the tools and thought pieces on our website to guide your own process.  

☎ 540.292.3394 | LinkedIn


