360 Assessment Management

Engaging Assessment Debriefs [Guide]

Learn how to lead a 360 assessment debrief in a way that engages the subject, focuses on development and mitigates negative reactions.

You've administered a round of assessments, the results are in, and now it's time to lead the assessment debrief with the subject and their manager.

You know there is a wealth of insights wrapped up in the assessment data, but you also know that the subject of the assessment is human and likely going to get some feedback they weren't expecting. How do you conduct this assessment debrief in a way that engages the subject, focuses on development and mitigates negative reactions?

Assessment debriefs are your opportunity to engage the subject and their manager by encouraging conversation, adding context and inspiring action. We provide our own collection of best practices here, but please adapt them to your organization's unique culture and any special considerations for the role and person being debriefed.

In this guide we will cover how to handle:

  • Preparing for the Assessment Debrief
  • Sharing the Assessment Results
  • Creating a Development Plan

Preparing for the Assessment Debrief

In general, we recommend that the subject should receive their results at the same time they meet with someone for the assessment debrief. The person reviewing the results with the subject could be their manager, someone from talent development or human resources, a mentor, coach, etc.

This person should be experienced in leading this kind of conversation and well prepared, particularly if this is the first time the subject has taken a multi-evaluator assessment.

We will use ATLAS Navigator's software and exportable PDF reports to illustrate our points, but the same process applies to other assessment tools as well. We will also primarily discuss multi-evaluator assessments because those tend to be the most complicated. You can still apply many of these concepts to an assessment that only includes the subject and their manager.

The Basics

When: Schedule 60-90 minutes to review the assessment.

Where: Ideally meet in-person in a quiet, private place. If you have to meet virtually, use video conferencing because reading body language is very important for a conversation like this.

Mindset: Frame the conversation in a positive manner. Communicate with the subject ahead of time, making it clear that you are looking to add context regarding the ratings as well as advise on developmental activities.

Identifying Trends

Review the subject's results prior to the assessment debrief. Start by looking at high-level trends, paying particular attention to the alignment between the subject's self-ratings and the average of the scores from the other evaluator groups.

In the example below, we call this "All Others," and it is an average of the Manager, Peer, Direct Report and Other ratings. Some assessment tools also let you set a proficiency target.

Identify the 4 key scoring alignment attributes:

  • Recognized Strengths – where both the subject and All Others’ ratings were at or above the target proficiency.
  • Unrecognized Strengths – where All Others was at or above the target while the subject self rated below the target.
  • Recognized Opportunities – where both All Others and the subject’s ratings were below the target proficiency.
  • Blindspots – where All Others was below the target while the subject self rated above the target.
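To make the decision rule behind these four attributes concrete, here is a minimal sketch in Python (illustrative only; the function and variable names are our own and not part of any assessment tool):

```python
def classify_alignment(self_rating, others_avg, target):
    """Classify a skill into one of the four alignment attributes.

    self_rating: the subject's self score for the skill
    others_avg:  the average score from all other evaluators ("All Others")
    target:      the proficiency target set for the skill
    """
    if others_avg >= target:
        # Others see a strength; does the subject see it too?
        return "Recognized Strength" if self_rating >= target else "Unrecognized Strength"
    else:
        # Others see a gap; does the subject see it too?
        return "Recognized Opportunity" if self_rating < target else "Blindspot"

# Example with a target proficiency of 3.0:
print(classify_alignment(4.2, 3.5, 3.0))  # Recognized Strength
print(classify_alignment(2.5, 3.5, 3.0))  # Unrecognized Strength
print(classify_alignment(2.0, 2.1, 3.0))  # Recognized Opportunity
print(classify_alignment(4.0, 2.1, 3.0))  # Blindspot
```

The key design point the attributes capture: the "All Others" average determines whether the skill is a strength or an opportunity, and the self rating determines whether the subject is aware of it.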

Other trends to look for include:

  • Evaluator Consistency – how consistent are ratings between the subject and others?
  • General Comments – are there any themes or surprises?
  • Historical Trends – review any trend data if the subject has taken this assessment before.

Sharing the Assessment Results

Now that you’ve reviewed the results yourself, it is time to move on to the main event - reviewing the results with the subject and their manager.  

It is important to start the assessment debrief session by setting the stage for the subject before moving on to the actual results. Assessments often include feedback that the subject was not expecting, and that feedback can be difficult to process in a developmental way if the conversation is not properly facilitated.

Setting the Stage

Emphasize the purpose of the assessment as a developmental tool. Acknowledge that the assessment results are subjective, but they do represent how others perceive the subject.

The nature of multi-evaluator feedback is that it allows the subject to compare their own perception against that of their manager, direct reports, peers, and others. Also point out that multi-evaluator feedback does not necessarily measure business results. Instead, it measures behaviors: "how" results are achieved.

Establish roles in the assessment debrief. As the manager / coach, your job is to guide the discussion while the subject needs to be open and curious about the feedback. Once you've reviewed the feedback together, you will collaborate on creating a plan of action.

Before showing them the results, ask the subject what a successful conversation would look like for them. Don't be surprised if you get a blank stare the first time, since most employees have never been asked this. This is a trust-building opportunity, however, so don't lose it. Dig a little.

Be ready with a few bullets of your own on what success looks like and plan to refer back to it at the end of the conversation to see if you both achieved success.

Navigating the Results

Every assessment tool will have its own way of displaying assessment results. We'll use examples from ATLAS Navigator's platform and exportable PDF report here, but make sure that whatever assessment tool you use provides access to the same key data shown below.

On the primary digital dashboard, the subject can see (1) the category view of their most recent assessment, (2) a snapshot of their development plan, and (3) their overall proficiency score.

We'll start with the category view of the assessment data in the left hand tile. Here, the subject can see how their self rating compares to the average rating of others in each area.

This will give the group a general sense of whether the subject tends to be overconfident or underconfident in their self-perception. You would be surprised how often high potentials don't see what others see in them.

Skills Overview

Next, drill down into each category to see all the skills that make up that category.

Here, the group should look at the same four alignment attributes shared above: Recognized Strengths, Unrecognized Strengths, Recognized Opportunities and Blindspots.

Questions To Ask the Subject:

  1. How do you read this page? Do the scores seem about right?
  2. Are there any surprises? Disappointments?
  3. Do you see any common themes in the scores?
  4. What is the overall message about your strengths reflected in the numbers?

Individual Skill Review

At this level, spend time reviewing how each group rated the subject. Below are the key points / trends you should review with the subject to better understand the story in the data.

Consistency – How consistent are the ratings between the different evaluator groups? Does each group tend to agree in its rating of the subject, or is there a wide range of ratings? Depending on how robust the tool is, you can even see how many evaluators in each group gave certain scores (while still keeping the feedback anonymous).

A lack of consistency could be caused by a variety of reasons. It is possible that the subject shows up differently for each group. Another explanation is that some evaluator groups might have greater experience and exposure to this particular skill.

As the coach in the debrief, your job is to get curious and ask questions about why the subject believes there is evaluator inconsistency.

Company Average – how the subject's skill rating compares to the internal company average for that skill.

360 Score – the average rating from all evaluators who completed the assessment.

Target – the company-established target that determines the proficiency level required for the skill to be considered a strength.

Gap – the difference between the 360 score and the target.

Just Others – the average skill rating with the subject's self rating excluded.

Trend Data – progress over time, shown when an individual completes the assessment multiple times.
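As a rough illustration of how these metrics relate to one another, here is a sketch (our own naming, using a simple unweighted average; actual tools may weight evaluator groups differently, and whether the self rating counts toward the 360 score can vary by platform):

```python
def skill_metrics(self_rating, other_ratings, target):
    """Illustrative calculation of the skill metrics described above.

    self_rating:   the subject's own score for the skill
    other_ratings: list of scores from all non-self evaluators
    target:        the company-established proficiency target
    """
    all_ratings = [self_rating] + other_ratings
    score_360 = sum(all_ratings) / len(all_ratings)        # 360 Score: all evaluators
    just_others = sum(other_ratings) / len(other_ratings)  # Just Others: self excluded
    gap = score_360 - target                               # Gap vs. the target
    return {
        "360 Score": round(score_360, 2),
        "Just Others": round(just_others, 2),
        "Gap": round(gap, 2),
    }

# Example: self rating of 4.0, three other evaluators, target of 3.5
print(skill_metrics(4.0, [3.0, 3.5, 2.5], 3.5))
# → {'360 Score': 3.25, 'Just Others': 3.0, 'Gap': -0.25}
```

A negative gap, as in this example, means the skill falls short of the target and is a candidate development area; comparing the 360 Score to Just Others shows how much the self rating pulls the average up or down.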

Questions To Ask the Subject:

  1. How did you rate yourself relative to the way others rated you?
  2. What patterns or themes do you see? What do you make of them?
  3. How do people see you differently depending on your working relationship with them?
  4. Why would a particular group rate you this way, versus the way others rated you?

Capability / Competency Models Specifically

For assessments that utilize a more robust rubric rating scale, ATLAS Navigator's platform has a popup that enables the group to refer back to the multi-level rubric.

After reviewing the ratings, you can click on the rubric icon to display the behavioral anchors for each proficiency level. These can be helpful in deciding what behaviors the subject should focus on maximizing or improving.

Verbatim Comments

So far, the feedback has been in data form, but a good assessment will include prompts for the evaluators to provide open form feedback to the subject on individual skills or general comments at the end of the assessment.

When reviewing these verbatim comments with the learner, do:

  • Look for any themes that stand out
  • Use the comments to add context to the numerical ratings
  • Remember that these are other people's perceptions
  • Note any areas of agreement and celebrate them - awareness is the first step toward development
  • Dig into areas of disagreement and stay curious - the goal of the conversation is to create clarity, not to argue over who's right and wrong

Don't:

  • Try to guess who said what
  • Focus only on the negative
  • Immediately dismiss any information as inaccurate

Creating a Development Plan

The final and most important step of the assessment debrief is to inspire the subject to take action on the feedback. It's important to land this next step at the point of awareness, before the subject gets sucked back into the daily demands of their job.

Decide Where To Focus

With the coach's guidance, the subject should select one or two skills / behaviors to work on in the next 30 days. It is important that you do not overwhelm the subject with too many development goals or encourage procrastination by setting deadlines too far out.

Questions To Ask the Subject:

  • Based on your assessment results, what are the main messages (overall, by specific evaluator groups, across multiple groups, etc.)?
  • What feedback is most critical to your success?
  • What do your key coworkers see as opportunities?
  • Which one or two changes would have the greatest impact on your success and job satisfaction?
  • Are there any immediate quick wins that would be meaningful?

Building the Plan

Finally, decide exactly how the subject is going to work to improve these skills / behaviors.  Get creative, make it collaborative and engage them in the process.

These are great opportunities for on-the-job training, coaching from another employee who is an expert in this skill, or even formal activities like classes, books and training courses.  

Whatever it is, set realistic expectations around the developmental activity and how you want to see it show up in their behavior.  It is critical that the manager commits to support and challenge the subject around the skills and development activities that were chosen.    

Here again, this can be done manually, or you can use our software tool to build a dynamic individual development plan as you review the assessment results. The key is to keep it focused and write it down!

Linking To Learning

ATLAS Navigator's tool is uniquely designed to encourage this behavior because it enables the HR / Talent Development team to suggest and guide the subject to developmental activities (using URL linking) right in the feedback dashboard (right hand, lower tile).

The subject can choose from the developmental activities that have already been linked into the system and add them to their development plan with one click. These activities can be anything from content in your LMS to eLearning courses hosted in a third-party database like LinkedIn Learning.

The subject or manager can also add custom activities to the development plan as well, thereby capturing the informal learning opportunities that are so critical to any development program.

These activities have been curated for employees with that particular score on that particular assessment. In the example above, the subject's 360 score puts them at a level 2 for the skill of Leveraging Diversity. Someone getting a 360 score of 4 would likely see more advanced learning activities to choose from.

This level of personalization is key to engaging the subject and ensuring that they are utilizing the best developmental activities available to them.

30x30 Developmental Cadence

Maintain this momentum by establishing a developmental cadence: the subject meets with their manager for 30 minutes every 30 days to review progress on committed learning activities and pick one or two new ones. It is best to focus on incremental improvements.

Questions To Ask the Subject:

  • What progress did you make on what you committed to in our last conversation?
  • What is the commitment for this next period?
  • How can I help? 

If their manager holds them accountable by consistently asking these three questions every 30 days, subjects will be amazed at their progress by year end.

Finally, reassess your subjects with the same assessment over time to show how your development efforts are impacting the subject and business objectives. More on how and when to evaluate your workforce can be found in our article, Measuring Skills & Goals.

Key Takeaways

At the beginning of this article we shared that assessment debriefs are your opportunity to engage the subject by encouraging conversation, adding context and inspiring action.  

3 Keys to Engaging the Subject:

Encourage conversation by framing the debrief in terms of development, asking the subject to define success and asking them questions throughout the debrief.

Add context by identifying and sharing trends, staying curious and connecting verbatim feedback to evaluator ratings.

Inspire action by using this newfound self-awareness to personalize a development plan that meets your subject where they are and is designed to propel them along their journey.

Now that you have the road map and best practices for engaging your workforce using assessments, we encourage you to check out our Guide to Personalized Talent Management for more ways to achieve exceptional performance by personalizing your talent strategy.


Happy developing and know that we’re here if you get stuck and need help!

If you're interested in exploring how ATLAS Navigator's software is optimized specifically for powering 360 assessments, set up a free Insights Call with us! 

Info@ATLASnav.us | www.ATLASnav.us | LinkedIn

