Join Todd Dewett for an in-depth discussion in this video Providing effective 360 reviews, part of Performance Review Foundations.
Today, 360 evaluations are very common. These tools, also called multi-rater or multi-source feedback, were originally about employee development, not performance evaluation. The idea was to triangulate the truth about an employee by using more than simply their boss as a feedback source. By gathering input from their peers, direct reports, boss, and customers, along with their own self-ratings, you should get closer to the reality of that person's performance.
This data can then be used in mapping out development paths moving forward. People like data, so it wasn't long before organizations started finding many uses for 360 evaluations, including employee development, performance appraisals, compensation decisions, and organizational development. However, the most common uses are still employee development and appraisals. And there are a few clear advantages to using 360 evals. First, there's no doubt that 360 ratings provide a better, broader perspective on employees.
The resulting perspective is inevitably richer than having only the person's boss provide feedback. In addition, for most employees it's eye-opening to learn what others see. It builds perspective and reminds us that others don't always see us the way we see ourselves. Yet another advantage is that the data collected is, in and of itself, proof that decisions were not arbitrary. Yes, we can debate the quality of the data collected, but from a liability perspective, decisions backed by data tend to be safer.
Having said that, many scholars and practitioners have noted serious problems with 360 evals. The first deals with validity and reliability. Without deep knowledge of survey construction and statistics, ensuring that your items consistently measure what you think they're measuring is very difficult. It's not uncommon for 360 users to complain that items don't relate to competencies, or that too many items are redundant. The trick here is using legitimate experts, whether internal or external, to build your 360 instrument.
The next common issue is sample size. How many raters will be participating? There are two major issues here. First, will you have a sufficient number of raters to ensure anonymity? You can say the process is anonymous, but the smaller the number of participants, the easier it is for people to figure out who said what. A related question is about expertise. Everyone in an organization has a different job. Your raters need a fairly detailed understanding of the jobs they'll evaluate, but we all know that's not always the case.
Another huge problem is the tendency of 360s to focus on what's wrong with the person. We identify where they are not amazing and think about how to develop those areas. That's not wrong per se, but most observers think a better plan is to use tools like this to identify clear strengths, and then focus on how to better leverage those strengths as part of the person's development. One of the most common complaints about 360s is the amount of work they require. If you have a massive competency model underlying your 360 process, and several hundred items designed to measure every competency, you can spend many hours per employee just filling out the 360.
No one likes that. So what should you do in the face of these challenges? In short, the most effective 360s proactively push out needed information to answer questions before they arise. They rely on shorter evaluations instead of longer ones. And they provide access to coaches or other resources, so that employees know where to go with questions. Companies remain torn about using 360s for development versus real decision making, such as appraisals. In either case, you increase your odds of success by remembering these three rules.
First, let's think about being proactive. Before a 360 begins, push out answers to these questions. What is a 360, and exactly how is it being used? Who is participating? What choice do I have in choosing raters? How long will it take? Is anonymity guaranteed, and how? Who will see my results? How accurate is this process? How do I benefit from this process? I want you to use multiple communication channels to proactively get this information into the hands of all employees.
And encourage every manager to informally talk through these issues, to clarify why we're doing this and to increase support for the process. Next, and this is a big one, keep it short. If you ask raters to fill out hundreds of items, you're asking them to become fatigued, and that's when you get truly poor data. Strive for less, strive for focus. Many organizations successfully use 30 to 50 items, not hundreds. Finally, as the process unfolds, be sure to push out information about any and all support resources for participants.
These might include their boss, online information, dedicated 360 coaches, or HR reps. The more they feel supported, the more seriously they'll engage with the process. 360s are no different from any other tool. They have some real benefits, and yet they're far from perfect. Make your use of 360s count by following the advice we just discussed. That way, you'll be supporting a feedback system that helps people improve without overburdening the very people you wish to help.
Lynda.com is a PMI Registered Education Provider. This course qualifies for professional development units (PDUs).
The information contained in the following course is provided with the viewer's understanding that the course should not be used as a substitute for consulting a human resource professional at your company for specific guidance. Lynda.com and LinkedIn expressly disclaim liability for any damages, loss, or risk incurred as a direct or indirect consequence of the use and application of any content herein.
The PMI Registered Education Provider logo is a registered mark of the Project Management Institute, Inc.
- Understanding the performance cycle
- Setting performance goals
- Collecting performance data and feedback
- Writing the review
- Discussing performance with an employee
- Using a performance improvement plan (PIP)