AI Writes Your Performance Review. What Could Go Wrong?

According to Bloomberg Business, JPMorgan Chase & Co. is now allowing its managers to use artificial intelligence to help write employee performance reviews. The move aims to relieve managers of one of their most dreaded annual tasks, and the technology could potentially deliver more useful feedback than what employees sometimes get from human managers alone. However, experts warn that outsourcing too much of the process could turn reviews into what they call “AI workslop.” Benjamin Levick, who leads AI at corporate card company Ramp, noted that this technology shift is redrawing boundaries, and acknowledged the risk of reviews feeling dehumanizing if the AI implementation is clunky.

The Rise of AI Workslop

Here’s the thing about AI writing performance reviews: it’s basically outsourcing judgment. Managers already hate writing these things, and now they have an easy button. But what happens when the AI generates feedback that’s generic, overly positive, or just plain wrong? We’re talking about people’s careers and compensation here. And let’s be honest: a manager drowning in work isn’t going to carefully edit that AI-generated review. They’re going to hit “send.”

The Human Touch vs Efficiency

Look, I get the appeal. Writing performance reviews is tedious and time-consuming. Companies like JPMorgan are probably thinking about all the manager hours they can save. But there’s a huge difference between using AI to brainstorm points and letting it write the whole thing. The best feedback comes from genuine human observation and conversation. Can an algorithm really capture the nuance of how someone handles client relationships or mentors junior team members? Probably not.

It’s Already Happening Underground

What’s really interesting is that managers are already doing this, whether companies approve or not. People are making up their own rules with ChatGPT and other tools, so the cat’s out of the bag. The question isn’t whether AI will be involved in performance reviews; it’s how companies will manage quality control. Without clear guidelines, we’re headed for a mess of inconsistent, potentially biased, and decidedly generic feedback across organizations.

Where This Is Headed

I think we’re going to see a lot of experimentation, and probably some spectacular failures, before companies figure this out. The risk isn’t just bad reviews; it’s employees feeling like their careers are being managed by algorithms, which is a fast track to disengagement. Companies need to be very careful about how they implement this. Used as a tool to enhance human judgment? Maybe. Used as a replacement for it? That’s a recipe for disaster.
