The Empathy Trap - Why working with AI agents is "easier" (and why that's concerning)
Earlier this week, during a deep-dive discussion about Artificial Intelligence and Business Process Optimization (BPO), I had a sort of epiphany. In the heat of the conversation, I impulsively blurted out a sentence that immediately shocked me:
"It's easier for me to work with AI Agents, as I don't have to deal with human sensitivity."
The moment the words left my mouth, I froze (mentally).
And as I sat with the discomfort of my own statement, I had to confront an undeniable reality: from a pure workflow perspective, it is true.
Sure, AI is far from perfect. It makes glaring errors, it hallucinates facts, and it regularly requires close supervision and correction. But there is a massive difference in how those errors are resolved. When an AI messes up, I can be as blunt, direct, and entirely unfiltered as I want (it's not needed and burns useless tokens, so it's more of an internal monologue 😀). An AI agent never reacts negatively. It doesn't get angry, defensive, or upset if my tone is perceived as harsh or inappropriate (unless you prompt it to).
Working with humans, on the other hand, requires a constant, underlying expenditure of cognitive energy. You have to constantly monitor your behavior, carefully phrase feedback to avoid bruised egos, and navigate the exhausting tightrope of remaining "politically correct." Human collaboration is beautiful, but it is also full of emotional friction. With an AI, that friction vanishes completely.
I fully realize how concerning it is to view work through this lens. In fact, if you look at the prospect of replacing human interaction with machines simply to avoid dealing with feelings and don't find it deeply unsettling, you likely suffer from a disturbing lack of empathy. We are social creatures; our "sensitivities" are what make us human.
Yet, when we put on our business hats, the brutal mechanics of efficiency take over.
As a consultant, my primary job is to improve efficiency. When you are engaged in Business Process Optimization, your fundamental goal is to identify and remove burdens to that efficiency. You are hunting for friction in the system. And whether we like to admit it or not, the management of human emotions, the miscommunications, the ego clashes, and the necessity for delicate diplomacy are all sources of friction that demand additional energy.
When a technology comes along that can perform a task without requiring that emotional overhead, the economic pull toward it becomes a force of nature. In the corporate universe, maximum efficiency is the equivalent of maximum entropy. Just as the Second Law of Thermodynamics dictates that a physical system will inevitably drive toward maximum entropy - a flat, uniform state of equilibrium where all energy variances are smoothed out - Business Process Optimization relentlessly drives toward its own sterile (and auditable) equilibrium. Our human quirks, our emotional needs, our casual banter, and yes, our sensitivities, are irregular, unpredictable spikes of energy. The cold law of optimization dictates that to achieve maximum efficiency, this perfectly frictionless state, all human fluctuations must be leveled out completely.
I wrote about this inevitable collision between the imperative of business efficiency and human systems previously in my thoughts on the 2026 Tipping Point of the Social Innovator's Dilemma.
This is exactly why the disruption of Wertschöpfung (value creation) is an ongoing, irresistible process. Businesses will always gravitate toward the path of maximum efficiency, and our social systems have started to follow the same rules. With AI, we aren't just automating tasks; we are erasing emotional labor from our value chains.
It might be "easier" to work without human sensitivity, but I am not sure that I will like the world we are being pushed toward.
Live long and prosper 😉🖖