GUANGXI ZHUANG, CHINA — A study conducted by researchers affiliated with Guangxi Power Grid and Chinese universities, involving 13 customer service representatives (CSRs), finds that AI assistants, while promising efficiency gains, often introduce new challenges for the workers they are meant to support.
Workers report inaccuracies, additional verification tasks, and unreliable emotion recognition, problems that undermine the technology’s intended benefits.
AI assistants fall short in real-world customer service
While these assistants were designed to streamline workflows—reducing typing and memorization tasks—they frequently introduce errors, forcing workers to manually correct transcriptions, verify data, and ignore unreliable emotion analysis features.
One CSR noted, “The AI assistant isn’t that smart in reality.”
“It gives phone numbers in bits and pieces, so I have to manually enter them,” the CSR added, highlighting the system’s inability to handle even basic tasks accurately.
The research found that AI tools often misfire, requiring constant human intervention. False alerts on customer emotions, mismatched organizational standards, and fragmented data outputs create more work rather than less.
“If we don’t manually verify and adjust it, and just go with the system’s default suggestion, there will be many errors,” another respondent said. Rather than simplifying jobs, the AI forces CSRs to adapt to its shortcomings, adding unexpected layers of labor.
Emotion recognition flaws undermine AI’s potential
One of the most criticized features of the AI assistants was their emotion recognition capability, which frequently mislabeled neutral conversations as negative. CSRs reported that the system lacked sufficient emotional tags and often misinterpreted normal speech intensity, resulting in false alerts.
As a result, many workers disregarded the emotion analysis entirely, rendering a key selling point of the technology useless.
The study concludes that while AI can enhance efficiency in theory, its real-world implementation often fails to deliver.
“While the AI enhances work efficiency, it simultaneously increases CSRs’ learning burdens due to the need for extra adaptation and correction,” the report states.
The AI assistants give rise to what the study describes as learning, compliance, and psychological burdens placed on workers who must continually compensate for the system’s weaknesses.
This raises the question of whether existing AI tools genuinely assist employees or merely redistribute labor without eliminating it. Nevertheless, careful review of the tools themselves, thoughtful integration into existing systems, and sustained human oversight may allow AI to be deployed across fields without compromising outcomes.