Let me tell you something your vendor is praying you never find out.
The shiny "agent wellbeing" dashboard they pitched you last quarter, the one with the emoji faces lighting up next to your call center agents' names, the one that promised to revolutionize employee engagement by reading emotional state from voice data? It's been illegal across the entire European Union for well over a year.
Not restricted. Not heavily regulated. Not subject to a voluntary code of conduct. Illegal. Banned outright. Article 5(1)(f) of the EU AI Act, in force since February 2, 2025.
And here's the truly outrageous part. A shocking number of UC and contact center vendors are still selling it. Still demoing it at trade shows. Still writing it into enterprise contracts. Still, astonishingly, claiming in sales meetings that it's a competitive differentiator.
It isn't a differentiator. It's a €35 million fine waiting to land on somebody's desk. And unless you're paying very close attention, that desk might be yours.
The truth is, emotion AI at work is no longer a product category in Europe. It's a violation of fundamental rights. That's not my opinion. That's the law.
The Dirty Little Secret the Enterprise Software Industry Doesn't Want You Reading
For the best part of a decade, one pitch has run through the enterprise AI market. It went something like this. Managers could finally see the unseeable. The inner life of the workforce could be measured. A well-designed algorithm could tell a team leader how their people were really feeling, without anyone ever actually having to, you know, talk to them.
It was always a creepy proposition. It implied the best way to understand a human being was to stop talking to them and start analyzing their face. But it sold. Boy, did it sell. Sentiment overlays on video calls. Vocal stress analysis on agent lines. Wearables that scored employee focus from heart rate variability. Facial expression AI that graded customer service reps on how sincerely they smiled.
The European Union has now written into law the view that this was never a product category at all. It was a breach of human dignity at work, dressed up in dashboard design.
You can argue with the reasoning. You cannot argue with the fine.
What the EU Actually Banned, and Why Your Compliance Team Should Already Be Panicking
Here's what Article 5(1)(f) actually says, in plain English. Any AI system that infers the emotions of a person in a workplace or educational setting is prohibited. Full stop. The only exceptions are narrow carve-outs for medical or safety purposes, like detecting driver fatigue in a logistics fleet.
The ban applies to providers, meaning the vendor selling the software. It applies to deployers, meaning the employer using it. And crucially, it applies regardless of where the vendor is headquartered, as long as the system touches people in the EU.
Is your contact center platform taking calls from Hamburg or Madrid? You're in scope.
Does your wearables program include operations in Dublin or Milan? You're in scope.
Is your collaboration suite used by employees sitting anywhere in the European Economic Area? You're in scope. And so is your vendor.
"Seven percent of global turnover. Whichever is higher. That's the fine tier reserved for the very worst AI practices the European Union can imagine. And workplace emotion recognition sits right there, next to social scoring and subliminal manipulation. Let that sink in."
The Date Every CIO Should Have Had Circled in Red Ink
February 2, 2025. That's the day Article 5 came into force.
That was more than a year ago. A year in which vendors could have quietly ripped the feature out of European builds. A year in which legal teams could have written client advisories. A year in which buyers could have been told, honestly, that a chunk of what they were paying for was now unlawful.
Instead, much of the industry has responded with a masterclass in looking the other way. No press releases. No product recalls. No "important update regarding your deployment" emails. Just a quiet hope that nobody gets around to enforcing it until August 2026, when the rest of the AI Act rolls in and the noise gets louder.
Hope, I'm afraid, is not a compliance strategy.
The European Commission's November 2025 review of the AI Act specifically declined to soften the prohibited practices list. The bans are staying. The Irish Workplace Relations Commission, of all regulators, will enforce the workplace emotion recognition prohibition in Ireland. France's CNIL is handling it domestically. Complaints are being filed. The first major enforcement case is expected this year.
Your vendor has had 14 months. What have they actually done about it?
Emotion AI vs Sentiment Analysis: The Difference That Will Decide Who Gets Fined
This is where smart buyers need to get very precise, very fast.
The AI Act bans the inference of emotions from biometric data. That's voice, face, gait, physiological signal, keystroke rhythm. Anything where the system reads a body and draws an emotional conclusion.
It doesn't ban the detection of readily apparent physical states. A tool that notes a person is smiling, without drawing a conclusion about whether they're happy, is lawful. A tool that concludes they're happy is not.
It also doesn't ban text-only sentiment analysis. Scanning written support tickets or chat logs for positive and negative tone is not an emotion recognition system under the Act, because it doesn't use biometric data. That distinction alone is going to decide which features survive in European product builds and which get quietly buried.
Here's a useful test. If your vendor is selling you "voice-based agent mood detection," that's a banned feature. If your vendor is selling you "written ticket sentiment scoring," that's probably fine. If your vendor is selling you "facial expression engagement analytics" on Teams calls, that's a banned feature. If your vendor can't tell you which category their product falls into, find a better vendor.
"If your vendor can't explain, in writing, whether their product infers emotion from biometric data, you already have your answer. And it isn't the one you want."
The Contact Center Time Bomb Nobody in UC Wants to Defuse
Brace yourself, because this is where it gets genuinely messy for UC Today readers.
The AI Act splits emotion recognition into two buckets, and they sit in dramatically different legal boxes.
Emotion inference applied to your employees: prohibited. Seven percent of global turnover fine tier. Article 5(1)(f).
Emotion inference applied to your customers: high-risk, not banned. Permitted, but subject to extensive compliance requirements coming fully into effect in August 2026.
Now picture the average modern contact center deployment. A single voice analytics engine sits on the call. It listens to both parties. It produces outputs for both. The vendor probably sold it on a combined pitch of "customer sentiment insights" and "agent coaching and wellbeing monitoring."
In any European deployment, that architecture is now split down the middle by the AI Act. The customer-facing half needs to be fully compliant by August 2026. The agent-facing half has been outright illegal since last February.
Which means, practically, a huge swath of contact center software deployed across European operations needs to be reconfigured, restricted to text-only features, or switched off entirely on the agent side. Ask your vendor, today, which side of that split their product sits on. Ask them to put the answer in writing. If you don't get an answer, or the answer is evasive, you know what you're dealing with.
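The triage logic the article walks through has only two axes: does the feature infer emotion from biometric data, and is the subject an employee or a customer? A minimal sketch, purely illustrative and not legal advice; the function name and the category labels are my own assumptions, compressing the article's reading of the Act:

```python
def classify_feature(uses_biometric_data: bool, subject_is_employee: bool) -> str:
    """Rough triage of an emotion-analytics feature under the EU AI Act.

    uses_biometric_data: True if emotion is inferred from voice, face, gait,
        physiological signals, or keystroke rhythm.
    subject_is_employee: True for workers (or students), False for customers.
    """
    if not uses_biometric_data:
        # Text-only sentiment analysis falls outside the emotion-recognition ban.
        return "lawful (not emotion recognition under the Act)"
    if subject_is_employee:
        # Article 5(1)(f): workplace emotion inference from biometric data.
        return "prohibited since 2 February 2025"
    # Customer-facing emotion inference: high-risk, not banned.
    return "high-risk (full obligations from August 2026)"

# The vendor test from earlier, run through the triage:
print(classify_feature(True, True))    # voice-based agent mood detection
print(classify_feature(False, True))   # written ticket sentiment scoring
print(classify_feature(True, False))   # customer-side voice sentiment
```

If a vendor's answers to your written questions can't populate those two booleans, that itself tells you where their product stands.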
Wearables, Webcams, and the Hidden Surveillance You Bought by Accident
The ban reaches much further than the call center.
Any workplace wearable that infers stress, focus, or emotional state from heart rate variability, galvanic skin response, or brain activity is, if used to monitor employees, a prohibited system. Some of the more ambitious frontline workforce experiments running right now are sailing straight at this legal wall.
Collaboration platforms are exposed too, and this is where the law actually makes sense for once.
Meeting transcripts? Completely fine. AI-generated summaries of what was said in a call? Fine. Action items, decisions captured, follow-ups flagged, searchable archives of your team's standups? All fine. And if you understand why, you understand the entire logic of the AI Act.
Here it is in one sentence. The European Union didn't ban AI in the workplace. It banned one very specific thing, which is the inference of a person's internal emotional state from their biometric data. That's it. That's the whole prohibition. Everything else survives.
A meeting transcript doesn't infer anything about anyone's feelings. It takes audio and converts it into text. It captures words, not emotions. It records what was said, not how the speaker felt when saying it. A transcript of a product review meeting contains the product decisions, not a psychological profile of the people making them. That's a legitimate productivity tool. That's what note-taking software is supposed to do, and the AI Act has zero problem with it.
"A transcript captures words. An emotion recognition system captures feelings. One is a productivity tool. The other is workplace surveillance dressed up to look like a productivity tool. The EU AI Act is perfectly capable of telling the difference. Your vendor should be too."
The same logic runs through the rest of the stack. Text-only sentiment analysis, scanning written Slack messages or support tickets for positive and negative tone, is not a prohibited system. It doesn't use biometric data. It processes text. AI that summarizes an email thread, drafts a reply, flags urgent messages, or pulls out key themes from written customer feedback is all lawful. None of it reads a human body to infer a human feeling.
Where the line gets crossed is the moment a tool adds a layer on top that analyzes the speaker's voice to decide they sounded stressed, or reads their face on video to score how engaged they looked, or tracks their keystroke rhythm to infer frustration. Now you've left the world of productivity software and entered the world of Article 5(1)(f). One feature is a meeting assistant. The other is a surveillance system wearing a meeting assistant's costume.
This is why a handful of enterprise vendors have very quietly removed sentiment and engagement overlays from European builds over the past 18 months, while leaving transcription and summarization features entirely alone. They know exactly where the line is. The question is whether your vendor has actually drawn it, or is still hoping nobody notices that their "engagement analytics" module does precisely what Brussels has forbidden.
Some are betting that what they sell is "expression detection" rather than "emotion inference" and hoping regulators split the hair in their favor. The Commission's guidelines explicitly instruct regulators to interpret the ban broadly, not narrowly. I wouldn't want to be the General Counsel making that argument in front of CNIL.
"This isn't a small technical provision. It's the European Union telling an entire software industry that one of its favorite product pitches is a human rights violation. The vendors still pretending otherwise are running out of road."
The Fines That Could Wipe Out a Quarter of Global Revenue
Three penalty tiers apply under the AI Act.
Breach of a prohibited practice, including workplace emotion recognition: up to €35 million or 7% of global annual turnover, whichever is higher.
Breach of high-risk AI obligations: up to €15 million or 3% of global turnover.
Providing incorrect information to regulators: up to €7.5 million or 1%.
And here's the kicker. Because emotion recognition typically processes biometric data, which is special category data under GDPR, most violations may also trigger a parallel GDPR finding. Fines can theoretically stack to 11% of global turnover. For a large platform vendor, that's a quarter of a year's revenue, gone.
The ICO's decision against Serco Leisure in 2024, ordering the company to stop using facial and fingerprint scanning for employee attendance across 38 sites, gives you a fair indication of the appetite data protection authorities have developed for workplace biometric cases. And that was before the AI Act even came into force.
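To make the exposure concrete, here is the arithmetic as a sketch. The turnover figure is invented, and the 11% worst case assumes the AI Act's 7% prohibited-practice tier stacking with a GDPR fine of up to 4% of turnover, which is the article's theoretical ceiling, not a prediction of what any regulator would actually levy:

```python
def max_ai_act_fine(global_turnover_eur: float) -> float:
    """Prohibited-practice tier: €35M or 7% of global annual turnover,
    whichever is higher."""
    return max(35_000_000, 0.07 * global_turnover_eur)

def worst_case_stacked(global_turnover_eur: float) -> float:
    """Theoretical ceiling: 7% AI Act tier plus a parallel GDPR fine
    of up to 4% of turnover, for 11% total."""
    return 0.07 * global_turnover_eur + 0.04 * global_turnover_eur

turnover = 2_000_000_000  # hypothetical €2B platform vendor
print(f"AI Act tier:       €{max_ai_act_fine(turnover):,.0f}")
print(f"Stacked worst case: €{worst_case_stacked(turnover):,.0f}")
```

Note that the €35 million floor only bites for vendors with turnover below €500 million; above that, the percentage is what matters.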
What You Need to Do Before Your Next Board Meeting
If your organization runs UC, CX, or employee experience software across any European operation, here's your week one checklist.
One. Ask every single vendor, in writing, whether their product infers employee emotional state from voice, facial, physiological, or behavioral biometric data. Direct question. Written answer. No waffle.
Two. Ask whether these features are enabled by default in European deployments and whether they can be disabled at tenant level. If they can't be disabled, that's a red flag.
Three. Ask for the vendor's written compliance assessment against Article 5(1)(f) of the AI Act. If they shrug, you now know the risk sits with you.
Four. Separate customer-side and agent-side analytics in contract and configuration. Different legal worlds. Don't let the vendor collapse them in the sales pitch.
Five. Audit your wearables and workforce management stack urgently. The frontline tech layer has grown fast and quietly, and some of it is inferring far more about worker internal states than buyers realized at point of sale.
Six. Loop in your works council or employee representatives now. Consultation before deployment is what regulators expect, and it's the only posture that survives scrutiny when the first enforcement case lands.
The Reckoning Is Coming. The Only Question Is Who Gets Made an Example Of
Here's my honest read on where this is going.
There will be a first major enforcement case. It will happen this year. It will almost certainly involve a vendor most UC Today readers recognize. And when it lands, every buyer who signed a contract without asking the hard questions will be dragged into a procurement review they could have avoided with one email and one written answer.
The vendors who built their product decks around emotion AI are, as of this year, in a very quiet panic. The regulators are, politely, sharpening their tools. The buyers who signed the contracts are, by and large, entirely unaware of it.
You don't want to be the one who finds out the hard way. Ask the questions this week. Get the answers in writing. Because when the fine lands, "my vendor didn't tell me" won't be a defense.
It'll be Exhibit A.
Sources: European Commission, Guidelines on Prohibited AI Practices (February 2025); EU AI Act Article 5(1)(f) and Recital 44; ICO Serco Leisure enforcement (2024); OECD, Algorithmic Management in the Workplace (2025); IAPP, Biometrics in the EU (2025).

