
By Gagan Malik
The story goes like this: brilliant engineers wanted to connect humanity. They wanted a digital town square where strangers could talk, share ideas, and build something together. Good intentions. Messy outcomes. Very tragic. Very avoidable.
That is not the full story. The full story is that we ran the experiments, read the dashboards, and kept shipping the version that made people angrier and more contemptuous because that version made people stay. We did not stumble into toxic platforms. We optimised into them.
Paul Ekman spent fifty years mapping human facial expressions across cultures and landed on seven universal emotions. Six of them are things that happen to you: fear, sadness, anger, disgust, happiness, surprise. Contempt is structurally different. It is the only emotion that appears on just one side of the face. A half-sneer. It does not signal what you are feeling. It signals a verdict about someone else: 'I am above you, morally and socially.' [paulekman]
Every major social platform gave that specific emotion a one-click broadcast mechanism. The quote-post. The ratio. The public reply count visible to everyone watching. Whether that was deliberate product strategy or an accidental consequence of building for virality is genuinely debated. What is not debated is the outcome: we took the one emotion in human psychology designed for ranking people and handed it a billion-user megaphone.
Ekman's 1992 foundational paper established that contempt is an approach emotion: it drives you toward a target rather than away from it. In plain English, contempt makes you click, reply, and come back tomorrow. I know this personally. In early 2022 I opened Twitter to follow a breaking news story, got pulled into a contempt thread I had not searched for, and looked up forty-five minutes later having produced nothing, decided nothing, and burned roughly $200 (about £160) in productive thinking time. The decision was to 'just check quickly.' The surprise was how fast the algorithm found something to make me sneer at. The consequence was a wasted afternoon and a slightly lower opinion of people I had never met. [gruberpeplab]
Twitter's own internal Health Metrics research, published in 2021, confirmed the algorithm was disproportionately amplifying politically contemptuous content, not because users searched for it, but because it kept them scrolling. The engineers saw the data. The features stayed live. That is not an accident of scale. That is a decision made on a Tuesday afternoon.
Emotion recognition AI, the technology quietly running inside automated hiring platforms and content moderation pipelines, is built predominantly on Ekman's emotion taxonomy. A 2012 study in PNAS by Jack et al. found that facial expressions are not as universal as Ekman claimed: East Asian participants categorised significantly fewer distinct expressions than Western ones, meaning the scientific foundation these products stand on is genuinely contested. The products shipped anyway, into consequential decisions about employment, credit, and access. [pnas]
Here is the inference, flagged as inference: if these systems are trained on datasets scraped from platforms that spent fifteen years algorithmically promoting contempt-coded content, it is reasonable to expect the models will encode contemptuous expressions as markers of normal human engagement. That has not been proven at scale. The hiring platform scanning your face during a video interview is not waiting for the proof either.
During China's Cultural Revolution, public struggle sessions did something pure punishment never could: they made spectators complicit. You did not just watch someone get denounced. You were required to join, or your silence marked you as a sympathiser. Participation was the mechanism of control, not the violence itself.
This is the dimension social media adds that simple 'public shaming' framings miss. Algorithmic amplification does not just show you contempt. It ranks your relevance based on what you engage with, which means staying neutral quietly buries you. The platform does not force you to join the pile-on. It just makes neutrality expensive.
The honest counter-argument is this: nobody planned it. Engineers built for connection, scale did the rest, and you cannot hold a product team responsible for the aggregate ugliness of a species with a documented taste for public punishment. Critics including James Russell have questioned whether Ekman's emotion categories are as cleanly universal as his framework claims, which makes the entire 'we designed contempt in' argument harder to sustain if the underlying science is contested. [communicationcache]
That objection has real weight, right up to the point in 2021 when Facebook's own internal research, revealed by whistleblower Frances Haugen, showed the company knew its algorithms were amplifying divisive and contemptuous content and chose growth targets over redesign. Ignorance is a defence. A board presentation is not.
Contempt is the one emotion research consistently finds most correlated with irreparable breakdown, in marriages, in organisations, and in democracies, because unlike anger it does not leave a door open for repair. We have scaled that emotion across every surface humans use to form opinions, hire people, and raise children. If the AI systems being built on this data are never retrained on cleaner signal, the next time an algorithm silently decides you are not a cultural fit for a job you were qualified for, it will be making that call with a model trained on fifteen years of people sneering at each other for engagement points. [thinkingfeelingbeing]