Articles · 18 March 2026

Someone Designed That Button

By Gagan Malik

It was the summer of 2017. I had just walked out of a well-paid full-time job at the UK's largest insurer to try my luck as a free agent. No safety net, no guaranteed income, just the conviction that I was good enough to make it work. Six weeks in, I accepted a consulting brief from a payday loan provider. The money was significant. The timing was perfect. I said yes before I finished reading the brief.

Three weeks later I had built something I was genuinely proud of. A loan application flow that showed the APR plainly, removed the pre-ticked consent boxes, and treated the person on the other end of the form as an adult capable of understanding what they were signing. It tested well in user research. Participants said they felt informed, not rushed. Legal approved it. By every professional and ethical measure I could apply, it was better design. I took it to the group CTO expecting a conversation about rollout.

He looked at the screen, nodded slowly, and said: "It's good. But can we just reskin the current one?"

The current one had a deliberately obscured APR disclosure buried in grey text below the fold. It had pre-ticked consent boxes that required active effort to untick. It had a progress bar that moved faster in the early stages to create false momentum. Every pattern in it had a name in the design community's own literature: dark patterns, identified and catalogued by researcher Harry Brignull from 2010 onwards, each one a documented method for steering users toward decisions that served the business over the person making them. I had spent three weeks removing them. He wanted them back in different colours.
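
That last pattern is worth making concrete, because it is so small. A progress bar creates false momentum when the number it displays is decoupled from actual completion. The sketch below is a hypothetical illustration in TypeScript, not the lender's code; displayedProgress and its square-root curve are inventions for the example:

```typescript
// Illustrative sketch of the "false momentum" pattern described above.
// Hypothetical code, not the lender's implementation: the displayed
// progress is a concave function of real progress, so early steps
// appear to carry the user much further than they actually have.

function displayedProgress(step: number, totalSteps: number): number {
  const actual = step / totalSteps;   // honest completion ratio, 0..1
  const inflated = Math.sqrt(actual); // concave curve front-loads apparent progress
  return Math.round(inflated * 100);  // percentage shown to the user
}

console.log(displayedProgress(1, 8)); // 35 (the honest figure is 12.5%)
console.log(displayedProgress(4, 8)); // 71 (honest: 50%)
console.log(displayedProgress(8, 8)); // 100
```

Three lines of arithmetic, and the first step of eight reads as a third of the journey.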

I declined and exited the project.

The company, owned by a Russian billionaire whose portfolio of near-identical operations stretched across several markets, continued without me. A designer replaced me. The dark patterns shipped. The same obscured APR, the same pre-ticked consent boxes, dressed in different colours, went live later that year. I left money on the table that would have cleared outstanding debt and seeded a retirement fund I am still rebuilding. My exit produced zero change in outcomes for the people who signed those forms. I have spent eight years deciding whether that makes my decision honourable or merely convenient.

Don Norman argues, in The Design of Everyday Things, that design failure is almost never individual failure. It is systemic: absent governance, poorly specified briefs, and organisations that punish friction-raising and reward shipping. He is right that the design profession has no licensing board, no professional oath, no accountability mechanism equivalent to anything in medicine, engineering, or law. The reason this has persisted for over a century is structural, not moral: design harm is almost always attributed to the system that deploys the design, not the design itself. The drone manufacturer gets investigated. The human factors team does not. The payday lender gets regulated. The UX consultancy that built the journey does not. Until design harm has a visible attribution pathway, the absence of governance is a predictable structural outcome, not a conspiracy.

That argument is correct. It is also precisely what the people who own those systems want designers to believe.

Frances Haugen handed the Wall Street Journal the internal Facebook research in 2021 showing the company knew its algorithm amplified divisive, emotionally harmful content, measured the damage internally, and continued the optimisation anyway. The institutional argument collapses at that document. Because the moment a human being inside the institution raises their hand and is told to sit down, it stops being a systemic failure. It becomes a decision. Made in a room. By a named person. Who chose a growth metric over a named harm with full knowledge of the cost. My CTO was not a monster. He was a man under pressure, optimising for a number. But when I showed him the harm built into the existing journey and he said "reskin it," he made a choice. So did I. The difference between us is not intelligence or intent. It is which consequence each of us was willing to absorb personally.

The immune system is useful here. When it functions correctly, it identifies and neutralises threats. In autoimmune disease, it destroys healthy tissue with complete operational efficiency. The mechanism works exactly as designed. The targeting data is wrong. The destruction is not a malfunction. It is a fully operational system firing at the wrong coordinates.

Neil deGrasse Tyson, in his StarTalk essay "A Scientist's View of War," published in March 2026, traces every weapon on the kill-ratio curve from fists to hydrogen bomb and shows that each step does not merely scale damage — it degrades the targeting data by removing the human being from the consequence. The drone controller, designed with the ergonomic precision Henry Dreyfuss applied to a telephone handset in his 1955 book Designing for People, delivers force to a coordinate on a screen. The social media feed, optimised for engagement with the precision of a Formula One pit crew, does not merely amplify existing beliefs — it generates the epistemic conditions under which unfalsifiable beliefs spread faster than falsifiable ones. A 2018 MIT study published in Science found that false news reached people about six times faster than accurate information on Twitter, and that humans, not bots, were primarily responsible for the spread. Outrage is more engaging than correction because it cannot be resolved by data; that, Tyson argues, is also the condition under which people become willing to die for abstract causes. The algorithm did not intend to build a radicalisation pipeline. It intended to maximise daily active users. The pipeline was a known side effect, internally documented, and shipped anyway.

On 26 September 1983, Lt. Col. Stanislav Petrov sat at a console in Serpukhov-15 that displayed an unambiguous alert: five incoming U.S. ICBMs. Protocol was clear. The interface was functioning correctly. What the interface could not communicate, by design, was that it was wrong. The Oko early-warning system was not showing Petrov a binary. It was showing him a probabilistic reading that its display had collapsed into a command: report an attack, or do not. The underlying data was something closer to a high-confidence atmospheric artifact — sunlight refracting through high-altitude clouds at a specific solar angle, a known failure mode documented in the system's own error records. A system that surfaces its own uncertainty to the operator is a different dashboard from one that resolves that uncertainty into an order. One of them ends the world less frequently. Petrov overrode the command with his own uncertainty estimate and was right. The Soviet Union reprimanded him for improper paperwork. He died in 2017 in a small flat outside Moscow, undecorated. The system that nearly ended everything has been upgraded several times since. No civilian UX professional has reviewed it.
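
The gap between those two dashboards is small enough to write down. The sketch below is a hypothetical illustration in TypeScript; Detection, its fields, and the 60% figure are inventions for the example, since the Oko system's actual data model is not public:

```typescript
// Two renderings of the same detection. All names, fields, and values
// here are hypothetical illustrations, not the Oko system's data model.

interface Detection {
  tracks: number;              // apparent incoming missiles
  confidence: number;          // sensor's own probability estimate, 0..1
  knownFailureModes: string[]; // documented false-positive conditions
}

// Dashboard A: resolves the probabilistic reading into a command.
function collapsed(d: Detection): string {
  return d.tracks > 0 ? "MISSILE ATTACK: REPORT AND RESPOND" : "NO THREAT";
}

// Dashboard B: surfaces the uncertainty and lets the operator weigh it.
function surfaced(d: Detection): string {
  const modes = d.knownFailureModes.join("; ") || "none on record";
  return `${d.tracks} tracks at ${Math.round(d.confidence * 100)}% confidence. ` +
         `Known false-positive conditions: ${modes}.`;
}

const reading: Detection = {
  tracks: 5,
  confidence: 0.6,
  knownFailureModes: ["sunlight refracted through high-altitude cloud at low solar angle"],
};

console.log(collapsed(reading)); // an order
console.log(surfaced(reading));  // an estimate, with its known failure modes attached
```

Same sensor, same reading. Only the second rendering leaves room for a Petrov.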

I grew up in Delhi in the 1990s watching India and Pakistan aim nuclear weapons at each other over Kashmir. The sanctions that followed India's 1998 Pokhran tests raised the price of goods that had nothing to do with geopolitics for families that had contributed nothing to the decision. The logic of the dashboards did not follow the decision-makers home. It followed the cost of living for everyone else. R.K. Laxman's Common Man, who stood at the edge of power's decisions in the Times of India for sixty years in his checked jacket and bare feet, understood this without a policy brief. In a cartoon from the Iraq War build-up in late 2002, he stands before a skyline of U.S. missiles and is told: "Nothing to feel nervous. These are weapons of JUST destruction, not MASS destruction". He says nothing. He never does. He is the user the system was built around and never built for.

R.K. Laxman, Times of India. "Nothing to feel nervous. These are weapons of JUST destruction, not MASS destruction!"

Sara Shayesteh was five years old. She attended Shajareh Tayyebeh primary school in Minab, Hormozgan province, Iran. She is number 30 on a list of 61 names verified by Middle East Eye from gymnastics federation records, a handwritten list, and the Tasnim news agency. On Saturday 7 March 2026, she was among at least 165 people killed in what Middle East Eye reported as a strike on the school. A second missile hit the prayer hall where the school principal had moved surviving children to shelter, after telephoning their parents to come and collect them.

The parents who came toward the building were among the dead.

Somewhere, a targeting system held that school's coordinates. Someone specified the confirmation interface, the strike authorisation flow, the visual grammar of a proceed button. Someone, somewhere in that procurement chain, wrote a brief. It named the objective, the user, the desired outcome. It described the system with precision.

It did not name her. That omission is not an accident of process. It is the design.
