We’ve all had that sinking feeling: realising you’ve signed up for something online that you never meant to. Maybe it was a barrage of marketing spam you accepted by failing to tick a tiny box you never saw. Or perhaps you got to the last step of the checkout process on a shopping site, only to discover extra charges.
These little design tricks have a name: dark patterns. They’re the subtle ploys many digital companies use to manipulate you into doing something, such as disclosing personal or financial details.
Often, designers exploit loopholes in human psychology. They might swap colours such as red and green, wrongfooting assumptions about what each usually signals, or make “cancel” options less conspicuous by rendering them in grey or at a smaller size.
Harry Brignull, a user-experience consultant, has created a website listing 11 dark pattern types to watch out for. A “roach motel” is when the design makes it simple to sign up but hard to cancel (for example, a subscription); “disguised ads” masquerade as content that isn’t trying to sell you something; and “privacy Zuckering” — named after Facebook CEO Mark Zuckerberg — is the trick of getting you to overshare data.
Brignull’s site has a Hall of Shame filled with examples of trickery — such as when, in 2016, Microsoft prompted users of older versions of Windows to upgrade to Windows 10. Clicking the “x” button, which usually closes the dialogue box, actually downloaded the software — a classic “bait-and-switch” in Brignull’s taxonomy.
Last month, investigative journalism site ProPublica unearthed another example. It revealed how Intuit, an accounting software company, in effect tricks Americans into paying to file their taxes each year, even though they qualify for a fully free service.
These deceptive practices serve to boost revenue: thousands of hard-to-cancel subscriptions generate a lot of income. But the ultimate aim is to lock in more users.
“[Tech companies] recognise the irrational side of human psychology and exploit that, persuasively designing it to their own end, which is attention,” says James Williams, a researcher at Oxford University who previously worked for Google and now studies questions of free will in the digital world. “At the end of the day, that’s their business model.”
To consumers, companies such as YouTube, Google and Twitter provide a service — be it entertainment or information. But, as Williams points out, advertising is what they actually sell. So there is an incentive to resort to manipulation — including dark patterns — to boost audience engagement and, through that, the amount advertisers will pay to reach all those eyeballs. “Whole forms of media are designed according to the incentive structures and logic of advertising,” Williams says.
No segment of the audience is exempt from this logic. Last year, Jack Poulson, a computational scientist, was asked to work on a project to improve YouTube recommendations based on conversational queries. The team knew that adults generally use search keywords that computers understand, but that children use natural language. So the team was given a data set of searches done by children to train a recommendation model on.
“The whole point of modelling children better is to manipulate them better through advertising,” Poulson tells me. “Am I OK with children being manipulated for some unaccountable business’s purposes? There are all kinds of fraudulent ads that Google makes a lot of money from selling… you’re going to obviously lead to more cases of children being [targeted] with fraudulent ads.” (Poulson left Google last August in protest over its China search engine.)
So the next time you discover unexpected charges on your card for a “free trial” you thought you’d cancelled, or click on a news story that’s really an advert, try not to blame yourself. Our human brains are fallible, and tech companies are well aware of their quirks. But being wise to their ruses — and motives — is the first line of defence.
Madhumita Murgia is the FT’s European technology correspondent