Every day, you use apps and services that were carefully crafted by teams of professional designers to deliver the best user experience possible. At least, that’s the idea. However, if you’ve ever found it easier to sign up for an account than it is to cancel it, you’ve stumbled onto a dark pattern. And over the next several weeks, WIRED is going to dissect common examples across online shopping, social media, search, and more.
The term “dark patterns” was coined by UX specialist Harry Brignull to describe the ways in which software can subtly trick users into doing things they didn’t mean to do, or discourage behavior that’s bad for the company. When you want to unsubscribe from a mailing list, but the “Unsubscribe” button is tiny, low-contrast, and buried in paragraphs of text at the bottom of an email, it’s a strong sign the company is putting up subtle roadblocks between you and cancellation.
Dark Patterns is made possible by the Omidyar Network. All WIRED content is editorially independent and produced by our journalists.
The button to buy an item that’s on sale, on the other hand, is large, bright, and at the top of the email. There’s no hiding that one.
Not all dark patterns are designed maliciously, and some UX designers might not even be aware that they’ve built a system that tricks users. In many cases, designers might just be doing what works. But being cognizant of how app design plays on human biases is key to avoiding falling victim to dark patterns.
UX Design Leads Your Behavior, and That’s (Usually) a Good Thing
Websites and apps rely on design language to guide users toward the tasks they want to accomplish. A red circle lets you know there’s a notification that needs your attention. Click on an X icon to close whatever you’re working on. If a user can’t immediately understand how an app works, they’re likely to get frustrated and stop using it. So, to give users a positive experience, UX designers build their software to be as intuitive as possible.
“If you don’t feel successful using a tech tool, you won’t continue using it,” says behavior scientist and author of Tiny Habits BJ Fogg. “Look at all the apps on your phone—all those apps you used once and never again. They failed to move you forward to time number two, much less create a habit. Those apps didn’t help you feel successful.”
Take an app like Duolingo, for example. It allows you to sign in via services like Google or Facebook, and after a few basic setup questions, immediately drops you into a lesson. Compare this to an app like Rosetta Stone, which requires a multistep process to create an account, then asks users to pick from a selection of lesson plans and sign up for a trial account, which includes providing payment information before they’ve even tried the app.
“Duolingo does a pretty good job of helping people feel successful. Without that, I would predict Duolingo wouldn’t be the number one language-learning platform,” says Fogg, who says he uses the app every day to learn Hawaiian.
Since every tap is a chance for a user to get frustrated and leave, developers have an interest in learning about, and manipulating, the micro-decisions that determine whether users end up enjoying or hating an app.
When Design Goes Bad
The trouble comes when the company that makes an app or site has different priorities than the person using it. For example, when you sign up for a monthly subscription service, most companies will make that process easy. However, if you want to cancel, the company might put a couple of speed bumps in the way to discourage you. Sometimes this can be subtle, like making the “Never mind, I’d like to stay” button bright and colorful while making the “Yes, I really want to cancel, let’s get on with it” button more subtle.
This might seem like a minor thing. Most users will probably figure out the correct button to click. But if even a small number of users don’t pay attention and keep their subscriptions when they don’t mean to, that can mean money for the company.
“Lots of companies will make it hard for people to leave,” says Brignull. “They are going to get around to it eventually, but if they might stay for an extra 10 percent of the time, or 20 percent, the accounts might live just a little bit longer. And if you’re doing that en masse for hundreds of thousands of people, that translates to enormous amounts of money, for people who are going to leave anyway.”
In some extreme cases, the hurdles to performing a behavior that’s bad for the company can be formidable. If you want to close your Amazon account, for example, you have to contact Amazon directly and ask the company to do it. You can’t do it yourself. And you’ll find instructions at the bottom of a Help page telling you all the reasons you shouldn’t.
If you proceed, Amazon will ask you to fill out a form that sends an email asking to close your account. The company then replies with an automated email that, for a second time, explains why you shouldn’t do that. If you’re really, really sure, you can click a link at the bottom of that massive email that takes you to a page where you can send another email to Amazon, confirming that you truly do, in your heart of hearts, want to close your account.
This technique is what Brignull calls a “roach motel.” It’s easy to get in, but a lot harder to get back out. For some services, this might not be by design. A lot of effort goes into making sure that the sign-up process is easy, but polishing the account-closing process isn’t as high a priority.
In other cases, like with Amazon, effort might be put in specifically to make a task harder because, from the company’s perspective, it shouldn’t be easy. A charitable interpretation is that Amazon doesn’t want users accidentally deleting their accounts, because that would cause them to lose content they’ve purchased, so Amazon makes it harder on purpose, for the user’s benefit. But it doesn’t hurt that Amazon also benefits when customers who do want to close their accounts get frustrated and perhaps give up while doing so.
That’s just one example. Brignull has identified a dozen types of dark patterns like the roach motel, all of them listed on his website, Darkpatterns.org. There’s “sneak into basket,” for example, where a retailer slips something into your shopping cart while you’re trying to purchase something else, like a warranty or service plan that you have to opt out of to remove from your order.
You’ve probably also seen what Brignull calls “confirmshaming,” where a site guilts you into opting into something or staying subscribed to a newsletter. Like when a site shows you a photo of a sad puppy before you confirm you want to unsubscribe, or when you’re reading an article and a full-screen pop-up appears, asking for your email address, and your only options are “OK” or “No, I hate reading about interesting things.” Brignull also showcases especially egregious examples of dark patterns on the @darkpatterns Twitter account he operates.
What to Do About Dark Patterns
When dealing with dark patterns, the bad news is that companies have teams of people dedicated to testing and experimenting on what techniques get the most desirable response, and you … are just you. The good news is that education is a powerful tool.
“If you know what cognitive biases are and the kind of tricks that can be used to change your mind to persuade you to do things, then you’re less likely to have them trick you,” Brignull explains.
Brignull also recommends calling out companies publicly. A difficult cancellation experience might help a company make a few bucks in the short run, but if a site is called out for misleading customers, it might take steps to correct the design in order to keep customers happy. “Complaining quite vocally is a very good thing. So don’t complain by email when no one can see it, because you’ll just get fucked off. If you complain in public, then you’re more likely to get a faster and more efficient response,” Brignull says.
Dark patterns are everywhere, and while not every attempt to manipulate a user’s behavior is harmful to a user, it’s always important to be aware that a company’s goals don’t always align with your own. For some companies, if they can trick you into doing something you wouldn’t do otherwise, they will.