What is a dark pattern?
Coined by UX specialist Harry Brignull, dark patterns are deceptive design techniques deliberately built into app and website interfaces to trick people into completing actions they never intended to take. They do this by modifying the choices available and manipulating the flow of information to users. The actions these unethical design practices lure users into always carry some commercial benefit for the businesses behind them. Typical examples include getting users to unknowingly give up their data, sign up for subscriptions and make purchases.
Why do dark patterns exist?
The answer is pretty straightforward: dark patterns exist because they're effective at driving user behaviour. As a result, they've become more prevalent in digital products, as businesses increasingly push users to perform actions that help them hit the metrics used to evaluate business performance, such as revenue, clicks, sign-ups and downloads. Unfortunately, under pressure from above, designers are now integrating these deceptive practices to the detriment of the user experience.
Unethical or Illegal?
Dark patterns exploit people's cognitive biases, getting them to complete an action they don't want to take but that is in the best interest of a business. So we can all agree that they're unethical. However, the legality of dark pattern use in design has always been a bit of a grey area. For years, people have called for a crackdown on this deceptive practice. Yet, aside from the odd example, such as LinkedIn's $13 million lawsuit, companies have been freely using dark patterns to trick users into performing unwanted actions without any repercussions. After years of sitting on the fine line of legality, though, it seems that lawmakers are finally stepping in to regulate dark pattern use.
The most significant move to tackle this deceptive practice occurred earlier this year with an update to the California Consumer Privacy Act (CCPA). Initially passed in 2018, this legislation gave Californians the right “to say no to the sale of personal information”. Unfortunately, in response, companies began using dark patterns to trick users into giving consent. As a result, in this latest amendment, California attorney general Xavier Becerra has added a ban on the use of dark patterns relating to users' right to opt out of the sale of personal information. Although not a blanket ban, it's a stride in the right direction and will pave the way for further regulation of dark patterns in design.
Types of dark patterns
In his initial taxonomy, Harry Brignull identified twelve different dark patterns, which we’ll expand on with examples in this section.
However, before jumping in, a quick mention: a recent study conducted by a team of researchers detailed six attributes used to differentiate dark patterns. Rather than delving into the details in this article, we'll do an in-depth breakdown of the research in the first edition of our monthly newsletter, The Design Insider: Focus.
Dark pattern #1: Misdirection
The design purposefully focuses your attention on one thing to distract your attention from another. (Harry Brignull)
Companies use this to direct an individual's attention towards the option that best serves their business objectives rather than the user's interests. The screenshots above, taken from Wizz Air, are a great example of intentional misdirection in design.
When users are booking a flight, they're directed to this screen to select a seat. The first thing they'll notice is the preselected “recommended seats by Wizz Air” option, which costs an additional £20 and is purposefully designed, with its big pink call to action and a description listing its benefits, to capture the user's attention (Screenshot 1). Entirely focused on the preselected option, most users will assume they have to select their seats now and miss the FREE “choose seats later” option written in plain text at the bottom of the list.
Ultimately, directing users towards the paid option benefits Wizz Air, which, like most budget airlines, relies heavily on add-ons to recoup the costs of its cheap flights. So even when users see and select the free option, they're shown carefully crafted copy with red crosses highlighting the downsides of that choice, discouraging them from proceeding.
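The mechanics of misdirection by default can be sketched in a few lines of code. This is a hypothetical illustration, not Wizz Air's actual implementation: the names, prices and flags are all assumptions for the sake of the example.

```typescript
// Hypothetical sketch of misdirection via defaults: the paid option is
// preselected and visually prominent, while the free option is buried
// in plain text. All names and prices here are made up for illustration.
interface SeatOption {
  label: string;
  pricePence: number;   // price in pence to avoid floating-point money errors
  preselected: boolean; // which option the UI starts on
  prominent: boolean;   // big pink CTA vs plain text link
}

const seatOptions: SeatOption[] = [
  { label: "Recommended seats", pricePence: 2000, preselected: true, prominent: true },
  { label: "Choose seats later (FREE)", pricePence: 0, preselected: false, prominent: false },
];

// A user who simply accepts the defaults ends up on the paid option.
function defaultChoice(options: SeatOption[]): SeatOption {
  return options.find(o => o.preselected) ?? options[0];
}
```

The point of the sketch is that the user never has to actively choose the £20 option; inaction selects it for them, which is exactly what makes the default so powerful.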
Dark pattern #2: Hidden Cost
You get to the last step of the checkout process, only to discover some unexpected charges have appeared. (Harry Brignull)
One of the most frustrating things about online shopping is when you find something you like, add it to the basket and head to the checkout, only to notice additional charges added to the final bill. Yet, surprisingly, despite their detrimental effect on the user experience, companies still use these dark patterns when designing their online stores.
Sports Direct is a perfect example of this. In the first screenshot above, before proceeding to checkout, the tracksuit's price is £49.99. The price is the same in the second screenshot, where users fill in their delivery details, and in the first screenshot below, where users select their delivery method. However, in the second screenshot below, a £4.99 delivery charge, not indicated anywhere earlier, has been added to the final total.
From a user experience perspective, this practice is highly frustrating. So why do companies and designers persist? The logic is that if these charges appeared when you added the item to your basket, you'd have invested little effort and would be more likely to remove it. Adding the costs after you've gone to the effort of entering your address and details increases the likelihood you'll complete the purchase.
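The hidden-cost pattern boils down to two different totals being shown at different stages of the journey. The sketch below is a hypothetical illustration of that logic, with an assumed flat £4.99 fee matching the Sports Direct example; it is not any retailer's real code.

```typescript
// Hypothetical sketch of the hidden-cost pattern: the delivery charge is
// only added at the final checkout step, never shown in the basket.
// Prices are in pence to keep money arithmetic exact.
const DELIVERY_CHARGE_PENCE = 499; // assumption: flat £4.99 fee

function basketTotal(itemPricesPence: number[]): number {
  // What the user sees while browsing and entering delivery details.
  return itemPricesPence.reduce((sum, p) => sum + p, 0);
}

function finalTotal(itemPricesPence: number[]): number {
  // The surprise at the last step: delivery silently appended.
  return basketTotal(itemPricesPence) + DELIVERY_CHARGE_PENCE;
}
```

For a £49.99 tracksuit, `basketTotal` reports 4999 pence at every step until the end, when `finalTotal` quietly becomes 5498 pence. A transparent design would show one consistent total, or at least flag the delivery fee from the start.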
Dark patterns #3: Bait and Switch
You set out to do one thing, but a different, undesirable thing happens instead. (Harry Brignull)
Imagine flicking the light switch in the bathroom and, instead of turning the light on, it flushed the toilet. You'd first question your sanity, then jump on the phone to get someone to come and fix the problem. In UX design, this dark pattern, where an interaction yields an unexpected outcome, is known as the bait and switch.
Unlike the others, this dark pattern isn't very prevalent in digital products. The most well-known example was the Windows 10 update pop-up, where users found that clicking the “X” would start downloading the update instead of dismissing it. After considerable backlash, Microsoft quickly reverted the behaviour.
Dark patterns #4: Sneak into basket
You attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page. (Harry Brignull)
Although now illegal in the UK and EU, there was a time when eCommerce platforms would sneak everything, from items to subscriptions, into our baskets without hesitation. Those who didn't notice got a nasty surprise on their bank statements, while those who caught them reacted with anger.
Sports Direct was one of the last perpetrators to use this dark pattern before it became illegal. As you can see from the screenshot above, they'd sneak their magazine and a mug into the basket right before checkout, and users had to go back to the bag to remove these items.
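The trick here is an opt-out default: extras are added unless the user actively removes them. The sketch below is a hypothetical illustration of that mechanism, with made-up item names and prices loosely modelled on the Sports Direct example.

```typescript
// Hypothetical sketch of sneak-into-basket: extras default to opted-in,
// so users who don't untick a checkbox pay for items they never chose.
interface Extra {
  name: string;
  pricePence: number;
  optedIn: boolean; // defaults to true in this pattern — the dark part
}

const extras: Extra[] = [
  { name: "Magazine", pricePence: 100, optedIn: true },
  { name: "Mug", pricePence: 299, optedIn: true },
];

// Extras whose checkbox was left ticked ride along into the order.
function checkoutItems(chosen: string[], availableExtras: Extra[]): string[] {
  return [...chosen, ...availableExtras.filter(e => e.optedIn).map(e => e.name)];
}
```

A user who adds only a tracksuit and skims past the checkboxes ends up checking out three items. The ethical version of the same code simply flips the default to `optedIn: false`.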
Dark patterns #5: Trick Questions
While filling in a form you respond to a question that tricks you into giving an answer you didn't intend. When glanced upon quickly the question appears to ask one thing, but when read carefully it asks another thing entirely. (Harry Brignull)
Forms are a vital tool for companies to collect user data for communication and marketing purposes. So it’s no surprise that they use confusing language, design and text structure to craft questions that trick users into giving up their details.
The example above is a consent form from Yahoo's website. Where most would typically ask users to subscribe, Yahoo asks if they'd like to “unsubscribe from the daily Yahoo”, with a big highlighted CTA that reads “No Cancel”. Of course, most of us skim through these forms, and the designers have deliberately used this dark pattern knowing we'd assume the prominent CTA would unsubscribe us. Instead, we stay subscribed, while the CTAs we actually need to click for the desired outcome remain in plain text underneath.
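The logic of the trick is a double negative: because the question is phrased as “unsubscribe?”, the prominent “No” button preserves the subscription. A hypothetical sketch of that inverted logic, not Yahoo's actual code:

```typescript
// Hypothetical sketch of a trick question built on a double negative.
// The form asks "Would you like to unsubscribe?" and the big highlighted
// CTA reads "No Cancel" — so clicking the prominent button means staying
// subscribed, the opposite of what a skimming user expects.
function isSubscribedAfter(clickedProminentCta: boolean): boolean {
  // Clicking the prominent CTA declines the unsubscribe request.
  const declinedUnsubscribe = clickedProminentCta;
  return declinedUnsubscribe;
}
```

A straightforward form would phrase the question positively, so the prominent action and the user's likely intent point the same way.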
Dark patterns #6: Confirmshaming
The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance. (Harry Brignull)
Essentially a digital form of guilt-tripping, this dark pattern uses creatively crafted copy to shame users into confirming the desired business objective, such as signing up for a newsletter or mailing list.
The example above is a popup from an eCommerce website prompting users to sign up for a 15% discount. Instead of the opt-out message reading something like “I'm not interested”, it guilt-trips users by saying, “No thanks, I'm not into savings”, a statement that applies to pretty much no one.
Dark patterns #7: Friend spam
The product asks for your email or social media permissions under the pretence it will be used for a desirable outcome (e.g. finding friends), but then spams all your contacts in a message that claims to be from you. (Harry Brignull)
Friend spam is a dark pattern, typically found during sign-up, in which a platform tricks users into giving access to their mailing or contact lists, then emails those contacts prompting them to sign up for and use the service. The result: your entire contact list starts getting spammed with messages that appear to have been written by you, and you may receive some angry and confused replies.
LinkedIn's “Add to network” feature was a perfect example of friend spam. It seamlessly reached out to a user's contacts inviting them to join, then sent follow-up email messages pretending to be from the user, without their permission. The result was a $13 million lawsuit for using its users' contact lists for its own gain.
Dark patterns #8: Forced Continuity
When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases this is made even worse by making it difficult to cancel the membership. (Harry Brignull)
Signing up for a free trial is super easy. So when you forget to cancel and are moved onto a paid subscription without any notice, it can burn to see a hefty charge on your bank statement. And once you're on the premium plan, cancelling can be quite the challenge.
The Telegraph newspaper's cancellation policy is a perfect example of this dark pattern in action. All subscribers can sign up for a free trial nice and easily, but once it's up, only international subscribers can cancel online in a couple of clicks. UK subscribers face the inconvenience of calling a premium hotline within working hours to do the same.
Dark patterns #9: Disguised ads
Adverts that are disguised as other kinds of content or navigation, in order to get you to click on them. (Harry Brignull)
Some disguised ads are obvious, while others are tough to detect, tricking us into clicking them in the expectation of some native functionality, only for us to quickly realise they're redirecting us to a third-party website. Whatever form they take, disguised ads are downright frustrating. Yet despite their negative impact on the user experience, you'll find them widely used on websites for the all-important ad revenue they bring.
Free live streaming and movie websites use disguised ads like nobody's business. In the example above, the text says, “Click here to close and play”. However, when clicked, instead of closing and revealing the player, it redirects us to a third-party website, in this case Coinbase.
Dark patterns #10: Roach Motel
The design makes it very easy for you to get into a certain situation, but then makes it hard for you to get out of it. (Harry Brignull)
A roach motel is when an app's or website's design makes completing an action, such as signing up for a subscription, super easy, but opting out or unsubscribing incredibly difficult.
Adobe Creative Cloud subscriptions are an excellent example of this. As highlighted in the screenshot above, signing up is a super easy two-step process, whereas cancellation (screenshots below) is a four-step process:
- Feedback: You have to give a reason why you're cancelling. Once you do, a pop-up appears asking if you'd like to change plans. The “Change my plan” CTA is itself another dark pattern example, as it's highlighted to stand out compared to the “No Thanks” button for cancellation.
- Details: If you have an annual plan, you’re shown how much you have left to pay for the year.
- Offers: They hit you with some enticing offers to keep you subscribed, typically two months for free.
- Review: If you reject the offer, you then have to review your position, after which you can cancel. For annual subscriptions, cancellation only completes once you've paid the outstanding amount.
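The asymmetry of the roach motel can be captured as a simple friction measure: count the steps in and the steps out. A hypothetical sketch, with the step names taken from the Adobe example above:

```typescript
// Hypothetical sketch of a roach motel: model each flow as a list of
// steps and compare how much friction leaving carries versus joining.
const signupFlow = ["Choose plan", "Enter payment"];
const cancelFlow = ["Feedback", "Details", "Offers", "Review"];

// A simple asymmetry ratio: how much harder leaving is than joining.
// A ratio well above 1 suggests the exit path is deliberately padded.
function frictionRatio(inFlow: string[], outFlow: string[]): number {
  return outFlow.length / inFlow.length;
}
```

Here the ratio is 2: cancelling takes twice as many steps as subscribing. A symmetric design would let users leave through roughly the same number of steps they took to join.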
Dark pattern #11: Privacy Zuckering
You are tricked into publicly sharing more information about yourself than you really intended to. (Harry Brignull)
Named after Mark Zuckerberg due to Facebook's poor early privacy management, settings and controls, this dark pattern concerns tricking users into oversharing personal information. As data privacy laws have tightened in recent times, privacy Zuckering has become associated with behind-the-scenes data brokerage and the sale of personal data to third parties. Things are getting stricter still, with Apple unveiling its App Tracking Transparency control, which requires apps to tell users which personal data they collect and to request permission to do so.
Dark pattern #12: Price Comparison Prevention
The retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision. (Harry Brignull)
Being able to compare prices between items is an essential aspect of a good consumer experience. Yet many companies make comparison challenging by displaying prices in different units of measurement, or in some cases, not at all.
LinkedIn's pricing page for premium accounts is an excellent example of this dark pattern technique in action. In the first screenshot, you can see all the options clearly displayed. However, as seen in the second screenshot, you have to click on a plan to reveal its price. Even then the price isn't clear, as it says up to £39.99 a month when paid annually; you only see the true monthly cost after moving forward to the checkout screen.
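Mixed billing units only become comparable once every price is normalised to the same unit. A hypothetical sketch of that normalisation, using pence per month as the common unit (the figures are illustrative, not LinkedIn's actual price list):

```typescript
// Hypothetical sketch: prices quoted per month vs per year can't be
// compared directly; normalising to a single unit (pence per month)
// makes the comparison trivial — which is exactly what the pattern
// prevents the user from doing easily.
interface PlanPrice {
  pence: number;           // price in pence as displayed
  per: "month" | "year";   // the billing unit the price is quoted in
}

function pencePerMonth(p: PlanPrice): number {
  return p.per === "year" ? Math.round(p.pence / 12) : p.pence;
}
```

With this helper, a plan quoted at £479.88 per year and one quoted at £39.99 per month both normalise to 3999 pence per month, making it obvious they cost the same.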
To design or not to design? Alternative approaches
Designers have been using dark patterns to trick users into performing unwanted actions for years. Yet it's only recently, fuelled by changes in legislation, scrutiny in the press and coverage on shows such as Netflix's The Social Dilemma, that these deceptive patterns have come to light.
As more people become aware of their existence in the coming years, designers will face increased questions and scrutiny over how they use these techniques when designing digital products. We'll need to put more focus on positive alternatives to dark patterns or risk losing the trust of users, who are now more socially conscious and clued up than ever before.
So, how should we deal with the everlasting dilemma of dark pattern use? The answer is simple: become more transparent, more honest and more upfront with your customers. Below is an example where hidden costs are avoided. The company goes a step further to be very open and transparent about its pricing, giving end-users a realistic breakdown of the price they pay and where their money goes. Such an approach inspires trust and makes users more comfortable, as they can more accurately evaluate whether this is the best option for them.
Whereas previously the focus has been firmly on using dark patterns to hit OKRs, as designers we must now balance business objectives against the impact these deceptive design techniques have on our users' mental, financial and physical wellbeing.