All About Cookies is an independent, advertising-supported website. Some of the offers that appear on this site are from third-party advertisers from which All About Cookies receives compensation. This compensation may impact how and where products appear on this site (including, for example, the order in which they appear).
All About Cookies does not include all financial or credit offers that might be available to consumers nor do we include all companies or all available products. Information is accurate as of the publishing date and has not been provided or endorsed by the advertiser.
The All About Cookies editorial team strives to provide accurate, in-depth information and reviews to help you, our reader, make online privacy decisions with confidence. Here's what you can expect from us:
- All About Cookies makes money when you click the links on our site to some of the products and offers that we mention. These partnerships do not influence our opinions or recommendations. Read more about how we make money.
- Partners are not able to review or request changes to our content except for compliance reasons.
- We aim to make sure everything on our site is up-to-date and accurate as of the publishing date, but we cannot guarantee we haven't missed something. It's your responsibility to double-check all information before making any decision. If you spot something that looks wrong, please let us know.
When we think of user experience (UX) design, images of developers hard at work creating user-friendly websites and applications come to mind. Today's UX designers always put the user's interests first, designing the easiest and most intuitive customer journey possible, right?
Well, not exactly. Today, many developers design interfaces with the company's best interests in mind, using “dark patterns” laden with deceptive language and misleading layouts to boost sales and even steal personal information from inattentive users. Although privacy laws like the GDPR and CCPA prohibit shady UX practices, consumers need to be aware of what they can do to prevent falling victim to such tactics.
12 common types of dark patterns in UX
Is there legislation against dark patterns?
Why dark patterns ultimately don't work
FAQs
Bottom line
Meet the experts
What are dark patterns?
Have you ever been signed up for a subscription you didn't want or had a purchase upgrade added because you failed to notice a default acceptance? How about being taken to undesired ad pages and signed up for a trial when you thought you were only requesting more information? And worse yet, feeling trapped and frustrated when there doesn't appear to be an easy way to cancel?
If so, you most likely have been victimized by a dark pattern — “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice,” according to the most current definition from the California Consumer Privacy Act (CCPA).[1]
In 2010, the U.K.-based UX specialist Harry Brignull created the term "dark pattern" to describe various types of deceptive design practices that use tricky language and architecture to create an easy path for users to do what a company wants — and a significantly more difficult route to do what isn't in the company's best interests.
Today, companies (looking at you, Google, Facebook, Amazon, and LinkedIn) desperately want your data — and will do almost anything to get it. Many use default settings to automatically collect as much information as possible, making it unclear and complicated to change your privacy selections manually.
Company websites also often use flowery language to frame how sharing your information benefits you and even go so far as to warn about how disabling cookies (small text files websites use to collect personal data and track browsing history) will lessen your user experience.
12 common types of dark patterns in UX
Dark patterns typically fall into four categories: preselection, nagging, hiding information, or subverting privacy.
- Preselection uses default selections to choose for the user.
- Nagging bombards the user with constant pop-up notification requests.
- Hiding information employs extensive text and fine print to conceal essential facts.
- Subverting privacy involves using deceptive wording to induce users to disclose more information than intended.
Many dark patterns have unfamiliar or confusing names, but they become clearer when explained below. The first line of defense for consumers regarding dark patterns is recognition, so let's get started.
Some dark patterns have become so normalized that users may not even realize they’re being manipulated.
As William J. Roberts, a partner at Day Pitney LLP and Adjunct Professor of data privacy law at the University of Connecticut School of Law, puts it: “The dark patterns that are most effective or hardest to detect are often the ones that feel ‘normal’ because they have become widespread."
Here are 12 of the most common dark patterns in practice today.
1. Bait and switch
Bait and switch is one of the most common dark pattern themes. The user will believe they're doing one thing, but then they will get a different and unexpected result. For example, clicking to learn more about a product and being taken to an ad page, where you are signed up for a trial that auto-renews unless canceled.
2. Hidden costs
Hidden costs are common and one of the more frustrating themes of dark patterns. You reach the last purchase checkout step only to discover your bill has unexpectedly grown with the silent addition of delivery charges, service fees, or taxes. The surprise and blatant deception of this dark pattern can quickly raise frustration and anger in even the most patient and loyal customer.
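The arithmetic behind this "drip pricing" trick is simple: the advertised price stays low while fees appear only at the last step. The sketch below is a hypothetical Python illustration; the product price and fee names are invented for the example.

```python
# Hypothetical sketch of "drip pricing": the advertised price a shopper
# sees up front vs. the total revealed at the final checkout step.
# All amounts and fee names below are invented for illustration.

ADVERTISED_PRICE = 49.99  # what the product page shows

# Fees silently appended at the last checkout step
LATE_FEES = {
    "service fee": 7.50,
    "delivery charge": 5.99,
    "processing fee": 2.49,
}

def final_total(advertised: float, fees: dict) -> float:
    """Return the real total once every late-added fee is included."""
    return round(advertised + sum(fees.values()), 2)

total = final_total(ADVERTISED_PRICE, LATE_FEES)
markup = round((total / ADVERTISED_PRICE - 1) * 100, 1)
print(f"Advertised: ${ADVERTISED_PRICE}  Final: ${total}  (+{markup}% at checkout)")
```

In this invented example, nearly a third of the final bill never appeared on the product page, which is exactly why the surprise lands at the moment the shopper is least likely to back out.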
3. Friend spam
This dark pattern occurs when a company requests an email or social media permission under a seemingly safe pretense, such as adding friends, but then uses this acceptance for a less innocent purpose, like spamming your contact list with a request that appears to be coming from you.
LinkedIn infamously used this shady tactic. When you signed up for its service, the platform would spam your contact lists with LinkedIn ads that appeared to come from you, a trusted source.
4. Confirm shaming
This dark pattern leverages human psychology, using strategic layout or language that taps into emotions to influence a user's decision. Examples include a cancellation page placing a sad puppy picture next to the cancel button, or a discount decline option worded "No, thank you. I hate saving money."
5. Privacy zuckering
Named after Facebook creator Mark Zuckerberg, this dark pattern employs language deception and design trickery to coax users into sharing more personal information than intended or making it difficult for them to find and follow the steps that restrict their privacy.
Now, companies use less obvious methods of data brokering, often through wordy term acceptance agreements (usually in fine print) written in the hope that you will skim over the critical words, allowing the company to sell your data.
6. Misdirection
This dark pattern employs shiny design trickery to direct the user's attention to one area while diverting it from another, often where an unwanted occurrence takes place.
An example would be clicking an option to cancel a service and arriving at a website detailing all the service's benefits, along with a significant discount offer that is accepted by default unless you uncheck an easily overlooked box. Hopefully, you notice the tiny and usually hidden option to cancel the service.
7. Disguised ads
This dark pattern cleverly camouflages wolf ads in sheep-content clothing. Deceptive UX designers create ads that resemble download buttons for desirable content, but upon clicking, users are bombarded with unwanted ads. Often, these ads attempt to trick users into signing up for a trial that will silently auto-renew unless canceled through a series of inconvenient steps.
8. Forced continuity
The forced continuity dark pattern is common with streaming platforms, such as Hulu and Netflix, that frequently offer free trials. It occurs when a company offers a free trial that automatically renews without providing clear or easy steps for the user to cancel the trial.
Companies often justify auto-renewing as a means to maintain uninterrupted service. It's really just a way for the company to slyly continue billing the customer for a service they may not want.
9. Price comparison prevention
If you're like most customers, you appreciate being able to compare prices when shopping so you can score the best deal. But landing you a better deal isn't in most companies' interest, so some use confusing language and presentation to hide their actual prices.
This dark pattern is common with retailers that use different pricing bundles to make price comparisons of items in the bundle nearly impossible.
10. Basket sneaking
Ecommerce sites commonly use this dark pattern. It occurs when a company unexpectedly adds items, usually upgrades or add-ons, to your shopping cart. If you fail to notice that the company has selected the upgrade or add-on for you and don't uncheck the opt-in selection, an additional charge will be processed for something you didn't want or choose willingly.
Prepare for more aggravation, as the company's designers won't make reversing this unwanted charge obvious or easy for you.
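Under the hood, basket sneaking often amounts to nothing more than an opt-in flag that defaults to "on." Here is a hypothetical Python sketch (item names and prices invented) of how a pre-selected add-on silently inflates a cart total, and how an honest default would behave.

```python
# Hypothetical sketch of "basket sneaking": an add-on lands in the cart
# with its opt-in flag pre-set to True, so it is billed unless the
# shopper notices and unchecks it. Items and prices are invented.

cart = [
    {"name": "concert ticket", "price": 89.00, "selected": True},    # chosen by the user
    {"name": "ticket insurance", "price": 12.00, "selected": True},  # sneaked in, pre-checked
]

def cart_total(items):
    """Sum the price of every item whose opt-in flag is set."""
    return sum(item["price"] for item in items if item["selected"])

print(cart_total(cart))  # 101.0: the shopper pays for the add-on

# An honest design would default the add-on to unselected:
cart[1]["selected"] = False
print(cart_total(cart))  # 89.0: only what the shopper actually chose
```

The entire dark pattern lives in a single default value, which is why regulators and the red-flag list later in this article both single out pre-checked selections.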
11. Trick questions
Language is at play in this dark pattern — and the more confusing, the better. Companies know that most people skim when reading customer agreements, so they use wording that may appear to convey one thing but actually conveys another.
Deceptive wording can be combined with the old default approval trick to coax you into agreeing to more than you intended. For example, a company can ask for a donation you agree to, but you might miss the checkbox committing you to recurring contributions.
12. Roach motel
This dark pattern sounds ultra sinister — anything comparing consumers to roaches must come with extra unpleasantness. Unpleasant indeed, as a roach motel describes shady designer practices that make it easy to navigate where the company wants you to go but hard to find your way out.
This dark pattern is a favorite of ecommerce sites that focus on selling a single item, such as event tickets, which automatically sign you up for a magazine subscription when you make a purchase. You must catch the default acceptance and decline to avoid paying for a subscription you never wanted.
What happens if you miss the default acceptance and you need to cancel that unwanted subscription? You will need to locate the small print directing you to download and print a form to fill out and return via snail mail.
Is there legislation against dark patterns?
Dark patterns employ deceptive language and design practices to mislead users, causing numerous problems for both consumers and companies. Shouldn't they be illegal, then? Well, they can be, depending on where you live and the specific nature of the dark pattern infractions.
Section 5 of the Federal Trade Commission (FTC) Act issues a general umbrella protection by prohibiting "unfair or deceptive acts or practices in or affecting commerce."[2] The FTC enforces other statutes, including the Restore Online Shoppers' Confidence Act (ROSCA), CAN-SPAM Act, and the Children's Online Privacy Protection Act, which many dark patterns could violate. Some dark patterns fall outside the FTC's existing regulations, and it is up to states to individually address the legality surrounding these patterns.
Legal enforcement of dark patterns remains complex, even under comprehensive privacy laws such as the CCPA, CPRA, and GDPR. Roberts explains: “I consider there to be three related challenges with regulating dark patterns under these and similar laws:
- Intent: Under the CPRA, a ‘dark pattern’ means ‘a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.’ The key issue with this definition is the necessity of intent inherent in the terms ‘designed or manipulated’. It’s often a challenge for regulators to be able to prove that an interface was designed with the intent to include a dark pattern – the line between a dark pattern and an effective marketing decision is often unclear.
- Subjectivity: Enforcement often relies on the interpretation of broad terms and concepts, like ‘freely given’ or ‘user-friendly’. These are inherently subjective and can vary across cultures and jurisdictions.
- Context/User Expectations: Both intent and subjectivity can be further complicated by context and user expectations. For example, a consent interface that feels misleading to a tech-savvy adult may be perfectly clear to a novice, or vice versa. This creates difficulty in setting uniform legal standards. Simply put, the people designing the interfaces and the people using the interfaces are often very different, so what is reasonable to one group may not be reasonable to another.”
California leads this effort with the California Consumer Privacy Act (CCPA), the first U.S. law to define dark patterns and ban their use to subvert or impair the process for customers opting out of the sale of their personal information. The California Privacy Rights Act (CPRA), which took effect in 2023, extends the ban on dark patterns by prohibiting their use in obtaining consent related to the processing of personal information and declares, "Consent obtained through the use of dark patterns doesn't constitute consent."[3]
Similarly, the General Data Protection Regulation (GDPR) prohibits dark patterns that trick EU citizens into giving consent, and companies that violate it can face substantial fines.
How do dark patterns impact vulnerable groups?
Dark patterns and their deceptive tactics often target users with less digital experience or confidence, as Roberts emphasizes: “Dark patterns often have a disproportionate impact on vulnerable users because they exploit gaps in comprehension, confidence, or access.
- Older adults may struggle with complex navigation or deceptive UX due to unfamiliarity with newer technologies.
- Children may be especially susceptible to gamified or emotionally manipulative patterns (e.g., use of animation, superheroes).
- Individuals with lower digital literacy (e.g., non-native speakers or those with limited access to digital tools) may be more likely to accept defaults, miss opt-out opportunities, or misunderstand terms.”
How can UX designers ethically balance persuasive design techniques with user autonomy?
Designing ethically doesn’t mean avoiding persuasion altogether — it means respecting the user’s autonomy. Roberts elaborates on how businesses can safeguard against using dark patterns:
“When defending a UX against a dark pattern claim, we find the following to be helpful if done by the company:
- The UX was tested for comprehension, not merely marketing or sales conversion.
- The UX was designed with symmetry – just as easy to cancel or say 'no,' as it is to sign up or say 'yes.'
- The design uses plain language.
- The UX was designed for the target audience in mind – whether it be sophisticated users, the elderly, children, those with limited English proficiency, and so forth.”
Red flags consumers should look for
Roberts details what red flags consumers should look for when trying to spot a dark pattern during an online transaction or sign-up process.
- "Pre-checked boxes or pre-selected toggles
- Use of confusing language that requires you to re-read it again and again to understand it. An example is seeing a double negative, such as 'Don’t not share my data'
- A general feeling of pressure or artificial urgency to conclude the transaction
- Anything that sounds “too good to be true”
- Difficulty in finding or understanding the terms or rules of a transaction"
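Some of these red flags can even be spotted programmatically. As a hypothetical illustration (the sample form markup below is invented), this short Python script uses only the standard library to flag pre-checked checkboxes in a page's HTML:

```python
# A minimal sketch of spotting one red flag automatically: pre-checked
# checkboxes in a page's HTML. Standard library only; the sample
# markup is invented for illustration.
from html.parser import HTMLParser

class PrecheckedFinder(HTMLParser):
    """Collect the names of checkbox inputs that arrive pre-checked."""

    def __init__(self):
        super().__init__()
        self.prechecked = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "checkbox" and "checked" in a:
            self.prechecked.append(a.get("name", "(unnamed)"))

sample = """
<form>
  <input type="checkbox" name="newsletter" checked> Send me offers
  <input type="checkbox" name="share_data" checked> Don't not share my data
  <input type="checkbox" name="gift_wrap"> Add gift wrap
</form>
"""

finder = PrecheckedFinder()
finder.feed(sample)
print(finder.prechecked)  # ['newsletter', 'share_data']
```

Note that the "share_data" box in the sample also carries the double-negative wording Roberts warns about, so it trips two red flags at once.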
Why dark patterns ultimately don't work
The short-term use of dark patterns can deceive users into making unwanted purchases, signing up for services, and unknowingly disclosing personal information. Employing dark patterns might help boost metrics and appear temporarily beneficial for a company, but it can also irreversibly damage the all-important customer experience.
Customers feel betrayed and disrespected when they realize a company has used design trickery and deceitful wording to push them toward actions that favor the company's interests over their own. Using dark patterns often results in a frustrating user experience, damaged trust, poor brand reputation, and lost customers.
Dark patterns may appear to be a quick fix, but the ill will created when customers realize the company's deceit and manipulation is just not worth it. It's far better for companies to respect their customers and website visitors, providing them with clear and transparent choices that enhance the customer experience and foster loyalty over the long term.
FAQs
What are some examples of dark patterns?
Dark patterns are design practices that use presentation tricks and deceptive language to guide website or app users into doing something they don't necessarily intend to do, an action that ultimately benefits the company.
Dark patterns take many forms to influence users, and common examples include bait and switch, friend spam, misdirection, basket sneaking, cost hiding, forced continuity, disguised ads, trick questions, and confirm shaming.
Which dark patterns are currently most effective or hardest for consumers to detect?
According to William J. Roberts, a partner at Day Pitney LLP and Adjunct Professor of data privacy law at the University of Connecticut School of Law, the four trickiest dark patterns include:
- “Confirm shaming: Guilt-based phrasing like ‘No thanks, I prefer to pay full price’ exploits emotional pressure, which users may not recognize as a design tactic.
- Obstruction or ‘Roach Motel’ patterns: Easy to sign up, but purposefully difficult to cancel. These are especially effective when companies use multiple steps, hidden links, or force customers to call a number that has long wait times.
- Forced continuity: Free trials that automatically convert to paid subscriptions with little or no reminder. Many users don’t realize what they’ve agreed to until charges appear.
- Deceptive toggles and pre-checked boxes: Users often miss these entirely, especially on mobile devices or when under time pressure.”
Why are they called dark patterns?
The term dark pattern was coined in 2010 by UX specialist Harry Brignull, who has dedicated a website to helping educate consumers about the dangers of deceptive design patterns and how to identify and avoid them.
The sinister name likely came from the deceptive and manipulative nature of the UX designer practices companies use to influence customer behavior in line with their best business interests.
Should dark patterns be illegal?
This question is subjective — depending on whether you're a company that can arguably benefit (in the short term) or a consumer who can suffer loss and frustration from dark patterns.
Most neutral parties agree that a dark pattern's deceptive design removes user choice and should be illegal. Consumers should have a clear and transparent choice when navigating a company's interface without any design or language sorcery driving them toward a preferred outcome.
The current legal shift toward banning dark patterns is grounded in Section 5 of the FTC Act. State laws such as the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA) have backed the trend by outlawing the use of dark patterns. Similarly, the GDPR prohibits dark patterns that trick EU citizens into giving consent.
Bottom line
Today, companies that use dark patterns to influence user behavior stand to lose far more than they gain. Dark patterns inherently disrespect customers by attempting to compromise their autonomous right to choose; moreover, they can be frustrating and annoying. User trust and customer loyalty can be lost when short-sighted, metric-driven companies employ dark patterns to influence their customers in a way that benefits their business.
Education is the best defense against deceptive user interface design, dark patterns, and fraudulent online practices. Modern consumers must be well-informed about recognizing and avoiding all types of e-commerce fraud, as well as how to adjust their privacy settings to better protect themselves against these practices.
Meet the experts
[1][3] California Consumer Privacy Act
[2] Federal Trade Commission Act, Section 5: Unfair or Deceptive Acts or Practices