Decades of cases from the FTC and state attorneys general have challenged manipulative tactics used by unscrupulous businesses to get consumers to hand over their money or their sensitive data without a clear picture of the nature of the transaction. State and federal dockets are filled with actions against companies that used deceptive “free” offers, misleading negative options, unfair payment methods, or settings that subverted consumers’ privacy choices.
No one doubts the many benefits the online marketplace has brought to consumers. But as commerce continues to move online, the questionable practices we’ve seen for years in the brick-and-mortar world have migrated there, too. We’ve also seen the proliferation of digital “dark patterns” – a term used to describe a range of potentially manipulative user interface designs on websites and mobile apps. What’s more, the virtual environment has proven to be a breeding ground for even more sophisticated dark patterns – practices that create greater risks for consumers by impairing their ability to make informed choices.
Often in partnership with state attorneys general, the FTC has brought multiple law enforcement actions challenging digital dark patterns. For example, the FTC has taken action against companies that used nondescript dropdown arrows or small icons to hide the full cost of a transaction, created labyrinthine procedures for consumers trying to cancel recurring charges, or even sneaked unwanted products into consumers’ online shopping carts without their knowledge. The FTC also issued an Enforcement Policy Statement Regarding Negative Option Marketing, warning companies against using illegal practices that trick or trap consumers into subscriptions.
But the battle against digital dark patterns is ongoing. That’s why the FTC convened its 2021 Bringing Dark Patterns to Light Workshop. The event brought together consumer advocates, members of Congress, researchers, legal experts, industry representatives, and federal and state law enforcers to assess the state of dark patterns. Following up on the workshop, the FTC published its Bringing Dark Patterns to Light Staff Report, a summary of what we know about dark patterns and a consideration of what needs to be done to protect consumers.
I. Dark Patterns: Synthesizing the Knowledge Base
User experience specialist Harry Brignull coined the term “dark patterns” in 2010 to describe design practices that trick or manipulate users into making harmful choices. As the Dark Patterns Staff Report explains, dark patterns often take advantage of consumers’ cognitive biases to steer their conduct or to delay – or deny – access to information they need to make informed decisions. The Staff Report includes citations to research suggesting that dark patterns are highly effective at influencing consumer behavior.
Another disturbing finding cited in the Staff Report is the extent to which dark patterns used in combination can have an even more injurious impact on consumers. That doesn’t come as a surprise to experienced federal and state law enforcers. Many of the cases we have brought demonstrate the synergistic effect when companies use multiple dark patterns to subvert consumer choice. For example, in one recent action, the FTC alleged that operators of an online stock trading site used deceptive customer testimonials to lure consumers in, buried purported “disclaimers” in dense Terms and Conditions, positioned key information in ways that required consumers to scroll down the screen, and sold services on a subscription basis while making it difficult for people to cancel the recurring charges. Each illegal practice on its own harmed consumers, but the combination of dark patterns exacerbated the scale and severity of the injury.
The Staff Report cites other ways in which dark patterns in the digital world can pose heightened risks to consumers. The pervasive nature of data collection techniques, which allow companies to gather massive amounts of information about consumers’ identities and online behavior, means that businesses can adapt their ads to target a particular demographic or even a particular consumer’s interests. Furthermore, the online environment makes it much easier and more cost-effective for companies to experiment with variations in their advertising. Some may use that flexibility to create clearer and more effective marketing materials, but law enforcers are also likely to encounter companies that modify their ads for questionable purposes.
For example, for traditional television ads, it can be cost-prohibitive to produce multiple commercials, run them on different stations, evaluate their effectiveness, modify them based on their results in the marketplace, and then run revised ads. But that kind of experimentation is much easier in the online world, and law enforcers have often seen it used to introduce more – and more deceptive – dark patterns into consumer transactions. Indeed, in recent FTC cases, companies have allegedly tested variations of online ads and opted for ones that yielded greater consumer response, even when the variations introduced misleading product claims.
As the Dark Patterns Staff Report suggests, the medium through which consumers access online information also affects the dark patterns they may encounter. Studies show that some dark patterns are more common in mobile apps than on websites. One contributing factor is that the smaller screen of a smartphone makes it easier for companies to use dark patterns and harder for consumers to ascertain the truth. For example, the amount of scrolling required on a small screen may make it less likely that consumers will spot information that companies have placed – often intentionally – below users’ typical field of vision. Dark patterns of that variety may have a disproportionately harmful impact on lower-income consumers, who are more likely to rely on a mobile device as their sole or primary access to the internet.
II. Classifying Dark Patterns
Based on discussions at the Dark Patterns Workshop, additional empirical research, and law enforcement experience, the Staff Report divides some of the most common dark patterns into four categories. This framework may prove helpful to law enforcers in articulating the nature of the injury to consumers.
A. Design elements that induce false beliefs. Some dark patterns manipulate consumer choice by inducing false beliefs. A classic example of this type of dark pattern is an ad deceptively formatted to look like independent, editorial content. As early as 1917, the FTC took action against a newspaper “column” that appeared to offer “advice” on household appliances. In fact, it was an ad placed by a vacuum cleaner company. Fast forward to the TV era, and the FTC – often with state law enforcement partners – challenged infomercials that mimicked the format of news or entertainment programming.
As the FTC explained in its 2015 Enforcement Policy Statement on Deceptively Formatted Ads, “[A]dvertising and promotional messages that are not identifiable as advertising to consumers are deceptive if they mislead consumers into believing they are independent, impartial, or not from the sponsoring advertiser itself.” That’s because “knowing the source of an advertisement or promotional message typically affects the weight or credibility consumers give it.”
The Staff Report includes a visual example from a recent FTC law enforcement action. In that case, the complaint alleged that operators of a work-from-home scheme sent unsolicited email to consumers that included “from” lines that falsely claimed they were coming from news organizations like CNN or Fox News. Consumers who clicked on links embedded in the email were routed to bogus online “news” stories and eventually arrived at the defendants’ sales sites, which falsely promised that for an upfront fee, consumers would make money by working from home for only an hour a day.
Another example cited in the Staff Report concerns a law enforcement action against a company whose website purported to offer rankings of various lenders, often accompanied by comparison charts. The FTC says the website left consumers with the impression that the company had evaluated the top-listed lenders to be the best – a representation reinforced by the company’s claims that its rankings were “objective,” “honest,” “accurate,” and “unbiased.” But in reality, the company boosted lenders’ numerical rankings and positions on the charts based exclusively on how much they paid the company. According to the FTC, the company’s employees – well aware of the impact that placement had on consumers – enticed lenders to pay more by touting statistics showing that consumers were more likely to click on companies in better positions.
What should companies do to avoid these kinds of dark patterns? The Staff Report suggests some compliance fundamentals:
- Companies should make certain that their online interfaces don’t create false beliefs. Marketers are responsible for the net impression their design choices convey to consumers, not just the literal wording of statements evaluated in isolation.
- If an ad strongly resembles independent editorial content, don’t assume that a “disclaimer” will overcome that deceptive net impression.
- Companies shouldn’t give the impression that rankings or reviews are objective or unbiased if they are affected by compensation.
- If a business learns that a particular design choice induces false beliefs, fix the problem immediately.
B. Design elements that hide or delay the disclosure of material information. Some dark patterns operate by hiding or obscuring material information from consumers. How do marketers accomplish that? By burying key limitations or conditions in dense blocks of text, behind vague hyperlinks, in locations on a website that people may not notice, or in Terms of Service documents that consumers don’t see before they make a purchase. Mimicking deceptive practices found in some TV and print ads, digital marketers have been known to use “mouseprint” text, hard-to-find and even harder-to-read footnotes, and poor color contrast to conceal what they’re up to.
In one example cited in the Staff Report, a lender prominently advertised that loan applicants would receive a specific amount of money and would pay “no hidden fees.” In fact, alleged the FTC, the company deducted hundreds or even thousands of dollars in fees from the loans it disbursed. According to the complaint, the company hid the existence of those fees behind buttons consumers were unlikely to click during the online application process and buried a mention of fees in a list sandwiched between much more prominent text. Furthermore, in standard screen configurations and on mobile devices, material information appeared “below the fold” – meaning that consumers had to scroll down to see it.
Another variation of this dark pattern is “drip pricing,” the practice of luring consumers in with incomplete information about a product’s total cost and waiting until the end of the buying process to reveal mandatory charges that increase the actual cost to buyers. The Staff Report – along with a number of state attorney general offices – raised concerns that drip pricing makes it hard for consumers to comparison shop and manipulates them into paying fees that are either hidden entirely or not revealed until late in the transaction, after the consumer has already spent significant time selecting a product or service plan to purchase.
Drip pricing inflicts substantial injury upon consumers and competition. One study cited in the Staff Report compared consumer expenditures on a ticketing website that used drip pricing with expenditures on one that disclosed mandatory fees upfront. The study found that users who weren’t shown the ticket fees upfront ended up spending about 20% more. Drip pricing also can injure honest businesses that reveal their total prices at the outset. They can be at a significant disadvantage against competitors that lure consumers in with artificially low prices and then spring mandatory charges on them late in the transaction.
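A simple worked example may help illustrate the comparison-shopping harm. The sellers, prices, and fee amounts below are hypothetical, chosen only to show the arithmetic:

```typescript
// Hypothetical comparison: an upfront-pricing seller vs. a drip-pricing seller.
interface Offer {
  seller: string;
  advertisedPrice: number; // price shown on the search or comparison page
  mandatoryFees: number;   // fees revealed only at the end of checkout
}

const offers: Offer[] = [
  { seller: "Upfront Tickets", advertisedPrice: 50.0, mandatoryFees: 0.0 },
  { seller: "Drip Tickets", advertisedPrice: 42.0, mandatoryFees: 12.5 },
];

for (const o of offers) {
  const total = o.advertisedPrice + o.mandatoryFees;
  console.log(`${o.seller}: advertised $${o.advertisedPrice.toFixed(2)}, total $${total.toFixed(2)}`);
}
// Upfront Tickets: advertised $50.00, total $50.00
// Drip Tickets: advertised $42.00, total $54.50
//
// On the comparison page, the drip-pricing seller looks $8.00 cheaper;
// at checkout, it is actually $4.50 more expensive.
```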
To avoid the use of dark patterns that conceal key information, companies should consider suggestions in the Staff Report:
- Include any unavoidable and mandatory fees in the upfront, advertised price.
- Don’t state or imply that fees are mandatory when they aren’t.
- When a company’s sales practices target a specific audience, take into account how design choices will be perceived by members of that group. For example, if a business markets a product to older adults, avoid visual elements that may be harder for them to read easily.
- Particularly where drip pricing involves a credit product, companies must make sure their practices don’t treat consumers differently on the basis of race, national origin, or another protected class. Recent FTC cases against brick-and-mortar businesses – including law enforcement actions brought in cooperation with states – alleged that the companies’ practices violated the FTC Act, the Equal Credit Opportunity Act, and state laws.
C. Design elements that lead to unauthorized charges. Another dark pattern familiar to law enforcers is the practice of tricking people into paying for goods or services they didn’t want or didn’t intend to buy. The FTC and state attorneys general have worked cooperatively on dozens of actions challenging this form of dark pattern.
One frequent example resulting in unauthorized charges occurs when a company deceptively offers a free trial period but then automatically signs consumers up for pricey subscriptions and bills their credit or debit cards if they fail to cancel within a certain period – terms that go undisclosed or are poorly communicated to consumers. Research cited in the Report reveals how disturbingly effective this form of dark pattern can be. Those results came as no surprise to the FTC. In addition to years of cases challenging negative option-related deception, the Commission convened a 2007 workshop analyzing negative option marketing and followed up with its Negative Options report. In 2010, Congress enacted the Restore Online Shoppers’ Confidence Act (ROSCA), which prohibits marketers from charging consumers for goods or services sold online through a negative option feature unless the seller: 1) clearly and conspicuously discloses all material terms of the transaction before obtaining the consumer’s billing information; 2) obtains the consumer’s express informed consent before charging the consumer’s account; and 3) provides simple mechanisms for the consumer to stop recurring charges. And yet companies still use dark patterns that violate both the FTC Act and ROSCA.
A related example cited in the Staff Report focused on the roadblocks a company imposed when consumers wanted to cancel a subscription for a children’s online learning site.
According to the FTC, despite the company’s promise of “Easy Cancellation,” many consumers couldn’t cancel even after repeated attempts at calling, emailing, and contacting the company through its online customer support form. The complaint alleges that the company rejected any cancellation attempt through those methods and instead required consumers to cancel only through a hard-to-find and difficult-to-navigate cancellation path. Ironically enough, the first page of the path didn’t tell people they had arrived at the right page to cancel. In fact, it didn’t even mention the word “cancellation.” In total, the company required people to wend their way through between six and nine screens to cancel. What’s more, as the complaint alleged, each of those screens included links and buttons that took people out of the cancellation path altogether.
It’s an example of what one panelist at the Dark Patterns Workshop termed “sludge” – “a high friction experience that, by its nature, causes people to become fatigued and give up.” In FTC parlance, it was an unfair practice under the FTC Act and a violation of ROSCA’s requirement that companies provide customers with a simple cancellation mechanism.
Another workshop panelist expressed particular concern with how companies can use dark patterns in children’s gaming apps, resulting in unauthorized charges to parents’ credit cards. Unfair payment practices regarding in-app purchases in kids’ games were the central issue in the FTC’s cases against three of the largest companies in the online arena – actions that resulted in millions of dollars in consumer refunds. For example, in one case, once the account holder downloaded the app and children began playing the game, kids could rack up charges ranging from $0.99 to $99.99 with a single click of a button and no parental involvement – often in the guise of game play. In the words of the trial judge, “[A] child may be prompted to use or acquire seemingly fictitious currency, including a ‘boatload of doughnuts, a can of stars, and bars of gold,’ but in reality the child is making an in-app purchase using real money.”
Applying principles derived from decades of consumer protection law, the Staff Report offers advice on what businesses should do to avoid this category of dark pattern:
- Companies shouldn’t hide key terms of the transaction in general Terms and Conditions pages, behind hyperlinks, or in pop-ups that consumers’ devices may block.
- In any transaction – but especially when dealing with apps and games used by kids – companies should get the express informed consent of the account holder to any charges.
- Procedures for getting consent should include an affirmative, unambiguous act by the consumer. Acceptance of a general Terms of Use document that contains unrelated information doesn’t constitute affirmative, unambiguous consent to a particular purchase.
- For transactions covered by ROSCA, companies must ensure they’re complying with the three primary requirements of the statute, including the provision requiring a simple mechanism for cancellation. To meet this standard, sellers should provide cancellation mechanisms that are at least as easy to use as the method the consumer used to buy the product or sign up for the service – for example, a consumer who subscribed online should be able to cancel online.
D. Design elements that obscure or subvert privacy choices. According to panelists at the Dark Patterns Workshop, another pervasive dark pattern involves design elements that obscure or subvert consumers’ privacy choices. The FTC and the states began addressing privacy-related dark patterns long before the term “dark patterns” was coined, but concerns about this form of deceptive or unfair practice continue.
Workshop panelists observed that dark patterns that subvert consumer privacy preferences often take the form of a purported choice companies offer to consumers regarding their personal data. But in actuality, the choice is illusory and often nudges people toward increased data sharing. For example, one panelist pointed to the commonly used cookie consent dialogue that presents consumers with the option of whether to allow the company to set a cookie. Many companies make it easy to say yes, but require users to visit other pages to deny or modify cookie settings.
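A minimal sketch of that asymmetry, in browser-side code, might look like the following. The markup, element IDs, and cookie name are hypothetical and aren’t drawn from any particular company’s site:

```typescript
// Hypothetical cookie banner illustrating the asymmetry panelists described:
// accepting takes one click, while declining requires visiting another page.
const banner = document.createElement("div");
banner.innerHTML = `
  <p>We use cookies to improve your experience.</p>
  <button id="accept-all">Accept all</button>
  <a href="/preferences/cookies">Manage settings</a>
`;
document.body.appendChild(banner);

document.getElementById("accept-all")!.addEventListener("click", () => {
  // One click sets a year-long tracking cookie and dismisses the banner.
  document.cookie = "tracking_consent=all; max-age=31536000; path=/";
  banner.remove();
});
// A more even-handed design would put a "Reject all" button next to
// "Accept all" with the same prominence and the same number of clicks.
```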
Panelists also discussed the dark pattern of using interfaces that choose as the default a setting that maximizes information collection and sharing, even when the collection of that data is unnecessary and may include highly sensitive information about users’ health, religious or political affiliation, or sexual orientation. In addition, the FTC Staff Report, A Look at What ISPs Know About You: Examining the Privacy Practices of Six Major Internet Service Providers, examined dark patterns in those companies’ user interfaces.
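As a rough sketch of the difference between collection-maximizing and privacy-protective defaults – the setting names here are hypothetical:

```typescript
// Two hypothetical default configurations for the same privacy settings.
interface PrivacySettings {
  shareViewingData: boolean; // share activity with third parties
  personalizedAds: boolean;  // use collected data for ad targeting
  preciseLocation: boolean;  // collect fine-grained location
}

// Collection-maximizing defaults: everything is on unless the user
// digs through settings screens to turn it off (opt-out).
const collectionMaximizingDefaults: PrivacySettings = {
  shareViewingData: true,
  personalizedAds: true,
  preciseLocation: true,
};

// Privacy-protective defaults: data flows only after the user makes
// an affirmative, informed choice (opt-in).
const privacyProtectiveDefaults: PrivacySettings = {
  shareViewingData: false,
  personalizedAds: false,
  preciseLocation: false,
};

console.log({ collectionMaximizingDefaults, privacyProtectiveDefaults });
```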
A law enforcement action brought by the FTC and a state attorney general further illustrates how dark patterns can subvert consumers’ privacy preferences. A complaint against a smart TV manufacturer alleged that the company used a default setting called “Smart Interactivity” that was described as a way for consumers to receive “program offers and suggestions,” but in reality allowed the company to collect their television viewing activity and share it with third parties. The complaint alleged that the company provided no notice of its practices to many of its customers and a hard-to-understand statement to others. The upshot: the company denied consumers the opportunity to make an informed choice about data sharing.
Other examples cited in the Staff Report relate to the activities of lead generators. In one law enforcement action, the FTC alleged that a lead generator conveyed a false affiliation with the United States military to induce people considering careers in the armed forces to reveal personal information. Using text and visual cues, the company promised to use the information only for recruitment purposes. In fact, according to the complaint, the company sold the information as marketing leads to post-secondary schools for between $15 and $40 per lead.
The Staff Report recommends practices that would be more protective of consumers’ privacy choices:
- Companies should avoid default settings that lead to the collection and use of consumers’ information in ways they wouldn’t expect. Indeed, collecting information only when the business has a justified need would be more protective of consumer choice.
- Consumers shouldn’t have to navigate through multiple screens and dense verbiage to find a company’s privacy settings and learn about the data it collects and how it intends to use it. Rather than burying the information in a privacy policy or Terms of Service agreement, the better practice is to offer consumers choices at the time when they’re making decisions about their data.
- Companies should exercise particular care when sensitive consumer information is at stake – location, health information, and financial data, to name just a few examples.
- Lead generators must be honest about who they are, why they’re collecting consumer information, and what they plan to do with it. Furthermore, if the “product” a company sells includes sensitive data, the company must take steps to screen prospective buyers and understand how those buyers intend to use the data.
III. A Framework for the Future
Dark patterns raise special challenges for law enforcement agencies. Because dark patterns are covert, many consumers don’t realize they’ve been manipulated or misled and therefore may be less likely to report their experiences. Even when they learn of the deception, unwarranted feelings of embarrassment about being tricked may deter them from speaking out. That’s why we should encourage consumers to come forward either by reporting their experience to their state attorney general at consumerresources.org or telling their story at www.ReportFraud.ftc.gov.
Furthermore, experts are already sounding the alarm about the potential for pernicious dark patterns in new and evolving technologies, including augmented reality and virtual reality. Whether it was at the dawning of the Internet Age, the rise of mobile commerce and social media, or the emergence of privacy and data security as key consumer protection issues, federal and state agencies worked cooperatively to craft effective strategies for combating fraud and deception in emerging technologies. If our shared history teaches us anything, it’s that we protect consumers more effectively when we work together.
Sharing our own observations is an important step toward leveraging law enforcement and educational resources efficiently. We hope the Bringing Dark Patterns to Light Staff Report is a step in that direction. Be sure to check out Appendix A of the Staff Report, which includes a more detailed compilation of digital dark patterns. Appendix B features visual examples that can be used to alert consumers about how companies may try to use dark patterns to deceive them.