Dark patterns, the dark side of design

This article was written for Readymag, The Ethical Issue 5/5.

Dark patterns are online user experiences that are intentionally designed to trick users into taking unintended actions. These actions rarely benefit the user, but rather serve the owner of the site or app — almost always for financial gain. This article by graphic designer, artist, and art educator Eileen MacAvery Kane explores different types of dark patterns, the role and responsibility of designers, the confusing online landscape these patterns create, and what actions can be taken to prioritize users in the online design process.


The term “dark patterns” was coined in 2010 by UX specialist Harry Brignull to describe online experiences that are intentionally designed to trick users into unintended actions. These actions rarely benefit the user, but do benefit the owner of the site or app, usually for financial gain. The presence of these patterns makes the current digital landscape one of “user beware.” In this context, even UX designers with the best of intentions can find themselves designing for the dark side — unless they actively take steps to avoid it.

If you’re like me, you’ve probably experienced dark patterns in your daily life. Here are some of the most common:

Bait and Switch: Takes advantage of familiar actions and behaviors to substitute a new, unintended action, usually to the user’s detriment. This can create confusion or knee-jerk reactions, as when Microsoft’s upgrade to Windows 10 was initiated by clicking on an “X” — the way you would normally close a window, not start an installation.

Disguised Ads: Ads are usually located in specific places on a web page and include graphics and copy that clearly promote a product or service. Sometimes, however, they are designed to look like the action buttons for obtaining content, leaving it up to the user to read the instructions carefully and ignore the misleading graphics. Sites like Dafont.com, which graphic designers use to access free fonts, are a prime example.

Roach Motel: Occurs when it becomes almost impossible to delete an account or unsubscribe from a mailing list or recurring charge. A notorious example: online donors to Trump’s campaign weren’t aware that they were making weekly contributions to keep the struggling campaign afloat. One donor living on an income of less than $1,000 per month found that what they thought was a one-time donation of $500 had turned into $3,000 in charges in less than 30 days.

Friend Spam: Occurs when a product asks for social media permissions under false pretenses and then spams your friend list. LinkedIn paid $13 million to settle a 2015 class-action suit over this practice.

Forced Continuity: Websites that offer free trials force users to submit a credit card to continue using a site for “free.” Failure to cancel within the allotted time frame leads to automatic billing, and on top of this, cancelling the subscription can be extremely difficult, akin to finding a needle in a haystack. Affinion Group, an international loyalty-program manager, has paid millions of dollars in claims for unfair and deceptive trade practices and faces multiple class-action suits for misleading consumers with dark patterns.

These are just some of the “dark patterns” that have been identified — and more are being concocted every minute, which is hardly surprising given how lucrative the practice can be. Users must educate themselves to successfully detect and avoid dark patterns. However, the onus shouldn’t rest on users alone — what is the role of the designer in dark patterns? Many believe that ethics for graphic designers should be grounded in the idea of service: helping other people is a good thing to do. With knowledge of, and access to, tools that can direct people’s attention and actions, we have a moral responsibility not to use dark patterns, whether for ourselves or for our clients.

If we look specifically at the role of a UX/UI designer, the most basic user experiences are designed to help and direct. Experiences should be positive, easy, and intuitive. Like a carpenter building a set of stairs, the designer should build an experience that is supportive and safe. Dark patterns introduce a moral hazard to this mission, akin to building a set of stairs with booby traps that send users in a variety of confusing or pointless directions. A carpenter would surely be held accountable, as would the general contractor and architect; yet in the online environment we all currently inhabit, there is little accountability. Who should be held accountable? Should it be the individual UX/UI designer, or their employer? Top-level management may not know the specifics of the dark patterns used to make an app or website more profitable, but they are aware of the revenue being generated and prefer to turn a blind eye to the dark side of how it happens.

For users, there is a variety of public and private information on how to spot and avoid dark patterns. Cybersecurity and IT departments have created consumer-awareness programs and professional development classes, which employers are increasingly making mandatory. The non-profit site darkpatterns.org features a “Hall of Shame” where offenders are called out. However, until there are legal and financial repercussions to go along with the public shaming, change will be slow in coming.

There is reason to believe that more systematic prevention and punishment is on the way. California recently passed legislation banning dark patterns that trick users into giving away their personal data. The updated legislation is intended to strengthen the 2018 California Consumer Privacy Act (CCPA), which gives Californians the right to say no to the sale of their personal information. Legislators are concerned that this option can be buried with dark patterns. By banning them, California is striving to make sure consumers will not be confused or misled when exercising their right to privacy in online experiences. Unfortunately, the new regulation only bans dark patterns connected to the consumer’s “opt-out” choice, but at least it’s a start. In Europe, France’s data protection authority, the CNIL (Commission Nationale de l’Informatique et des Libertés), emphasized in its report Shaping Choices in the Digital World how manipulative and misleading interface design jeopardizes our rights and freedoms. The report is a call to arms for the regulation of design, and a reminder of the need for informed and unambiguous consent.

Designers should adhere to a code of ethical conduct where transparency and respect for privacy are first and foremost. Next, they can educate themselves about dark patterns and examine their role in creating them by seeking out research on the topic and attending workshops. They can look for alternative solutions to offer their clients and impress upon them the advantages of building trust with users rather than taking advantage of them. They should look at best practices from apps like Duolingo, which allows users to sign up through their Google or Facebook accounts and quickly gives them access to lessons; this stands in contrast to Rosetta Stone, whose registration process requires several steps and payment information for a trial account.

Creating a customer-centric culture will build long-lasting relationships and engender trust. Over time, this will prove much more valuable than the short-term gains that result from dark patterns. While users can help stem the tide, designers have a greater influence over the future of dark patterns, and over whether or not the dark side will win.


How to Be Good

The notion of being a "good" designer has always been fraught with contradictions. Does it mean good technical skills, good concepts, good return on investment for clients, good intentions, or good design for the greater good? All are valid questions; in fact, history shows us that award-winning design may be technically "good" even while advancing the most evil of causes.

This past year the opening of "Design of the Third Reich" in the Netherlands caused quite a controversy. Proponents of the exhibit point to the academic world of design, where the art and design of totalitarian regimes is an established field of research. Critics find its very nature offensive and have protested outside the museum since it opened, voicing concerns that it lacks proper social context and may fuel the fires of far-right ideologies.

Perhaps the most current and critical area of ethical issues in design is product design and UX/UI. Mike Monteiro's article, "Dear Designer: Hope is Not Enough," leads with an anonymous quote from a Facebook employee who states, "We’ve been behaving so badly that I hope the government comes in and regulates us."

Monteiro’s article does offer hope when he talks about the 22,000 Google employees who staged a walkout in protest of their company's work with the government, and the Microsoft workers who protested against their company’s contract with ICE (Immigration and Customs Enforcement). However, when he discusses the unprecedented amount of data harvested from Google's 2.5 billion users and the ethical responsibilities of the designers who work there, he points to the elephant in the room and one of the biggest ethical dilemmas that today’s designers deal with: how to responsibly and ethically use this data in a manner that benefits users without compromising their privacy or society's well-being.

Designers seeking to do good have always struggled with questions about who their client is and the products that they are promoting. The complicated world of data collection and product design has only added to the Pandora's box: as always, may those who open it beware.

Sources:
https://news.artnet.com/opinion/timo-de-rijk-nazi-design-1652641
https://www.nytimes.com/2019/09/17/arts/design/nazi-design-den-bosch.html
https://modus.medium.com/dear-designer-hope-is-not-enough-70509b196a46
