Learning from the Persuasive and Emotional Design Studio
A paradox in designing for ‘good use’ rather than ‘just to sell the next product’
--
The studio gave me a chance to study the different persuasive models used to nudge people towards completing a task. This article is a summary of those learnings, explaining the different models.
The article is divided into three sections: first, how businesses have leveraged the power of design for their benefit; second, persuasive design methodologies; and third, dark patterns in design.
The paradox in design will be covered in the dark patterns section.
Persuasive Design Models
- Understanding what causes behaviour change, through the BJ Fogg Behavior Model (2008).
- Eric Schaffer argued that usability is not enough and suggested the next wave in web design (2009).
- Then comes the Hook Model by Nir Eyal, essentially an extension of the Fogg model, widely used in the tech world.
Understanding Business and Design
Design has been able to convince business leaders that it can really help them sell more. It has become an asset where marketing alone was not sufficient. Design has helped businesses understand the psychology of people and shape products so that they trigger the emotional, right side of the brain to purchase.
Invest in Design = Good Return on Investment (ROI)
The two sides of the buying decision
The diagram below shows the two hemispheres of our brain to illustrate that rationality (left) and psychology (right) both influence our decision to buy.
Buying rationale can involve:
- Economics
- Risk
- Strategic Fit
- Compliance
- Politics
Buying psychology can involve:
- Hopes and fears
- Emotions and the ego
- Experiences and expectations
- Attitudes and beliefs
- Behaviours and styles
- Social and political
But, regardless of whether an individual purchases something out of necessity or desire, their emotions will always play a role in their purchasing decision in one way or another.
Logic often takes a backseat to emotions when it comes to purchasing.
That's why e-commerce websites are designed, using persuasive design techniques, in a way that makes customers feel good.
What Causes Behavior Change?
The Fogg Behavior Model (FBM) shows that three elements must converge at the same moment for a behaviour to occur: Motivation, Ability, and a Prompt. When a behaviour does not occur, at least one of those three elements is missing.
The FBM makes it easier to understand behaviour in general. What was once a fuzzy mass of psychological theories now becomes organized and specific when viewed through the FBM.
The FBM shows that Motivation and Ability can be traded off (e.g., if motivation is very high, ability can be low). In other words, Motivation and Ability have a compensatory relationship to each other.
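The compensatory relationship can be made concrete with a rough sketch. The snippet below is purely illustrative: the names, the 0-to-1 scales, and the threshold are assumptions made for the example, not part of Fogg's published model, which is conceptual rather than a precise formula.

```typescript
// A minimal, illustrative sketch of the Fogg Behavior Model (B = MAP).
// The 0..1 scales and the threshold value are assumptions for illustration only.

interface BehaviourContext {
  motivation: number; // 0 (none) .. 1 (very high)
  ability: number;    // 0 (very hard) .. 1 (effortless)
  prompt: boolean;    // did a prompt/trigger occur at this moment?
}

const ACTION_THRESHOLD = 0.25; // hypothetical "activation" line

function behaviourOccurs({ motivation, ability, prompt }: BehaviourContext): boolean {
  // No prompt, no behaviour, regardless of motivation and ability.
  if (!prompt) return false;
  // Motivation and ability compensate for each other: a high value of one
  // can offset a low value of the other, as long as the product clears the line.
  return motivation * ability >= ACTION_THRESHOLD;
}

// High motivation compensates for low ability...
console.log(behaviourOccurs({ motivation: 0.9, ability: 0.3, prompt: true }));  // true
// ...but without a prompt the behaviour still does not happen.
console.log(behaviourOccurs({ motivation: 0.9, ability: 0.3, prompt: false })); // false
```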
Let's understand how Ability and Motivation work:
Ability
As a designer, it's typical for us to focus on ability first. That's the part we usually have more control over. There are six basic elements of ability:
- Time
- Money
- Effort
- Cognitive load
- Social acceptance
- Routine
If you can influence these aspects, your user is far more likely to do the intended behaviour.
Persuasive techniques can be used for bad, too. Cashback on purchases is one way the ethical line gets blurred. When you have to jump through hoops to get your refund (filling in a long form, photocopying the receipt and mailing it), the 'ability' to complete the cashback claim is low. In this case, it's likely that some buyers won't bother to complete the action, which is exactly what the retailer wants. It hardly seems worth it for a Rs 180 refund on a Rs 2000 purchase, for example.
This isn't a mistake; it's designed to maximise purchases and minimise rebates using cognitive science.
Motivation
In the cashback example, you start to see how the motivation part comes into play. It might be easier for a designer to control 'ability', but we can also influence motivation with a good understanding of human psychology.
People’s key motivators include:
- A desire for completion and order
- Delight and emotional connection
- Variable rewards
Understanding what motivates people is the powerful bit.
Fitbit, for example, is very effective in tapping into people’s desire for completion and order. Staying focused on being active and keeping fit is a hard behaviour to crack. It’s very easy to put off heading out for a run and just sit on the couch instead.
Fitbit knows this. By adding little achievements that need to be completed, like reaching 10,000 steps in a day, they appeal to our natural tendency as humans to want to 'complete' the step goal.
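As a rough illustration of the completion motivator, a step tracker might frame progress as an achievement waiting to be closed out. The goal value and messages below are assumptions for the sketch, not Fitbit's actual implementation.

```typescript
// Illustrative only: framing activity as a goal to "complete",
// in the spirit of a 10,000-step daily target.
const STEP_GOAL = 10_000; // assumed daily goal

function progressMessage(stepsToday: number): string {
  if (stepsToday >= STEP_GOAL) {
    return "Goal complete!"; // the payoff for our desire for completion
  }
  const remaining = STEP_GOAL - stepsToday;
  const percent = Math.round((stepsToday / STEP_GOAL) * 100);
  // Emphasising how little is left makes "completing" feel within reach.
  return `${percent}% there, only ${remaining} steps to go`;
}

console.log(progressMessage(9_200)); // "92% there, only 800 steps to go"
```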
These are powerful techniques. And the great thing about persuasive design is that its influence has the potential to do more than simply help us get people to use our product.
Beyond Usability: Designing Web Sites for Persuasion, Emotion, and Trust
In 2009, Eric Schaffer, the founder of Human Factors International, suggested the next wave in website design: persuasive design.
He mentioned that while usability is still a fundamental requirement for effective Website design, it is no longer enough to design sites that are simply easy to navigate and understand so users can complete transactions.
The future of great web design is about creating customer engagement and commitment in a way that clearly impacts business results and measurable goals. Whether a website is e-commerce, informational, or transactional, it must motivate people to make decisions online that lead to a conversion of one sort or another.
Persuasive design has mostly been used on e-commerce websites to fulfil business goals while serving user wants and needs.
Persuasive design pushes designers to clearly define a Web site’s purpose — and its persuasion objectives.
For e-commerce sites, the objectives are to inspire their customers’ trust, engage them, and persuade them to buy their products or services.
For government sites, the persuasion objectives would likely be to convince citizens that the government is responsible, effective, and investing their money wisely.
For non-profit organisations, the objectives would be to engage customers and get them to support their causes with donations and through word-of-mouth and political support.
Only after identifying such persuasion objectives and articulating them precisely can we choose the appropriate techniques from our toolkit of persuasion technologies.
Usability is no longer enough
In short, usability stands for 'can do', but shouldn't we be asking whether users 'will do'? For example, everyone can do sports, but does everyone do sports? (No.) We, as designers, must find a way to engage and persuade people to act.
Making a Product a Habit: The Hook Framework
You wake up. You check Facebook, maybe your email. You have a shower, brush your teeth, make breakfast. Get in your car and go to work. Listen to the same playlist. Wear the same clothes from your wardrobe. And you don't even think about it; you just do it.
Apple. Facebook. Twitter. Google. Instagram. These companies all have one thing in common — they create habits among their users. People use these products habitually on a daily basis and they’re so compelling that many of us struggle to imagine life before they existed.
The Hook Framework
Entrepreneur and investor Nir Eyal distilled this ability into a methodology he calls the Hook Model.
At its simplest, the model describes how businesses can fundamentally change behaviour in their users and create day-to-day habits around their products. The heart of the principle is that a business should always seek to connect the user's problem to its solution with enough frequency to form a habit. The framework has four components: Trigger, Action, Variable Reward, and Investment.
Let’s break that down:
Trigger:
This is the spark for a behavior that gets someone into a system. There are two types of triggers: external and internal. An external trigger alerts users with something like an email, link, or icon. Internal triggers form in the user's own mind and routines as they cycle through successive hooks while using the product.
Action:
This is the behavior taken when a reward is anticipated. For instance, the action of clicking on an image on your Facebook feed. When you click that image, you anticipate that you’ll be taken somewhere interesting, such as the latest listicle on Buzzfeed. This is an important part of the model because it draws upon usability design to drive users to take action.
Variable Reward:
This is the part of the model that allows you to create craving in users. Rather than using a conventional feedback loop, you can serve up a multitude of potential rewards to hold a user’s interest. For instance, Pinterest does this by showing you images that are relevant to your interests along with other things that might catch your eye.
Investment:
This is the part where the user now has to do some work. Think of it as giving back into the product, and that can take the form of time, data, effort, social capital or money. But this isn’t solely about swiping their credit cards. Investment is an action that will improve the product as well — such as inviting new people into the system, giving feedback on features, etc.
Habit forming is a continuous process.
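The cycle can be sketched loosely in code. Everything below (type names, the feed example, the reward pool) is invented for illustration; it simply shows the four phases chained in a loop, with the user's investment setting up the next trigger.

```typescript
// Illustrative sketch of one pass through the Hook cycle:
// Trigger -> Action -> Variable Reward -> Investment -> (loads the next Trigger).
// All names and data here are assumptions for the example.

type Trigger = { kind: "external" | "internal"; cue: string };

interface User {
  interests: string[];
  storedContent: string[]; // the user's accumulated investment
}

function action(user: User, trigger: Trigger): void {
  console.log(`Acting on ${trigger.kind} trigger: ${trigger.cue}`);
}

function variableReward(user: User): string {
  // The reward varies on each pass, which is what sustains the craving.
  const pool = user.interests.map((i) => `fresh content about ${i}`);
  return pool[Math.floor(Math.random() * pool.length)];
}

function investment(user: User, reward: string): Trigger {
  // The user puts something back in (a pin, a follow, a post)...
  user.storedContent.push(reward);
  // ...which improves the product and tees up the next, internal trigger.
  return { kind: "internal", cue: "boredom: check what's new in my collection" };
}

// One cycle of the hook
const user: User = { interests: ["interior design", "travel"], storedContent: [] };
let trigger: Trigger = { kind: "external", cue: "push notification" };
action(user, trigger);
const reward = variableReward(user);
trigger = investment(user, reward); // the next pass starts from an internal trigger
```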
Baking the hook model into your product
Many startups might be fooled into thinking they can worry about creating habitual behaviour after a product is released. Unfortunately, by then it's too late (and at that point, businesses might be tempted to create cheap gimmicks like under-baked gamification just to get users logging in every day for the superficial thrill of a reward).
True “hook” based product creation starts with UX design at the forefront. Startups need to begin early by imagining who their ideal habitual users would be, and then ask themselves some key questions:
- What are the characteristics of such a user?
- How often is "habitual"? While some social networks might want daily users, others might not expect users to return for weeks at a time.
- Think about the average user. Not the super users, or the infrequent ones, but those who would make up the bulk of your user base.
How are your users getting hooked?
Businesses need to put research into figuring out what hooked their users in the first place. After all, users come to apps and products through different routes, and there could be multiple hooks for a single product.
However, until you know for sure what is hooking your users, you’re just guessing. Look into what’s bringing in your users and then double-down on the most promising strategies.
For instance, as Eyal himself points out, Twitter, through constant research, was able to determine that new users who followed more accounts were more likely to stay on the service longer. As a result, Twitter started recommending popular accounts to new users in order to get them started. Once they did that, usage increased.
Such a discovery was only possible because Twitter was constantly iterating and developing its hook process.
Creating “hooks” is a process
Constant research and optimisation is required for creating hook-based products. But for businesses with products already in the market, triggers and rewards are the easiest ways to get started. Even creating triggers such as emails or notifications can make a big difference.
Dark Patterns
UI is the heart of website design and functionality. Dark patterns in UI are the tricks websites and apps use to trap users into signing up for or buying something accidentally. The purpose of dark patterns in UI design is to hide the real intentions of the website and/or company from the user until it’s too late.
Or
A dark pattern is a user interface carefully crafted to trick users into doing things they might not otherwise do, such as buying insurance with their purchase or signing up for recurring bills.
Normally when you think of “bad design,” you think of the creator as being sloppy or lazy — but without ill intent. Dark patterns, on the other hand, are not mistakes. They’re carefully crafted with a solid understanding of human psychology, and they do not have the user’s interests in mind.
Trick questions
Marketing emails use this tactic all the time. You’ve probably seen this before. After you register to access something on the web, you’re asked if you want to be placed on a mailing list. This particular approach is fairly standard but isn’t hugely effective because users have to take an explicit action to opt in. Chances are they’ll be in a hurry and a proportion of users won’t even notice this text. Some websites use mandatory radio buttons with neither option (yes or no) preselected. This way the user can’t get on to the next page without making an explicit choice. This in itself is still above-board. But if we think back to our anti-usability principles, we can see how not calling attention to this choice can be used to trick us into choosing something we don’t actually want.
Have you ever heard of a trammel net?
It’s a type of fishing net that is made up of two layers of different types of netting. The fish — or your user — can either get caught up by the first layer, or the second layer, or they can get stuck between the two. They’re banned in most kinds of commercial fishing, but it seems you can put them in your UIs without any legal repercussions.
There are different classifications of dark patterns, namely Bait and Switch, Disguised Ads, Forced Continuity, and Forced Disclosure, among others.
A video on the popular YouTube series Nerdwriter outlines some dirty design tactics embedded in several popular websites and apps. On Amazon's website, for example, the so-called "roach motel" tactic prevents users from deleting their account: it buries that option deep in the site's architecture, leading users down a confusing maze of drop-down menus, only to arrive at a dead end where they have to convince a customer-service person to delete it for them.
The seven-minute explainer highlights the work of dark-UX crusader Harry Brignull. Through his website Dark Patterns, the UK-based user-experience consultant encourages the public to educate themselves on these sly tactics and call out the companies who use them.
Example:
When Apple released iOS 6, one of the few new features not enthusiastically promoted by the company was Identifier for Advertisers (IDFA) ad tracking. It assigned each device a unique identifier used to track browsing activity, information advertisers used to target ads. Even though IDFA is anonymous, it’s still unsettling to people who worry about privacy.
Fortunately, Apple included a way to disable the feature. You won’t find it in the privacy settings, however. Instead, you have to go through a series of obscure options in the general settings menu. Now, “General” is a crappy name for a menu item. It’s mainly a bucket of miscellaneous stuff that they didn’t know what to do with. In the “General” menu, select “About.” Down at the bottom of this menu, next to the terms of service and license items, there’s a menu item listed as “Advertising.”
If you haven't been here before, the only option in the Advertising menu, "Limit Ad Tracking", is probably set to "Off".
But let's take a closer look at the way this is worded. It doesn't say "Ad Tracking — Off"; it says "Limit Ad Tracking — Off". So it's a double negative. Tracking is not being limited, so when this switch is off, ad tracking is actually on.
Off means on!
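The confusion is easy to reproduce. The hypothetical snippet below is not Apple's actual code, just an assumed illustration: because the flag is phrased as a negative, a "false"/"off" value actually means tracking is enabled.

```typescript
// Hypothetical settings object, for illustration only.
// "limitAdTracking: false" reads like tracking is off, but because the
// flag is a negative ("limit"), false actually means tracking is ON.
const settings = { limitAdTracking: false };

const adTrackingEnabled = !settings.limitAdTracking; // double negative
console.log(adTrackingEnabled); // true: off means on

// A plainly worded flag removes the ambiguity:
const clearerSettings = { adTrackingEnabled: true };
console.log(clearerSettings.adTrackingEnabled); // what you see is what you get
```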
The thing about dark patterns is that they are designed from the exact same rulebooks we use to enhance usability.
Nielsen’s 10 heuristics, probably one of the most well-known set of usability guidelines, date back to the early 1990s. If we take three of them and invert them, we can describe Apple’s UI strategy in the above example.
Visibility of system status.
Instead of showing key status information, hide it. Do this with unclear labels, obtuse navigation, and untimely messages.
Match between system and real world.
Instead of “speaking the user’s language,” the system should use “weasel wording” so that it appears to say one thing while it really says another.
User control and freedom.
Take advantage of your users’ natural capacity to make mistakes to have them accidentally complete actions that are beneficial to your objective.
The darkness comes into play because UX design choices are being made to be intentionally deceptive: to nudge the user into giving up more than they realize, or to agree to things they probably wouldn't if they genuinely understood the decisions they were being pushed to make.
To put it plainly, dark pattern design is deception and dishonesty by design.
Dark pattern = Privacy Breach
Dark patterns used to obtain consent to collect users' personal data often combine an unwelcome interruption with a built-in escape route: an easy way to get rid of the dull-looking menu getting in the way of what you're actually trying to do.
Brightly colored ‘agree and continue’ buttons are a recurring feature of this flavor of dark pattern design. These eye-catching signposts appear near universally across consent flows — to encourage users not to read or contemplate a service’s terms and conditions, and therefore not to understand what they’re agreeing to.
It’s ‘consent’ by the spotlit backdoor.
This works because humans are lazy in the face of boring and/or complex looking stuff. And because too much information easily overwhelms. Most people will take the path of least resistance. Especially if it’s being reassuringly plated up for them in handy, push-button form.
At the same time, dark pattern design will ensure the opt-out, if there is one, is near invisible; greyscale text on a grey background is the usual choice.
Deceptive Designs
Some deceptive designs even include a call to action displayed on the colorful button they do want you to press — with text that says something like ‘Okay, looks great!’ — to further push a decision.
Likewise, the less visible opt out option might use a negative suggestion to imply you’re going to miss out on something or are risking bad stuff happening by clicking here.
The horrible truth is that deceptive designs can be awfully easy to paint.
Where T&Cs are concerned, it really is shooting fish in a barrel. Because humans hate being bored or confused and there are countless ways to make decisions look off-puttingly boring or complex — be it presenting reams of impenetrable legalese in tiny greyscale lettering so no-one will bother reading it combined with defaults set to opt in when people click ‘ok’; deploying intentionally confusing phrasing and/or confusing button/toggle design that makes it impossible for the user to be sure what’s on and what’s off (and thus what’s opt-out and what’s an opt-in) or even whether opting out might actually mean opting into something you really don’t want…
Friction is another key tool of this dark art:
For example, designs that require many more clicks/taps and interactions if you want to opt out. Such as toggles for every single data-share transaction, potentially running to hundreds of individual controls a user has to tap on, versus just a few taps or even a single button to agree to everything. The weighting is intentionally all one way. And it's not in the consumer's favor.
Deceptive designs can also make it appear that opting out is not even possible. Such as default opting users in to sharing their data and, if they try to find a way to opt out, requiring they locate a hard-to-spot alternative click — and then also requiring they scroll to the bottom of lengthy T&Cs to unearth a buried toggle where they can in fact opt out.
Facebook used that technique to carry out a major data heist by linking WhatsApp users’ accounts with Facebook accounts in 2016.
This was despite prior claims that such a privacy U-turn could never happen. The vast majority of WhatsApp users likely never realised they could say no, let alone understood the privacy implications of consenting to their accounts being linked.
E-commerce sites also sometimes suggestively present an optional (priced) add-on in a way that makes it appear like an obligatory part of the transaction.
Such as using a brightly coloured 'continue' button during a flight checkout process that also automatically bundles an optional extra like insurance, instead of plainly asking people whether they want to buy it.
Or using pre-selected checkboxes to sneak low-cost items or a small charity donation into the basket while the user is busy going through the checkout flow, meaning many customers won't notice until after the purchase has been made.
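A sketch of how that sneak-into-basket default might work is below. It is purely illustrative; the item names, prices, and field names are invented for the example.

```typescript
// Illustrative only: optional add-ons whose checkboxes default to checked,
// so they ride along into the order unless the busy user notices and unticks them.
interface LineItem { name: string; price: number; preselected: boolean }

const optionalExtras: LineItem[] = [
  { name: "Travel insurance", price: 499, preselected: true }, // dark pattern: opt-out, not opt-in
  { name: "Charity donation", price: 10, preselected: true },  // dark pattern: opt-out, not opt-in
];

function orderTotal(basePrice: number, extras: LineItem[]): number {
  // Anything the user did not explicitly untick gets charged.
  return basePrice + extras.filter((e) => e.preselected).reduce((sum, e) => sum + e.price, 0);
}

console.log(orderTotal(5_000, optionalExtras)); // 5509, though the user only chose the flight
```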
Airlines have also been caught using deceptive design to upsell pricier options, such as by obscuring cheaper flights and/or masking prices so it’s harder to figure out what the most cost effective choice actually is.
Subscribing vs unsubscribing
Dark patterns to thwart attempts to unsubscribe are horribly, horribly common in email marketing. Such as an unsubscribe UX that requires you to click a ridiculous number of times and keep reaffirming that yes, you really do want out.
Often these additional screens are deceptively designed to resemble the 'unsubscribe successful' screens that people expect to see when they've pulled the marketing hooks out. But if you look very closely at the typically very tiny lettering, you'll see they're actually still asking if you want to unsubscribe. The trick is to get you not to unsubscribe by making you think you already have.
Another oft-used deceptive design that aims to manipulate online consent flows works against users by presenting a few selectively biased examples — which gives the illusion of helpful context around a decision. But actually, this is a turbocharged attempt to manipulate the user by presenting a self-servingly skewed view that is in no way a full and balanced picture of the consequences of consent.
At best it’s disingenuous. More plainly it’s deceptive and dishonest.
We all say that persuasive design, when used responsibly, can leverage a good understanding of cognitive science to add value to a user's experience and increase engagement.
But this is the paradox in itself.
Paradox example: letting users unhook (digital well-being)
Above all else, it is our responsibility to ensure users retain their right of choice. The challenge remains open.