Saturday, September 22, 2018

PERSONAL SPECIAL.... How Technology is Hijacking Your Mind — from a Magician and Google Design Ethicist PART I



“It’s easier to fool people than to convince them that they’ve been fooled.” – Unknown
I’m an expert on how technology hijacks our psychological vulnerabilities. That’s why I spent the last three years as a Design Ethicist at Google caring about how to design things in a way that defends a billion people’s minds from getting hijacked.
When using technology, we often focus optimistically on all the things it does for us. But I want to show you where it might do the opposite.
Where does technology exploit our minds’ weaknesses?
I learned to think this way when I was a magician. Magicians start by looking for blind spots, edges, vulnerabilities and limits of people’s perception, so they can influence what people do without them even realizing it. Once you know how to push people’s buttons, you can play them like a piano.

That’s me performing sleight of hand magic at my mother’s birthday party
And this is exactly what product designers do to your mind. They play your psychological vulnerabilities (consciously and unconsciously) against you in the race to grab your attention.
I want to show you how they do it.
Hijack #1: If You Control the Menu, You Control the Choices
Western Culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, while we ignore how those choices are manipulated upstream by menus we didn’t choose in the first place.
This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose. I can’t emphasize enough how deep this insight is.
When people are given a menu of choices, they rarely ask:
·         “what’s not on the menu?”
·         “why am I being given these options and not others?”
·         “do I know the menu provider’s goals?”
·         “is this menu empowering for my original need, or are the choices actually a distraction?” (e.g. an overwhelming array of toothpastes)

How empowering is this menu of choices for the need, “I ran out of toothpaste”?
For example, imagine you’re out with friends on a Tuesday night and want to keep the conversation going. You open Yelp to find nearby recommendations and see a list of bars. The group turns into a huddle of faces staring down at their phones comparing bars. They scrutinize the photos of each, comparing cocktail drinks. Is this menu still relevant to the original desire of the group?
It’s not that bars aren’t a good choice, it’s that Yelp substituted the group’s original question (“where can we go to keep talking?”) with a different question (“what’s a bar with good photos of cocktails?”) all by shaping the menu.
Moreover, the group falls for the illusion that Yelp’s menu represents a complete set of choices for where to go. While looking down at their phones, they don’t see the park across the street with a band playing live music. They miss the pop-up gallery on the other side of the street serving crepes and coffee. Neither of those show up on Yelp’s menu.

Yelp subtly reframes the group’s need “where can we go to keep talking?” in terms of photos of cocktails served.
The more choices technology gives us in nearly every domain of our lives (information, events, places to go, friends, dating, jobs), the more we assume that our phone is always the most empowering and useful menu to pick from. Is it?
The “most empowering” menu is different than the menu that has the most choices
But when we blindly surrender to the menus we’re given, it’s easy to lose track of the difference:
·         “Who’s free tonight to hang out?” becomes a menu of most recent people who texted us (who we could ping).
·         “What’s happening in the world?” becomes a menu of news feed stories.
·         “Who’s single to go on a date?” becomes a menu of faces to swipe on Tinder (instead of local events with friends, or urban adventures nearby).
·         “I have to respond to this email.” becomes a menu of keys to type a response (instead of empowering ways to communicate with a person).

All user interfaces are menus. What if your email client gave you empowering choices of ways to respond, instead of “what message do you want to type back?” (Design by Tristan Harris)
When we wake up in the morning and turn our phone over to see a list of notifications, it frames the experience of “waking up in the morning” around a menu of “all the things I’ve missed since yesterday.”

A list of notifications when we wake up in the morning. How empowering is this menu of choices when we wake up? Does it reflect what we care about?
By shaping the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the closer we pay attention to the options we’re given, the more we’ll notice when they don’t actually align with our true needs.
Hijack #2: Put a Slot Machine In a Billion Pockets

If you’re an app, how do you keep people hooked? Turn yourself into a slot machine.
The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?

How often do you check your email per day?
One major reason why is the #1 psychological ingredient in slot machines: intermittent variable rewards.
If you want to maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.
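To make the mechanism concrete, here is a minimal sketch (in Python, purely illustrative and not any real app’s code) of an intermittent variable reward: the action is identical every time, but the payoff arrives on an unpredictable fraction of attempts. The pull() function and the 30% reward rate are assumptions chosen for the example.

```python
import random

def pull(reward_probability=0.3):
    """One 'lever pull': check the phone, refresh the feed, swipe a face."""
    # Sometimes there's an enticing reward, sometimes nothing at all.
    if random.random() < reward_probability:
        return "reward"  # a match, a like, a new message
    return None          # nothing this time

if __name__ == "__main__":
    random.seed(1)
    outcomes = [pull() for _ in range(20)]
    rewarded_pulls = [i + 1 for i, outcome in enumerate(outcomes) if outcome]
    # The wins land on an unpredictable scattering of pulls, which is what
    # makes this loop harder to walk away from than a fixed, predictable one.
    print("Rewarded on pulls:", rewarded_pulls)
```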
Does this effect really work on people? Yes. Slot machines make more money in the United States than baseball, movies, and theme parks combined. Relative to other kinds of gambling, people get ‘problematically involved’ with slot machines 3–4x faster according to NYU professor Natasha Dow Schüll, author of Addiction by Design.

But here’s the unfortunate truth: several billion people have a slot machine in their pocket:
·         When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got.
·         When we pull to refresh our email, we’re playing a slot machine to see what new email we got.
·         When we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next.
·         When we swipe faces left/right on dating apps like Tinder, we’re playing a slot machine to see if we got a match.
·         When we tap the # of red notifications, we’re playing a slot machine to see what’s underneath.
Apps and websites sprinkle intermittent variable rewards all over their products because it’s good for business.
But in other cases, slot machines emerge by accident. For example, there is no malicious corporation behind all of email who consciously chose to make it a slot machine. No one profits when millions check their email and nothing’s there. Neither did Apple and Google’s designers want phones to work like slot machines. It emerged by accident.
But now companies like Apple and Google have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones with better design. For example, they could empower people to set predictable times during the day or week for when they want to check “slot machine” apps, and correspondingly adjust when new messages are delivered to align with those times.
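As a rough illustration of what “predictable delivery” could look like (my own sketch, not an existing Apple or Google feature; the delivery times and the receive / deliver_if_due names are hypothetical), a client could hold incoming messages and release them only at times the user chose in advance:

```python
from datetime import datetime, time

# Hypothetical user-chosen check-in times; not a real platform setting.
DELIVERY_TIMES = [time(8, 0), time(12, 30), time(18, 0)]

pending = []  # messages held back instead of being pushed immediately

def receive(message: str) -> None:
    """Queue an incoming message rather than notifying the user right away."""
    pending.append(message)

def deliver_if_due(now: datetime, window_minutes: int = 5) -> list:
    """Release the queued batch only near one of the chosen delivery times."""
    minutes_now = now.hour * 60 + now.minute
    due = any(abs(minutes_now - (t.hour * 60 + t.minute)) <= window_minutes
              for t in DELIVERY_TIMES)
    if not due:
        return []
    batch = list(pending)
    pending.clear()
    return batch

# Messages arrive during the day but only surface at a chosen time.
receive("New comment on your photo")
receive("Newsletter: weekly digest")
print(deliver_if_due(datetime(2018, 9, 22, 9, 45)))   # [] -- not a delivery time
print(deliver_if_due(datetime(2018, 9, 22, 12, 30)))  # both messages, as one batch
```

The point isn’t the scheduling code itself but the shift it represents: rewards arrive on the user’s terms, in predictable batches, rather than at whatever random moment pulls hardest on their attention.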
Hijack #3: Fear of Missing Something Important (FOMSI)

Another way apps and websites hijack people’s minds is by inducing a “1% chance you could be missing something important.”
If I convince you that I’m a channel for important information, messages, friendships, or potential sexual opportunities, it will be hard for you to turn me off, unsubscribe, or remove your account, because (aha, I win) you might miss something important:
·         This keeps us subscribed to newsletters even after they haven’t delivered recent benefits (“what if I miss a future announcement?”)
·         This keeps us “friended” to people with whom we haven’t spoken in ages (“what if I miss something important from them?”)
·         This keeps us swiping faces on dating apps, even when we haven’t met up with anyone in a while (“what if I miss that one hot match who likes me?”)
·         This keeps us using social media (“what if I miss that important news story or fall behind what my friends are talking about?”)
But if we zoom into that fear, we’ll discover that it’s unbounded: we’ll always miss something important at any point when we stop using something.
·         There are magic moments on Facebook we’ll miss by not using it for the 6th hour (e.g. an old friend who’s visiting town right now).
·         There are magic moments we’ll miss on Tinder (e.g. our dream romantic partner) by not swiping our 700th match.
·         There are emergency phone calls we’ll miss if we’re not connected 24/7.
But living moment to moment with the fear of missing something isn’t how we’re built to live.
And it’s amazing how quickly, once we let go of that fear, we wake up from the illusion. When we unplug for more than a day, unsubscribe from those notifications, or go to Camp Grounded, the concerns we thought we’d have don’t actually happen.
We don’t miss what we don’t see.
The thought, “what if I miss something important?” is generated in advance of unplugging, unsubscribing, or turning off, not after. Imagine if tech companies recognized that, and helped us proactively tune our relationships with friends and businesses in terms of what we define as “time well spent” for our lives, instead of in terms of what we might miss.
CONTINUES IN PART II
