How Technology Hijacks People’s Minds — from a Magician and Google’s Design Ethicist

Estimated reading time: 15 minutes

I’m an expert on how technology hijacks our psychological vulnerabilities. That’s why I spent the last three years as Google’s Design Ethicist, studying how to design things in a way that defends a billion people’s minds from getting hijacked.

When using technology, we often focus optimistically on all the things it does for us. But I want to show you where it might do the opposite.

Where does technology exploit our minds’ weaknesses?

I learned to think this way when I was a magician. Magicians start by looking for blind spots, edges, vulnerabilities and limits of people’s perception, so they can influence what people do without them even realizing it. Once you know how to push people’s buttons, you can play them like a piano.

That’s me performing sleight of hand magic at my mother’s birthday party

And this is exactly what product designers do to your mind. They play your psychological vulnerabilities (consciously and unconsciously) against you in the race to grab your attention.

I want to show you how they do it.

Hijack #1: If You Control the Menu, You Control the Choices


Western Culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, while we ignore how we’re manipulated upstream by limited menus we didn’t choose.

This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose. I can’t emphasize enough how deep this insight is.

When people are given a menu of choices, they rarely ask:

  • “what’s not on the menu?”
  • “why am I being given these options and not others?”
  • “do I know the menu provider’s goals?”
  • “is this menu empowering for my original need, or are the choices actually a distraction?” (e.g. an overwhelming array of toothpastes)

How empowering is this menu of choices for the need, “I ran out of toothpaste”?

For example, imagine you’re out with friends on a Tuesday night and want to keep the conversation going. You open Yelp to find nearby recommendations and see a list of bars. The group turns into a huddle of faces staring down at their phones comparing bars. They scrutinize the photos of each, comparing cocktail drinks. Is this menu still relevant to the original desire of the group?

It’s not that bars aren’t a good choice, it’s that Yelp substituted the group’s original question (“where can we go to keep talking?”) with a different question (“what’s a bar with good photos of cocktails?”) all by shaping the menu.

Moreover, the group falls for the illusion that Yelp’s menu represents a complete set of choices for where to go. While looking down at their phones, they don’t see the park across the street with a band playing live music. They miss the pop-up gallery on the other side of the street serving crepes and coffee. Neither of those shows up on Yelp’s menu.

Yelp subtly reframes the group’s need “where can we go to keep talking?” in terms of photos of cocktails served.

The more choices technology gives us in nearly every domain of our lives (information, events, places to go, friends, dating, jobs) — the more we assume that our phone is always the most empowering and useful menu to pick from. Is it?

The “most empowering” menu is different than the menu that has the most choices. But when we blindly surrender to the menus we’re given, it’s easy to lose track of the difference:

  • “Who’s free tonight to hang out?” becomes a menu of most recent people who texted us (who we could ping).
  • “What’s happening in the world?” becomes a menu of news feed stories.
  • “Who’s single to go on a date?” becomes a menu of faces to swipe on Tinder (instead of local events with friends, or urban adventures nearby).
  • “I have to respond to this email.” becomes a menu of keys to type a response (instead of empowering ways to communicate with a person).


All user interfaces are menus. What if your email client gave you empowering choices of ways to respond, instead of “what message do you want to type back?” (Design by Tristan Harris)

When we wake up in the morning and turn our phone over to see a list of notifications — it frames the experience of “waking up in the morning” around a menu of “all the things I’ve missed since yesterday.”

A list of notifications when we wake up in the morning — how empowering is this menu of choices when we wake up? Does it reflect what we care about? (credit to Joe Edelman)

By shaping the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the closer we pay attention to the options we’re given, the more we’ll notice when they don’t actually align with our true needs.

Hijack #2: Put a Slot Machine In a Billion Pockets

If you’re an app, how do you keep people hooked? Turn yourself into a slot machine.

The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?

How often do you check your email per day?

One major reason why is the #1 psychological ingredient in slot machines: intermittent variable rewards.

If you want to maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.
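The mechanism is simple enough to sketch in a few lines. The following toy simulation is my own illustration (the 30% payoff probability and the 150-check count are assumptions for the sake of the example, not figures from research):

```python
import random

def pull_lever(reward_probability, rng):
    """One 'pull' of the lever: the reward arrives unpredictably,
    like checking a phone for notifications."""
    return rng.random() < reward_probability

def simulate_checks(n_checks, reward_probability=0.3, seed=0):
    """Count how many of n_checks actually pay off."""
    rng = random.Random(seed)
    return sum(pull_lever(reward_probability, rng) for _ in range(n_checks))

# Of 150 daily checks, only some fraction pays off; the unpredictability
# of *which* ones is exactly what keeps us pulling.
hits = simulate_checks(150)
```

The point of the sketch is that the designer only controls one knob, the reward probability, and addictiveness peaks when the payoff stays unpredictable rather than guaranteed.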

Does this effect really work on people? Yes. Slot machines make more money in the United States than baseball, movies, and theme parks combined. Relative to other kinds of gambling, people get ‘problematically involved’ with slot machines 3–4x faster, according to NYU professor Natasha Dow Schüll, author of Addiction by Design.

But here’s the unfortunate truth — several billion people have a slot machine in their pocket:

  • When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got.
  • When we pull to refresh our email, we’re playing a slot machine to see what new email we got.
  • When we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next.
  • When we swipe faces left/right on dating apps like Tinder, we’re playing a slot machine to see if we got a match.
  • When we tap the # of red notifications, we’re playing a slot machine to see what’s underneath.

Apps and websites sprinkle intermittent variable rewards all over their products because it’s good for business.

But in other cases, slot machines emerge by accident. For example, there is no malicious corporation behind all of email who consciously chose to make it a slot machine. No one profits when millions check their email and nothing’s there. Neither did Apple and Google’s designers want phones to work like slot machines. It emerged by accident.

But now companies like Apple and Google have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones with better design. For example, they could empower people to set predictable times during the day or week for when they want to check “slot machine” apps, and correspondingly adjust when new messages are delivered to align with those times.
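The fix proposed above (batching delivery into user-chosen windows) can be sketched roughly. The delivery times, names, and queue below are my own illustrative assumptions, not any real platform API:

```python
from datetime import datetime, time

# Assumed user preference: notifications delivered only at these times.
DELIVERY_TIMES = [time(9, 0), time(13, 0), time(18, 0)]

def next_delivery(now, delivery_times=DELIVERY_TIMES):
    """Return today's next scheduled delivery time, or None if all
    of today's windows have passed (the queue rolls over to tomorrow)."""
    for t in sorted(delivery_times):
        if now.time() < t:
            return t
    return None

queue = []

def notify(message, now):
    """Queue the message instead of interrupting immediately;
    it will be flushed at the next scheduled window."""
    queue.append(message)
    return next_delivery(now)
```

The design choice here is the essay’s own point restated in code: a message held until a known time is a predictable reward, not a lever-pull.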

Hijack #3: Fear of Missing Something Important (FOMSI)

Another way apps and websites hijack people’s minds is by inducing a “1% chance you could be missing something important.”

If I convince you that I’m a channel for important information, messages, friendships, or potential sexual opportunities — it will be hard for you to turn me off, unsubscribe, or remove your account — because (aha, I win) you might miss something important:

  • This keeps us subscribed to newsletters even after they haven’t delivered recent benefits (“what if I miss a future announcement?”)
  • This keeps us “friended” to people with whom we haven’t spoken in ages (“what if I miss something important from them?”)
  • This keeps us swiping faces on dating apps, even when we haven’t even met up with anyone in a while (“what if I miss that one hot match who likes me?”)
  • This keeps us using social media (“what if I miss that important news story or fall behind what my friends are talking about?”)

But if we zoom into that fear, we’ll discover that it’s unbounded: we’ll always miss something important at any point when we stop using something.

  • There are magic moments on Facebook we’ll miss by not using it for the 6th hour (e.g. an old friend who’s visiting town right now).
  • There are magic moments we’ll miss on Tinder (e.g. our dream romantic partner) by not swiping our 700th match.
  • There are emergency phone calls we’ll miss if we’re not connected 24/7.

But living moment to moment with the fear of missing something isn’t how we’re built to live.

And it’s amazing how quickly, once we let go of that fear, we wake up from the illusion. When we unplug for more than a day, unsubscribe from those notifications, or go to Camp Grounded — the concerns we thought we’d have don’t actually happen.

We don’t miss what we don’t see.

The thought, “what if I miss something important?” is generated in advance of unplugging, unsubscribing, or turning off — not after. Imagine if tech companies recognized that, and helped us proactively tune our relationships with friends and businesses in terms of what we define as “time well spent” for our lives, instead of in terms of what we might miss.

Hijack #4: Social Approval

 Easily one of the most persuasive things a human being can receive.

We’re all vulnerable to social approval. The need to belong, to be approved or appreciated by our peers is among the highest human motivations. But now our social approval is in the hands of tech companies (like when we’re tagged in a photo).

When I get tagged by my friend Marc (above), I imagine him making a conscious choice to tag me. But I don’t see how a company like Facebook orchestrated him doing that in the first place.

Facebook, Instagram or SnapChat can manipulate how often people get tagged in photos by automatically suggesting all the faces people should tag (e.g. by showing a box with a 1-click confirmation, “Tag Tristan in this photo?”).

So when Marc tags me, he’s actually responding to Facebook’s suggestion, not making an independent choice. But through design choices like this, Facebook controls the multiplier for how often millions of people experience their social approval on the line.

Facebook uses automatic suggestions like this to get people to tag more people, creating more social externalities and interruptions.

The same happens when we change our main profile photo — Facebook knows that’s a moment when we’re vulnerable to social approval: “what do my friends think of my new pic?” Facebook can rank this higher in the news feed, so it sticks around for longer and more friends will like or comment on it. Each time they like or comment on it, I’ll get pulled right back.

Everyone innately responds to social approval, but some demographics (teenagers) are more vulnerable to it than others. That’s why it’s so important to recognize how powerful designers are when they exploit this vulnerability.

Hijack #5: Social Reciprocity (Tit-for-tat)

  • You do me a favor, now I owe you one next time.
  • You say, “thank you”— I have to say “you’re welcome.”
  • You send me an email— it’s rude not to get back to you.
  • You follow me — it’s rude not to follow you back. (especially for teenagers)

We are vulnerable to needing to reciprocate others’ gestures. But as with Social Approval, tech companies now manipulate how often we experience it.

In some cases, it’s by accident. Email, texting and messaging apps are social reciprocity factories. But in other cases, companies exploit this vulnerability on purpose.

LinkedIn is the most obvious offender. LinkedIn wants as many people creating social obligations for each other as possible, because each time they reciprocate (by accepting a connection, responding to a message, or endorsing someone back for a skill) they have to come back to the site, where LinkedIn can get them to spend more time.

Like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality, they likely unconsciously responded to LinkedIn’s list of suggested contacts. In other words, LinkedIn turns your unconscious impulses (to “add” a person) into new social obligations that millions of people feel obligated to repay. All while they profit from the time people spend doing it.

Imagine millions of people getting interrupted like this throughout their day, running around like chickens with their heads cut off, reciprocating each other — all designed by companies who profit from it.

Welcome to social media.

After accepting an endorsement, LinkedIn takes advantage of your bias to reciprocate by offering *four* additional people for you to endorse in return.

Imagine if technology companies had a responsibility to minimize social reciprocity. Or if there were an “FDA for Tech” that monitored when technology companies abused these biases?

Hijack #6: Bottomless bowls, Infinite Feeds, and Autoplay

                YouTube autoplays the next video after a countdown

Another way to hijack people is to keep them consuming things, even when they aren’t hungry anymore.

How? Easy. Take an experience that was bounded and finite, and turn it into a bottomless flow that keeps going.

Cornell professor Brian Wansink demonstrated this in his study showing you can trick people into continuing to eat soup by giving them a bottomless bowl that automatically refills as they eat. With bottomless bowls, people ate 73% more calories than those with normal bowls, and underestimated how many calories they ate by 140 calories.

Tech companies exploit the same principle. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and purposely eliminate any reason for you to pause, reconsider or leave.

It’s also why video and social media sites like Netflix, YouTube or Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice (in case you won’t). A huge portion of traffic on these websites is driven by autoplaying the next thing.

Facebook autoplays the next video after a countdown

Tech companies often claim that “we’re just making it easier for users to see the video they want to watch” when they are actually serving their business interests. And you can’t blame them, because increasing “time spent” is the currency they compete for.

Instead, imagine if technology companies empowered you to consciously bound your experience to align with what would be “time well spent” for you. Not just bounding the quantity of time you spend, but the qualities of what would be “time well spent.”

Hijack #7: Instant Interruption vs. “Respectful” Delivery

Companies know that messages that interrupt people immediately are more persuasive at getting people to respond than messages delivered asynchronously (like email or any deferred inbox).

Given the choice, Facebook Messenger (or WhatsApp, WeChat or SnapChat for that matter) would prefer to design their messaging system to interrupt recipients immediately (and show a chat box) instead of helping users respect each other’s attention.

In other words, interruption is good for business.

It’s also in their interest to heighten the feeling of urgency and social reciprocity. For example, Facebook automatically tells the sender when you “saw” their message, instead of letting you avoid disclosing whether you read it (“now that you know I’ve seen the message, I feel even more obligated to respond”). By contrast, Apple more respectfully lets users toggle “Read Receipts” on or off.

The problem is, while messaging apps maximize interruptions in the name of business, it creates a tragedy of the commons that ruins global attention spans and causes billions of interruptions every day. This is a huge problem we need to fix with shared design standards (potentially, as part of Time Well Spent).

Hijack #8: Bundling Your Reasons with Their Reasons

Another way apps hijack you is by taking your reasons for visiting the app (to perform a task) and making them inseparable from the app’s business reasons (maximizing how much we consume once we’re there).

For example, in the physical world of grocery stores, the #1 and #2 most popular reasons to visit are pharmacy refills and buying milk. But grocery stores want to maximize how much people buy, so they put the pharmacy and the milk at the back of the store.

In other words, they make the thing customers want (milk, pharmacy) inseparable from what the business wants. If stores were truly organized to support people, they would put the most popular items in the front.

Tech companies design their websites the same way. For example, when you want to look up a Facebook event happening tonight (your reason), the Facebook app doesn’t let you access it without first landing on the news feed (their reason), and that’s on purpose. Facebook wants to convert every reason you have for using Facebook into its reason, which is to maximize the time you spend consuming things.

In an ideal world, apps would always give you a direct way to get what you want separately from what they want.

Imagine a digital “bill of rights” outlining design standards that forced the products billions of people use to support empowering ways to navigate towards their goals.

Hijack #9: Inconvenient Choices

We’re told that it’s enough for businesses to “make choices available.”

  • “If you don’t like it you can always use a different product.”
  • “If you don’t like it, you can always unsubscribe.”
  • “If you’re addicted to our app, you can always uninstall it from your phone.”

Businesses naturally want to make the choices they want you to make easier, and the choices they don’t want you to make harder. Magicians do the same thing. You make it easier for a spectator to pick the thing you want them to pick, and harder to pick the thing you don’t.

For example, let’s say you “make a free choice” to cancel your digital subscription. But instead of just doing it when you hit “Cancel Subscription,” the company forces you to call a phone number that’s only open at certain times.

 NYTimes claims it’s giving a free choice to cancel your account

Instead of viewing the world in terms of the availability of choices, we should view the world in terms of the friction required to enact choices.

Imagine a world where choices were labeled with how difficult they were to fulfill (like coefficients of friction) and there was an FDA for Tech that labeled these difficulties and set standards for how easy navigation should be.

Hijack #10: Forecasting Errors, “Foot in the Door” strategies

Facebook promises an easy choice to “See Photo.” Would we still click if it gave the true price tag?

People don’t intuitively forecast the true cost of a click when it’s presented to them. Sales people use “foot in the door” techniques by asking for a small innocuous request to begin with (“just one click”), and escalating from there (“why don’t you stay awhile?”). Virtually all engagement websites use this trick.

Imagine if web browsers and smartphones, the gateways through which people make these choices, were truly watching out for people and helped them forecast the consequences of clicks (based on real data about what those clicks actually cost most people).

That’s why I add “Estimated reading time” to the top of my posts. When you put the “true cost” of a choice in front of people, you’re treating your users or audience with dignity and respect.
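Putting a “true cost” label on a read is a one-line computation. Here is a minimal sketch; the 200 words-per-minute figure is an assumed average reading speed, not a number from this essay:

```python
def estimated_reading_minutes(text, words_per_minute=200):
    """Surface the 'true cost' of a read before the click."""
    words = len(text.split())
    # Ceiling division, so even a short note shows at least 1 minute.
    return max(1, -(-words // words_per_minute))

# A 3,000-word essay at 200 wpm yields "Estimated reading time: 15 minutes".
minutes = estimated_reading_minutes("word " * 3000)
```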

In a Time Well Spent internet, choices would be framed in terms of projected cost and benefit, so people were empowered to make informed choices.

TripAdvisor uses a “foot in the door” technique by asking for a single click review (“How many stars?”) while hiding the three page form behind the click.

Summary And How We Can Fix This

Are you upset that technology is hijacking your agency? I am too. I’ve listed a few techniques but there are literally thousands. Imagine whole bookshelves, seminars, workshops and trainings that teach aspiring tech entrepreneurs techniques like this. They exist.

The ultimate freedom is a free mind, and we need technology to be on our team to help us live, feel, think and act freely.

We need our smartphones, notification screens and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first. People’s time is valuable. And we should protect it with the same rigor as privacy and other digital rights.

Tristan Harris was a Product Philosopher at Google until 2016, studying how technology affects a billion people’s attention, wellbeing and behavior.

For more information and to get involved, see the Time Well Spent movement. This piece is cross-posted on Medium.

Tech Companies Design Your Life, Here’s Why You Should Care

Four years ago, I sold my company to Google and joined the ranks there. I spent my last three years there as Product Philosopher, looking at the profound ways the design of screens shape billions of human lives – and asking what it means for them to do so ethically and responsibly.

What I came away with is that something’s not right with how our screens are designed, and I’m writing this to help you understand why you should care, and what you can do about it.

I shouldn’t have to cite statistics about the central role screens play in our lives. Billions of us turn to smartphones every day. We wake up with them. We fall asleep with them. You’re looking at one right now.

Of course, new technologies always reshape society, and it’s always tempting to worry about them solely for this reason. Socrates worried that the technology of writing would “create forgetfulness in the learners’ souls, because they [would] not use their memories.” We worried that newspapers would make people stop talking to each other on the subway. We worried that we would use television to “amuse ourselves to death.”


“And see!” people say. “Nothing bad happened!” Isn’t humanity more prosperous, more technically sophisticated, and better connected than ever? Is it really that big of a problem that people spend so much time staring at their smartphones? Isn’t it just another cultural shift, like all the others? Won’t we just adapt?

Invisibility of the New Normal

I don’t think so. What’s missing from this perspective is that all these technologies (books, television, radio, newspapers) did change everything about society; we just don’t see it. They replaced our old menus of choices with new ones. Each new menu eventually became the new normal – “the way things are” – and, after our memories of old menus had faded into the past, the new menus became “the way things have always been.”


Ask a fish about water and they’ll respond, “what’s water?”

Consider that the average American now watches more than 5.5 hours of television per day. Regardless of whether you think TV is good or bad, hundreds of millions of people spend 30% of their waking hours watching it. It’s hard to overstate the vast consequences of this shift – for the blood flows of millions of people, for our understanding of reality, for the relational habits of families, for the strategies and outcomes of political campaigns. Yet for those who live with them day-to-day, they are invisible.

So what best describes the nature of what smartphones are “doing” to us?

A New “Perfect” Choice on Life’s Menu

If I had to summarize it, it’s this: Our phone puts a new choice on life’s menu, in any moment, that’s “sweeter” than reality.

If, at any moment, reality gets dull or boring, our phone offers something more pleasurable, more productive and even more educational than whatever reality gives us.

And this new choice fits into any moment. Our phone offers 5-second choices like “checking email” that feel better than waiting in line. And it offers 30-minute choices like a podcast that will teach you that thing you’ve been dying to learn, which feels better than a 30-minute walk in silence.

Once you see your phone this way, wouldn’t you turn to it more often? It always happens this way: when new things fill our needs better than the old, we switch:

  • When cheaper, faster to prepare food appears, we switch: Packaged foods.
  • When more accurate search engines appear, we switch: Google.
  • When cheaper, faster forms of transportation appear, we switch: Uber.


So it goes with phones.

But it also changes us on the inside. We grow less and less patient for reality as it is, especially when it’s boring or uncomfortable. We come to expect more from the world, more rapidly. And because reality can’t live up to our expectations, it reinforces how often we want to turn to our screens. A self-reinforcing feedback loop.

And because of the attention economy, every product will only get more persuasive over time. Facebook must become more persuasive if it wants to compete with YouTube and survive. YouTube must become more persuasive if it wants to compete with Facebook. And we’re not just talking about ‘cheap’ amusement (aka cat videos). These products will only get better at giving us choices that make every bone in our body say, “yeah I want that!”

So what’s wrong with this? If the entire attention economy is working to fill us up with more perfect-feeling things to spend time on, things that outcompete sitting with the discomfort of ourselves or our surroundings, shouldn’t that be fantastic?


Clearly something is missing from this picture. But what is it?

Maybe it’s that “filling people up,” even with incredible choices on screens, somehow doesn’t add up to a life well lived. Or that those choices weren’t what we wished we’d been persuaded to do in the bigger sense of our lives.

With design as it is today, screens threaten our fundamental agency. Maybe we are “choosing,” but we are choosing from persuasive menus driven by companies who have different goals than ours.

And that compels us to ask: what are our goals, and how do we want to spend our time? There are as many “good lives” as there are people, but our technology (and the attention economy) doesn’t really seem to be on our team, giving us the agency to live according to them.

A Whole New Persuasive World

And it’s about to get a lot worse. Virtual Reality and Augmented Reality will offer whole new immersive realities that are even more persuasive than physical reality.


When you could have sex with the person of your dreams, or fly through jungles in the Amazon rainforest while looking over at your best friend flying next to you, who would want to stick with reality?

By the way, this isn’t your usual “look, VR is coming!” prediction. This is the real deal. Facebook recently spent $2 billion to buy Oculus, and hopes to put a headset in every home this holiday season. Just like the late 1980s, when suddenly everyone you knew had a Nintendo.

Acknowledging the Problem

So we have a fundamental misalignment between what the attention economy is competing to produce (more perfect, persuasive choices that fit into any moment), the design of our phones, and the aspirations people have for their lives (their definition of “the good life”).


So what’s missing from the design of our phones? I like to use the metaphor of ergonomics. When you think of ergonomics, you might think of boring things like how a cup fits into someone’s hand, but it’s way more than that.

If regular design is about how we want things to work, ergonomics is concerned with failure modes and extremes: how things break under repetition, stress or other limits. And the goal of ergonomics is to create an alignment between those limits, and the goals people have for how they want to use it.


For example, an ergonomically designed coffee mug aligns the natural fatigue of forearm muscles during use (as a person “lifts” it to sip) with how frequently people want to use it, so they still can lift it successfully with repetition.

What does this have to do with phones?

Our minds urgently need a new “ergonomics,” based on the mind’s limited capacities, biases, fatigue curves and the ways it forms habits. The attention economy tears our minds apart. With its onslaught of never-ending choices, never-ending supply of relationships and obligations, the attention economy bulldozes the natural shape of our physical and psychological limits and turns impulses into bad habits.

Just like the food industry manipulates our innate biases for salt, sugar and fat with perfectly engineered combinations, the tech industry bulldozes our innate biases for Social Reciprocity (we’re built to get back to others), Social Approval (we’re built to care what others think of us), Social Comparison (how we’re doing with respect to our peers) and Novelty-seeking (we’re built to seek surprises over the predictable).

Millions of years of evolution did a great job giving us genes to care about how others perceive us. But Facebook bulldozes those biases, by forcing us to deal with how thousands of people perceive us.

This isn’t to say that phones today aren’t designed ergonomically; they are just ergonomic for a narrow scope of goals:

  • for a single user (holding the phone)
  • for single tasks (opening an app)
  • for individual choices

And a narrow scope of human physical limits:

  • how far our thumb has to reach to tap an app
  • how loud the phone must vibrate for our ear to hear it

So what if we expanded the scope of ergonomics for a more holistic set of human goals:

  • a holistic sense of a person
  • holistic sense of how they want to spend their time (and goals)
  • holistic sense of their relationships (interpersonal & social choices)
  • an ability to make holistic choices (including opportunity costs & externalities)
  • an ability to reflect, before and after

…and what if we aligned these goals with a more holistic set of our mental, social and emotional limits?

A New Kind of Ergonomics

Let’s call this new kind of ergonomics “Holistic Ergonomics”. Holistic Ergonomics recognizes our holistic mental and emotional limits [vulnerabilities, fatigue and ways our minds form habits] and aligns them with the holistic goals we have for our lives (not just the single tasks). Holistic Ergonomics is built to give us back agency in an increasingly persuasive attention economy.

Joe Edelman and I have taught design workshops on this: designing to empower people’s agency.

It includes an interpersonal ergonomics, to “align” our social psychological instincts with how and when we want to make ourselves available to others (like in my TED talk), so that we can reclaim agency over how we want to relate to others.

Just as an ergonomic coffee mug is safe to use under repetition, over and over again, without causing harm, in a Time Well Spent world our phones would be designed with Holistic Ergonomics: even under repetition, over and over again, our phones would not cause harm to ourselves or others. They would become safe to live by. They would support our Agency.

How to Change the Game


Right now, two companies are responsible for the primary screens that a billion people live by. Apple and Google make the two dominant smartphone platforms. Facebook and Microsoft make leading Virtual and Augmented Reality platforms, Oculus and HoloLens.

You might think that it’s against the business models of Apple and Google to facilitate people’s agency, which might include making it easier to spend time off the screen, and use apps less. But it’s not.

Apple and Google, like all companies, respond to what consumers demand.

When privacy became important to you, they responded. They developed new privacy and security features, and it sparked a whole new public conversation and debate. It’s now among the most discussed concerns about technology in the media.

When organic food became important to consumers, companies responded too: Walmart added it to its stores.

We need to do the same thing with this issue. Until now, this experience of distraction, social media, and the vague sense that we don’t feel good when we use our phones for too long has had nothing to rally behind. It’s too diffuse. We receive so many incredible benefits from tech, but we’ve also been feeling like we’ve been losing ourselves, and our humanity.

But we’re naming it now.

What’s at stake is our Agency. Our ability to live the lives we want to live, choose the way we want to choose, and relate to others the way we want to relate to them – through technology. This is a design problem, not just a personal responsibility problem.

If you want your Agency, you need to tell these companies that that’s what you want from them – not just another shiny new phone that exploits your psychological vulnerabilities. Tell them you want your Agency back, and you want their help spending your time the way you want to, and they will respond.

I hope this helps spark that bigger conversation.

Distracted in 2016? Reboot Your Phone with Mindfulness

This piece has also been cross-posted to Medium.

People often tell me that starting a movement for a whole new type of technology that’s built to help people spend their time well sounds nice (and naïvely ambitious), but “what are the things I can do to have a more mindful relationship with my devices right now?”

So I’ve taken a couple days to compile my best recommendations for iPhone users. The tips below are meant to:

  • Minimize Compulsive Checking & Phantom Buzzes
  • Minimize Fear of Missing Something Important
  • Minimize Unconscious Use
  • Minimize “Leaky” Interactions (“leaking out” into something unintended)
  • Minimize Unnecessary Psychological Concerns generated by the screen.

Note: These recommendations are for people who live by their smartphone (not casual users), and they’re based on findings from psychology and behavioral science.

Tip #1: Create Your “Essential” Home Screen 

We check our phones 150 times a day, and each time we unlock one to see that grid of apps and red badges signaling everything we’ve missed, it immediately triggers a whole set of thoughts, feelings and concerns in our minds.


What is Time Well Spent (Part I): Design Distinctions

Organic, LEED Certified, Time Well Spent(?)

I’m often asked, so what is Time Well Spent software? How do we define it? What does it mean to design for it?

Time Well Spent is a consumer movement to shift what we want from the companies who make our technology. Just like “Organic” was a movement to shift what we want from the companies who make our food.

With Time Well Spent, we want technology that cares about helping us spend our time, and our lives, well – not seducing us into maximum screen time, always-on interruptions or distractions.

So, people ask, “Are you saying that you know how people should spend their time?” Of course not. Let’s first establish what Time Well Spent isn’t:

  • It is not a universal, normative view of how people should spend their time
  • It is not saying that screen time is bad, or that we should turn it all off.
  • It is not saying that specific categories of apps (like social media or games) are bad.

I thought I’d put some of the possible differences of Time Well Spent into a chart – to distinguish how a company would design an app, website or service differently:

Stance
  • Before: “I care about engaging users as much as possible. It’s up to them to stop when they want to stop, up to them to unsubscribe when they want to unsubscribe, and up to them to uninstall my app if they find it too distracting. How much they get out of using my product is up to them.”
  • After: “I care about serving people’s ideal lives and helping them spend time well. It’s my responsibility to help people continually get the most out of what I offer, including helping them stop when they want to stop and unsubscribe when they no longer benefit.”

Ability to Disconnect
  • Before: “Users want to be connected and reachable 24/7, all the time. If they want to disconnect, they can always leave their phone at home or uninstall my app.” (All or Nothing choice)
  • After: “Users have a right to set boundaries between their work and personal life, bound their use according to their preferences, and set aside time to focus. I will design to empower them to create these spaces.” (Choice within being Connected)

Quality of Attention
  • Before: “Frequent, brief, bursty, interruptive use is no better or worse than long, continuous use. If interruptive use is what gets people to click, that’s what I’ll maximize.”
  • After: “People’s attention is sacred. I will design to help people attend to one thing at a time, and to minimize task-switching, interruptions, and other unnecessary choices.”

Measuring Success
  • Before: “I measure success in the number of transactions (clicks, shares, visits, swipes, sales, rooms booked, messages sent).”
  • After: “I measure success in net positive contributions to people’s lives (time reading articles they were glad to spend, places they were glad to stay, people they were glad to meet).”

Greenwashing
  • Before: “I talk about my product in terms of catchy one-liners about how it benefits humanity.”
  • After: “I talk about my product with humility, doubt and self-examination to see its full range of impacts more clearly, both positive and negative.”

Design Goal
  • Before: “I design to help users complete tasks and transact.”
  • After: “I take into consideration whether completing those tasks would add up to time well spent (for them), and I design to empower them to tell the difference.”

Respect
  • Before: “Users are sheep I can influence, put through conversion funnels, and get to do what I want. It’s up to them to say when they’ve had enough.”
  • After: “Users are people whose time, attention, relationships and lives I respect. I care whether I’m bringing them closer to, or further from, the life they want to live.”

Model of User Behavior
  • Before: “Users are only doing what they want and freely choose to do.”
  • After: “I deeply influence what users are doing, feeling and thinking with my design choices; I can’t not influence people’s choices.”

Influencing Psychological Instincts
  • Before: “I design to make people’s psychological instincts and biases work for me.” (I set defaults adversarially to most benefit me. Users can change them if they want.)
  • After: “I design to make people’s instincts work for them, not against them.” (I choose default settings that most benefit them.)

Minimizing Psychological Externalities
  • Before: “It’s their fault if my product adds new looping concerns, feelings of guilt, fear of missing something, or other stressful thought patterns to users’ minds.”
  • After: “It’s my responsibility to minimize psychological externalities that arise from using my product.”

Menus and Framing Effects
  • Before: “People make choices rationally. How I frame and organize choices won’t change what people choose.”
  • After: “How I frame and organize choices (sorting by recency, price, rating, or using different words) deeply influences what people choose. I will frame choices by what most empowers and matters to them in the long run.”

These distinctions also apply differently for different kinds of technologies.

  • Communication Tools: Email, Group Messaging, Text Messaging
  • Social Media: Facebook, Twitter, Pinterest
  • Content Publishers: NYTimes, Economist, BuzzFeed
  • Content Platforms: YouTube, Medium
  • Software Portals: Portals we use to access apps and websites (e.g. Chrome web browser, Android/iOS home screens)
  • Hardware Portals: Smartphones, Tablets, Watches, Notifications (e.g. iPhone, Samsung Galaxy Edge)

I’ll go into greater detail about these distinctions in future posts, along with specific examples of how they show up in current products and potential new ones!

To learn more, check out details on future meet-ups and events!

Is Technology Amplifying Human Potential, or Amusing Ourselves to Death?


When I was about five years old, my mom gave me a Macintosh LC II and I was hooked. Not to Facebook or the Internet, they didn’t exist yet. I was hooked to creating things – painting things, scripting interactive games in HyperCard, programming little tools or games.

Like the technology visionaries of the 1970s and ’80s, such as Doug Engelbart, Alan Kay and Steve Jobs, I optimistically believed computers could be “bicycles for our minds” and amplify human potential.

And they did empower us. But today, in the year 2015, “empowerment” rarely feels like my day to day experience with technology.

Instead I feel constantly lured into distractions. I get sucked endlessly into email and distracting websites. I get bulldozed by interruptive text messages and back-and-forth scheduling, or find myself scrolling a website in a trance at 1am.

I feel like I’m caught in a whirlpool of “Amusing Ourselves to Death,” as author Neil Postman predicted 30 years ago. In the book, Postman contrasts two dystopian visions of the future. George Orwell’s 1984, where power is expressed directly through Big Brother, oppressively restricting people’s freedoms. And Aldous Huxley’s Brave New World, where power is expressed indirectly, by saturating people with so many delightful distractions that they can’t see their oppression. Where people “come to adore the technologies that would undo their capacities to think.”

In Postman’s own words:


What Orwell feared were those who would ban books.
What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.

Orwell feared those who would deprive us of information.
Huxley feared those who would give us so much that we would be reduced to passivity and egoism.

Orwell feared that the truth would be concealed from us.
Huxley feared the truth would be drowned in a sea of irrelevance.

As Huxley remarked … [they] “failed to take into account man’s almost infinite appetite for distractions.”

– Neil Postman, Amusing Ourselves to Death (1985)

It’s scary how true this feels today. Huxley was concerned about what’s irresistible to our instincts. Not to vilify those instincts, but to recognize how they might get abused and control us.


Just like we have built-in gustatory instincts for salt, sugar and fat that are incredibly useful for survival on the savannah but abused by our modern food environment, Huxley knew we have built-in instincts for novelty, curiosity gaps, social approval and fear of missing something important. These instincts were useful to have on the savannah, but our media environment adversarially exploits them to keep us glued to screens.

So why is our experience of the Internet and computing going this way? Towards distraction, and away from empowerment?

It’s because we live in an attention economy.

The attention economy means that no matter what a technology company aims to create – an informative news site like the New York Times, a meditation app to help people meditate, a social network, or an addictive game – it wins by getting people to spend time. What starts as an honest competition to make useful things that people spend time on devolves into a race to the bottom of the brain stem to maximize the time we spend.

  • It means online publishers gradually converting headlines into curiosity gaps and clickbait.
  • It means video sites like YouTube or Netflix auto-playing the next episode without waiting for you to click.
  • It means businesses sending push notifications and email to make it seem like you’ll miss something important if you don’t check.

And we’re vulnerable to these mechanisms. Knowing their tricks doesn’t inoculate us against their efficacy.

The problem is, you can’t ask any business that’s in this competition not to use these tricks if its competitors are using them. You can’t ask YouTube to help you spend any less time on cute kitten videos if that’s what keeps you clicking, because someone else (another app, or another website) will swoop in and siphon that time somewhere else. Stock prices depend on keeping engagement numbers high. It’s only going to get worse as businesses compete.

We’re not going to get out of this situation until we change the thing for which these companies compete. From the currency of “Time Spent” to something else.

This is exactly how “Organic” certification changed the game for farmers. By defining and standardizing what counts as “safe” cultivation practices (no synthetic pesticides), the farmers who wanted to do what’s “good for us” no longer got undercut by farmers who used unsafe pesticides to achieve lower prices. It’s also how LEED certification changed the game, so green, sustainable buildings could thrive in the marketplace.

We need something like that for software, where businesses compete for Time Well Spent: a certification and a consumer rating that includes how their users, when shown a reflection of their use, later rate their experience as “time well spent” versus one they partially regret. Like Yelp reviews, but for experiences.
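To make the idea concrete, such a rating could aggregate users’ retrospective “was that worth it?” verdicts per app into a net score. Here is a minimal sketch; the +1/-1 scoring scheme and the app names are illustrative assumptions, not a real certification metric:

```python
from collections import defaultdict

def time_well_spent_score(ratings):
    """Net 'time well spent' score per app from retrospective user ratings.

    Each rating is (app, verdict), where verdict is +1 ("glad I spent
    that time") or -1 ("I partially regret it"). The score is the net
    fraction of positive verdicts, ranging from -1.0 (pure regret)
    to +1.0 (pure time well spent).
    """
    totals = defaultdict(lambda: [0, 0])  # app -> [net verdicts, count]
    for app, verdict in ratings:
        totals[app][0] += verdict
        totals[app][1] += 1
    return {app: net / count for app, (net, count) in totals.items()}

# Hypothetical ratings gathered after showing users a reflection of their use
ratings = [
    ("meditation_app", +1), ("meditation_app", +1), ("meditation_app", -1),
    ("infinite_feed", -1), ("infinite_feed", -1), ("infinite_feed", +1),
]
scores = time_well_spent_score(ratings)
# meditation_app: (1 + 1 - 1) / 3 ≈ +0.33; infinite_feed: (-1 - 1 + 1) / 3 ≈ -0.33
```

A score like this could then feed rankings in app stores or news feeds, the way star ratings do today.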

Imagine a world where “Time Well Spent” determines your stock price, your popularity in app stores, your ranking in news feeds, your ability to attract talented employees, your media attention, and funding possibilities. It’s a B-Corp movement for technology.

I used to believe we could just ask software designers to take on moral responsibility for how they shape the billions of minutes and hours of other people’s lives. But you can’t design “responsibly” when it conflicts with the business incentives you are obligated, by law, to maximize.

This is a long road, but we can get there by starting a new conversation. Instead of having the old conversation about self-control and waiting for cultural norms to adapt automatically, direct your friends and family to a new conversation.

Let’s set incentives to create a world where the Internet and my devices amplify human potential again, and where we can trust-fall into the whirlpool of technology and know that it is on our team to help us spend our time, and our lives, well.

For more information and ways to get involved, check out the Time Well Spent movement.

Is your web browser a credit card for your time?

The web browser isn’t just a tool that gets you from A to B, it’s also a medium that shapes the kinds of choices you make.

Introducing the Medium. And the Message

Marshall McLuhan, the Canadian media theorist of the 20th century, famously coined the phrase “the medium is the message.” His point was that if you want to understand a medium (television, radio, the Internet or the web browser) you shouldn’t look at the messages – the words, images, stories or articles presented in it. You should look at how the medium organizes perception and choices.

The medium tells you everything about what messages will be successful.

Television organizes perception and choice primarily around visual images, so…

  • Shows with explosive visual images will be more rewarded than shows with dull images
  • TV news shows with attractive broadcasters and dramatic footage will be more rewarded than news shows with unattractive broadcasters who tell stories in a soothing voice.

On radio, however, the unattractive broadcaster with a soothing voice will win.

Money is a Medium

Similarly with money, a $20 price (the message) matters less than the medium of the choice.

Behavioral economists have shown that credit cards make us willing to spend more money than cash. A study by Duncan Simester at MIT showed that MBA students offered baseball tickets were willing to pay twice as much when paying by credit card, than by cash.

Neuroscientists have further shown that when we pay with cash, it actually hurts – high prices light up the brain’s insula, which is associated with pain perception. But when we pay with credit cards, it doesn’t light up our pain centers.

Credit cards invite us to avoid feeling the pain of paying, and to forget how much money we actually have.

Cash invites us to consciously feel how much we spend.

Of course, none of this is new.

  • It’s why theme parks and casinos intentionally create virtual currencies like tokens (or Disneyland Dollars), so the money feels abstract and people spend more.
  • It’s why we tend to accidentally spend more money on vacations in foreign currencies than with our native currency (“it’s just play money, right?”)
  • It’s why restaurants are taught not to put the $ sign next to each item, and not line up all the prices to the right, so diners choose based on description and not price.

A $20 price (the message) isn’t separate from the medium it comes in. The medium is the message.

The Web Browser as a Credit Card

So what does this have to do with technology? And why does this matter at all?

Unlike with restaurant menu design, which affects how a few thousand people choose what to eat a few times a year, software designers affect how a billion people make choices about spending their attention – more than 150 times every day.

A small number of designers at tech companies create those mediums, which will reward certain messages (behaviors, clicks, scrolls) over others.

And today, web browsers are designed like credit cards. They make it easy to “swipe” the credit card for our time and take out a loan against our future selves.

They make it easy to swipe our credit card for more time than we intended, by getting us lost in an infinitely scrolling feed. They make it easy to click something we later wish we hadn’t.

And they do it for hundreds of millions of people every day:


Deep “time debt”

But it doesn’t have to be this way. Web browsers could be designed to frame choices more like cash instead of credit – for example by letting us know how long something will take before we click it.
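The “how long will this take” idea could be as simple as showing an estimated reading time next to a link before you commit to it. A toy sketch; the 250-words-per-minute figure is a commonly cited average reading speed, not something from this post, and a real browser would extract the word count from the page itself:

```python
def estimated_reading_time(text, words_per_minute=250):
    """Rough reading-time estimate a browser could show before you click.

    Counts whitespace-separated words and divides by an assumed average
    reading speed, rounding to whole minutes (minimum one minute).
    """
    words = len(text.split())
    minutes = max(1, round(words / words_per_minute))
    return f"~{minutes} min read"

# A hypothetical 1,200-word article
article = "word " * 1200
print(estimated_reading_time(article))  # "~5 min read"
```

Surfacing that estimate up front is the “cash” framing: you feel the cost of the click before you pay it, rather than discovering it afterward.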

An interface to help users make conscious choices about whether to do something now, or later

Or they could help us by allowing us to “budget” how much time we’d like to spend on various sites or apps, and frame our choices in terms of how much “cash” (time) we have left for each site.
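The budgeting idea amounts to a per-site ledger: debit time as you browse, and frame each new visit by what’s left. A minimal sketch; the site names and budget amounts are made-up examples, and a real browser feature would track the elapsed time automatically:

```python
class TimeBudget:
    """Tracks daily time budgets per site, like a cash envelope for attention."""

    def __init__(self, budgets_minutes):
        self.budgets = dict(budgets_minutes)            # site -> budgeted minutes
        self.spent = {site: 0 for site in self.budgets}  # site -> minutes used

    def spend(self, site, minutes):
        """Record time spent on a site."""
        self.spent[site] = self.spent.get(site, 0) + minutes

    def remaining(self, site):
        """Minutes left in today's budget (negative means overdrawn)."""
        return self.budgets.get(site, 0) - self.spent.get(site, 0)

    def framing(self, site):
        """The cash-like prompt a browser could show before opening a site."""
        left = self.remaining(site)
        if left > 0:
            return f"You have {left} minutes left for {site} today."
        return f"You are {-left} minutes over budget for {site} today."

budget = TimeBudget({"news.example.com": 30, "videos.example.com": 20})
budget.spend("news.example.com", 25)
print(budget.framing("news.example.com"))
# "You have 5 minutes left for news.example.com today."
```

The design choice mirrors cash versus credit: the prompt makes the spending felt at the moment of choice instead of letting it accumulate invisibly.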

But what about Conflict of Interest?

Unlike credit card companies, which profit by getting consumers to spend more money than they actually have, none of the four major web browser makers – Google (Chrome), Microsoft (Internet Explorer), Mozilla (Firefox) and Apple (Safari) – makes money by getting people to open things they don’t have time for.*

Likewise, none of the major email clients – Gmail, Y! Mail, Outlook, Apple Mail – makes money when users accidentally spend more time than they intended because of sneaky email marketing.

Email clients could reframe credit-card-like choices as cash-like choices:


You can start to see how these small, nuanced design choices, at the scale of how a billion people spend their attention, quickly become significant moral choices.

I’ll be tackling these moral questions in future posts.



*: 1) It’s true that display advertising is a source of Google revenue, but it’s insignificant compared to search revenue – equivalent to being penny-wise, pound-foolish. 2) Pages that users open in tabs but never view aren’t treated as true “impressions” by ad networks anyway.