What's Now San Francisco
November 14, 2019
A Path to Humane Technology
Human attention is finite. There are only 24 hours in a day, and a person can turn their attention to any one thing for only a portion of their waking hours. It’s a zero-sum game. Human attention is therefore valuable, and organizations will go to great lengths to capture the attention of a potential customer or voter. Tristan Harris believes that many technology companies have gone too far: they are tapping into base desires and fears hardwired into our primitive animal brains to keep people hooked on social media. The result is what he calls “human downgrading,” which is leading to a wide range of social ills, from depression to political polarization. Harris is a world expert on how technology steers us all — the former Design Ethicist at Google who left to co-found the Center for Humane Technology — and he’ll be our next featured guest at What’s Now: San Francisco.
Harris wants to use our event to make a clear diagnosis of the problem the attention economy poses for the world right now. Even if tech platforms took the radical step of turning ownership of data over to users, the human downgrading would continue. The reforms needed have more to do with how products are designed and the choices we do and don’t give users. He also wants to lay out publicly, for the first time, the humane technology principles that his organization is developing. These principles provide a roadmap for cultural change and are aimed at both established tech companies and startups, but they might also apply to any business or organization that designs user experiences and engages the public. We then want to start a conversation about how to drive change throughout the industry with the diverse range of technologists we expect will gather. As always, come early to meet your peers, and stay after to keep the discussion going over good food and drink. And if you can’t make it, you can always watch the livestream or catch the video afterward.