The Washington Post is one of the 2017 winners of the Journalism 360 Challenge.
Project Description
In 2016, 62 percent of Americans got news on social media — a source that is tailored to our interests, opinions and beliefs. Dubbed the “echo chamber,” social media makes it easier than ever to avoid news that we don’t agree with and to isolate ourselves from divergent opinions. Which makes us wonder: What happens in the brain when we read stories that affirm or contradict our opinions? Can our facial expressions reveal bias?
With augmented reality, we have an opportunity to put readers in the driver's seat and empower them to untangle this complex issue with humor, empathy and research personalized to their lives. To do this, the reader holds their phone in selfie mode and the phone camera recognizes their face, much like a Snapchat lens. Next, the reader sees a series of images or statements, some affirming their beliefs, others contradicting them, and their facial expressions are analyzed based on scientific studies. Finally, the reader learns about their own bias through annotated replays of their reactions.
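The project description doesn't specify a platform or framework, but the selfie-camera face recognition and expression analysis it describes could be prototyped with something like ARKit's face tracking on iOS. The sketch below is a minimal, hypothetical illustration of that idea: the class name, the choice of blend shapes as proxies for a smile or frown, and the recordReaction helper are all assumptions, not details from the project.

```swift
import UIKit
import ARKit

class ReactionCaptureViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking uses the front-facing (selfie) camera; only run it
        // on devices that support it.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let shapes = faceAnchor.blendShapes
            // Hypothetical scoring: blend-shape coefficients (0...1) used as
            // rough proxies for smiling or frowning while a statement is shown.
            let smile = ((shapes[.mouthSmileLeft]?.floatValue ?? 0) +
                         (shapes[.mouthSmileRight]?.floatValue ?? 0)) / 2
            let frown = ((shapes[.browDownLeft]?.floatValue ?? 0) +
                         (shapes[.browDownRight]?.floatValue ?? 0)) / 2
            let time = session.currentFrame?.timestamp ?? 0
            recordReaction(smile: smile, frown: frown, at: time)
        }
    }

    func recordReaction(smile: Float, frown: Float, at timestamp: TimeInterval) {
        // Placeholder: store timestamped samples so the reaction can be
        // annotated and replayed alongside each statement later.
    }
}
```

Timestamping each sample against the statement being displayed is what would make the annotated replay possible: the app could align spikes in the smile or frown scores with the moment a belief-affirming or belief-contradicting statement appeared on screen.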