
According to a leaked internal report, Facebook had been bragging to potential advertisers in Australia that the company could target teens who were feeling “worthless,” “insecure,” and like a “failure” based on the posts and photos they shared. After The Australian broke the news of the report, followed by the Guardian and other outlets, the company issued an apology.

It then seemed to deny the substance of the report, saying in a statement that Facebook did not “offer tools to target people based on their emotional state.”

But whatever the merits of the claims made in the Facebook memo, the issue is real and it’s much larger than Facebook. It’s the latest development in what marketers and designers call “personalization” — and consumer advocates call “surveillance.”

People average more than three and a half hours a day on their phones, according to data provided to Thrive by the app-tracking app Moment. All that usage, on mobile and on desktop, creates spools of data that, with the growth of artificial intelligence, are opening up a world in which companies could learn things about you that you may not know yourself. Like life insurance companies mining your selfies to predict how long you have to live, for one.

If it’s used properly, tech like this may lead to the cure for cancer. But it also raises darker questions, as our emotions, movements, and faces are all being tracked.

It’s about understanding what forces are influencing your actions, says Mike Ananny, assistant professor of communication and journalism at the University of Southern California. If a company is sensing your emotions and then manipulating you into behaving or believing in some way, all without your full knowledge and consent, then, he says, “you’ve lost a fundamental freedom from coercion.”

Sure, advertisers have always manipulated people’s emotions, but this is different: “In an era when content can be personalized for you, in real-time, without your knowledge, these techniques start to be deceptive, manipulative, and coercive,” says Ananny. What’s at stake, he says, is “a fundamental right to understand what’s influencing you and why.”

A growing body of academic research is showing just how much our mental states are baked into what we tweet, snap, and ‘gram. “We’re not as complex as we like to think,” Christopher Danforth, a statistician at the University of Vermont, told Thrive Global over email. He and his colleagues have built an algorithm that can tell if people are depressed based on their Instagram filters at a better rate than a general practitioner’s screening. One element of the algorithm: more black-and-white shots were associated with a greater chance of being depressed.

Another study of his, this one mining Twitter data, found that patterns in tweets could predict depression or post-traumatic stress disorder even before either condition was clinically diagnosed. Part of the goal of the research is “to explore what is feasible using data in the public domain,” Danforth says. “If a few academics can do it, companies with more resources and less consumer protection protocols certainly can as well.”

In a way, this is the latest manifestation of a very old trend. Michael Zimmer, an internet ethics scholar at the University of Wisconsin-Milwaukee, told me that what’s being called “capitalistic surveillance” has been around as long as capitalism itself, from monitoring employee productivity to running consumer behavior studies and market research. But in recent years, he says, this has all become more pervasive, and more invisible. Amazon analyzes your clicks, and with the Amazon Echo Look it wants to archive photographs of your outfits, which, with the aid of machine learning, could eventually tell if you’re pregnant or depressed. Target famously figured out how to forecast customers’ pregnancies. Google, of course, built an empire by targeting ads based on your search history.

Zimmer says that while Facebook’s business model has always been about delivering super-segmented demographics to advertisers — “I can target an ad to all 20-to-25-year-old females in a specific zip code who indicate they like Coldplay and posted about Hillary Clinton in the past 90 days,” he says — the thing that’s troubling here is what looks like targeting young people in a vulnerable state. “Once we start adding every single possible characteristic to one’s marketing profile, it is inevitable that someone will figure out how to measure moods, mental states, and similar sensitive aspects of our lives, and then attempt to monetize that,” he adds.

Screenshot of Tumblr’s “Everything okay?” pop-up window.

This knowledge can directly benefit users, too. Zimmer points out that Tumblr, for instance, will prompt users with an “Everything okay?” pop-up window if they search for things like “eating disorder” or “suicide,” complete with hotlines and support websites. The startup Affectiva has built technology that recognizes emotions in people’s faces, and it’s being used in consumer research as well as in video games. In one psychological thriller, the game gets harder the more anxious or nervous you look. Crucially, the company reportedly requires firms using its tech to have users explicitly opt in, enabling informed consent. Car manufacturers are developing imaging technology that detects drowsiness in drivers.

But, as Zimmer says, however it’s used, the surveillance of people’s physical and mental states is only going to increase. “We can already envision the ability for Starbucks to throw an ad on my phone (or car dashboard) when I get near a location, and it will only be more pernicious when they can tap into my fitness tracker or the valence of my latest Tweet to determine whether I’m energized or down in the dumps,” he says, “and lure me in accordingly.”

Originally published at journal.thriveglobal.com

Author(s)

  • DRAKE BAER is a deputy editor at Business Insider, where he leads a team of 20+ journalists in covering the shifting nature of organizations, wealth, and demographics in the United States. He has been a senior writer at New York Magazine, a contributing writer at Fast Company, and the director of content for a human resources consultancy. A speaker at the Aspen Ideas Festival and other conferences, he circumnavigated the globe before turning 25. Perception is his second book.