I left Vögisheim – the tiny village of 500 people I grew up in – after graduating from high school and today live in New York City where I am a professor at Columbia University.
The difference is night and day. Unlike back home in the village, I barely know my neighbors. And they barely know me. We say hi to each other when we meet in the corridor. But they don’t know what I do for work. They don’t know my friends and family. And they certainly don’t know anything about my deepest fears or aspirations.
But as it turns out, you don’t have to live in a small, rural community to have someone watch and influence every step you take and choice you make. That’s because we all have digital neighbors.
Think of it this way: the data-crawling digital equivalent to my sixty-year-old neighbor Klaus reads my Facebook messages, observes which news I read and share on X/Twitter, collects my credit card purchases, tracks my whereabouts via my smartphone’s GPS sensor, and records my facial expressions and casual encounters using some 50 million public cameras across the United States.
In the same way my neighbors became expert snoopers and puppeteers over time, computers can translate seemingly mundane, innocuous information about what we do into highly intimate insights about who we are and ultimately prescriptions of what we should do.
I call this process of influencing people’s thoughts, feelings, and behaviors based on their predicted psychological characteristics psychological targeting. And I’ve been studying it—and practicing it—for over a decade now.
My colleagues and I have published numerous articles showing how computers—powered by machine learning and AI—can get to know you intimately. It doesn’t matter which psychological trait or data source you pick. For example, algorithms can tell whether you are excited, sad, sociable, or anxious by tapping into your phone’s microphone or camera. They can predict your income from your social media posts. And they can tell whether you are likely to develop depression or suffer from schizophrenia by tracking your GPS location.
But that’s only half the story. I’ve spent most of my career tackling the glaring “So what?” question. What does it mean that computers can peek into our psychology and understand what lies below the surface of the behaviors they can observe? What does it mean for you and me? And for society at large? It doesn’t take much imagination to understand that psychological targeting, in the wrong hands, could be a powerful weapon.
When I was a teenager, I struggled with low self-esteem. I wanted nothing more than to belong and be liked. But it was my best friend who was popular, not me. I became very good at hiding my self-doubts from the other people in the village, putting on a facade that bordered on arrogance. On the outside, I was strong and confident. Inside, I doubted myself. I shared these feelings in my diary.
If I were a teenager today, I would probably ask Google for advice. “How can I become more popular?” “How do I feel better about myself?” These questions would build up in my search history. And the resulting profile could easily be used against me. In 2017, Facebook was accused of predicting depression among teenagers and selling this information to advertisers.1 No easier target than insecure, struggling teens. Pretty gloomy.
But let’s look at this in a more positive light. What if we could use psychological targeting to help millions of people lead healthier and happier lives? My research, for example, has been used to predict and prevent college dropouts, guide low-income individuals toward better financial decisions, and detect early signs of depression.
Yes, that’s right. The very thing I accused Facebook of doing in the “gloomy” section could also be a real opportunity. Depression affects approximately 280 million people around the world. Every year, about 1 million of them commit suicide. That’s more people dying from the consequences of depression than from homicide, terror attacks, and natural disasters combined.
What makes these numbers particularly upsetting is that depression is treatable. The problem is that many people are never diagnosed. Even if they are, the diagnosis often arrives too late. It is much harder to fight your way back from the bottom of the valley than from the initial descent.
What if, instead of selling you out to advertisers, we used the insights into your mental health profile to build an early warning system? GPS records or tweets could alert you to changes in your behavior that resemble patterns observed in other people suffering from depression. It’s not only a chance to detect depressive symptoms early (before they develop into a full, clinical depression) but also to offer personalized advice or resources.
We might observe that you are not interacting with your friends as much anymore, or that you’re spending a lot more time at home than usual. Why not encourage you to reach out to a few of your friends or spend some time in the park nearby? And, if necessary, provide you with the contact details of a few therapists in the area who might be able to help.
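To make the idea concrete: the early-warning logic described above boils down to comparing a person’s recent behavior against their own historical baseline. Here is a minimal, purely illustrative sketch in Python. Everything in it — the function name, the use of daily social-interaction counts as the signal, and the z-score threshold — is an assumption for the sake of the example, not a description of any real system.

```python
from statistics import mean, stdev

def flag_withdrawal(daily_counts, recent_days=7, z_threshold=-1.5):
    """Flag a sustained drop in daily social interactions.

    daily_counts: chronological list of interaction counts, one per day.
    Returns True if the average of the most recent days falls well below
    the person's own historical baseline (a simple z-score test).
    """
    baseline = daily_counts[:-recent_days]
    recent = daily_counts[-recent_days:]
    if len(baseline) < 14:          # not enough history to form a baseline
        return False
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:                  # perfectly flat history: no sensible z-score
        return False
    z = (mean(recent) - mu) / sigma
    return z < z_threshold

# A stable month followed by a week of social withdrawal trips the alert.
history = [8, 9, 7, 8, 10, 9, 8] * 4 + [2, 1, 2, 1, 0, 1, 2]
print(flag_withdrawal(history))  # → True
```

The key design choice — comparing you to your own past rather than to other people — is what makes such a system personal: a naturally solitary week looks alarming only if it is unusual for *you*. A real system would, of course, need far richer signals, clinical validation, and the person’s consent.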
Predicting and influencing mental health outcomes is merely one of many examples demonstrating the power of psychological targeting. What if we could make education more engaging, help people achieve their fitness goals, or facilitate a more constructive dialogue across the political divide?
For the better part of my academic career, I’ve felt somewhat helpless and lost in the tension between the perilous and promising sides of analyzing personal data. Was I in the camp of techno-pessimists arguing that technology fails to deliver on its promises and actively harms humanity? Or was I in the camp of techno-optimists who believe in a bright future where technology helps us become better versions of ourselves?
I often felt like a hypocrite—excited about new findings, yet nagged by the feeling that, in the wrong hands, those findings could have horrible consequences. Or, vice versa, I’d talk to the media about the dangers of psychological profiling while fearing I was backstabbing my students and industry partners who saw its promise.
It wasn’t until a Christmas trip back home (and after multiple rounds of mulled wine) that I realized how similar my current struggle was to my experience in the village—constantly torn between the desire to break free and the appreciation for what my community had to offer. The more I thought about this analogy (in a sober state), the more glaringly obvious it became.
I was dealing with a new manifestation of a tension that has been part of the human experience for centuries. How much of our private lives are we willing (or even happy) to disclose to those around us? How much of our privacy and autonomy are we willing to give up for the security and strength provided by the collective?
What this all comes down to is power. In the same way my neighbors had an easy time convincing me to do chores for them because they knew I was a crowd pleaser, understanding your psychological needs, preferences, and motivations gives others power over you. Power to influence your opinions, emotions, and ultimately behavior. Sometimes this is good; sometimes it’s bad.
But life in the village taught me that whether we win or lose is—at least in part—up to us. Even though I never had full control over my life, I still managed to navigate the ups and downs. As a kid, I had no idea how the village operated. But over time, I learned more about the system I was embedded in. I understood people’s motivations, figured out who was talking to whom, and learned who could be trusted with information.
Once I understood the game that was played and had a clear sense of what I wanted out of it, I learned to play it to my advantage. Suddenly, I was winning more than I was losing.
We need to do the same—and more—for the digital village. We need to understand the players that control the current data ecosystem, figure out how they use our personal data for and against us, and identify the leverage we have (or need) to come out on top.
But merely becoming better at playing the game won’t be enough. We need to redesign it.

Reprinted by permission of Harvard Business Review Press. Adapted from MINDMASTERS: The Data-Driven Science of Predicting and Changing Human Behavior by Sandra Matz. Copyright 2025 Sandra Matz. All rights reserved.