S3E2: Transcript
Citizenship in a Networked Age with Vint Cerf, Nuala O’Connor, and Michael Wear

Tavia Gilbert: Welcome to Stories of Impact. I’m producer Tavia Gilbert, and in every episode of this podcast, journalist Richard Sergay and I bring you conversation about the newest scientific research on human flourishing, and how those discoveries can be translated into practical tools.

This season of the Stories of Impact podcast explores the timely question of what it means to be a good citizen in a networked age. And to begin to answer that essential question, we’ll explore the relationship between tech and citizenship, from the perspective of tech, policy, and theology.

We’ll begin the conversation with Vint Cerf, Google’s vice president and chief Internet evangelist, and one of the world’s recognized “fathers of the Internet.” The work of Vint Cerf and others in developing web-enabled digital technology has had a revolutionary impact on virtually every part of and every person in modern society. So it’s fair to say that he shares responsibility for where we are today—the good, the bad, and the ugly.

Cerf spoke with Richard about his own role in the genesis of the digital age, as well as his hopes and concerns about the future of the technology that, in fewer than 40 years, has transformed from a niche tool used by academics to a powerful information-sharing platform the majority of the people on the planet can harness to serve their own individual agendas.

Here’s Vint Cerf:

Vint Cerf: I certainly didn’t anticipate what happens when you get billions of people, including the general public, into this environment. Within the span of about a decade or so, billions of people have become addicted to their smartphones and their ability to both inject and collect information. That’s a phenomenon that we didn’t experience in previous technology revolutions.

Tavia Gilbert: Of course, Cerf recognizes huge benefits that have come from the advancement of technology. He says:

Vint Cerf: Existing AI algorithms, many of them, have in fact produced just astonishingly beneficial results.

Tavia Gilbert: Such results include speech understanding, speech generation, and automatic, real-time language translation. But technology has also brought the ability to perpetuate online bullying, spread disinformation campaigns, interfere with elections, and weaponize nationalism; it has contributed to radicalism online, and the result is a threat to the stability of liberal democracy. So, it’s not all good, and Vint Cerf has been around to witness the evolution as long as anyone. So what does he see as the greatest impact of Internet technology on Citizenship in a Networked Age?

Vint Cerf: The network erases distance and erases distinctions among different geopolitical boundaries. And that’s part of its strength in a way, because it means that it doesn’t matter what time zone you’re in and how far away you are, you can be part of a cohort of people who are working together, talking together, arguing together.

People who are scattered around the world in different geopolitical locations may find themselves more bound together by their common interests than they are necessarily by their citizenship by country. “Netizens” is a wonderful term for that.

We are actually seeing a shift back away from this notion of it doesn’t matter what country I live in, my interests and your interests coincide, we are part of a cohort of compatible interests. We’re starting to see the resurgence of nationalism in our geopolitical world. And I think that is actually creating waves in this online environment. It’s re-imposing geopolitical boundaries that I had hoped the Internet would erase.

Tavia Gilbert: Despite that increase in insular nationalism, Cerf himself understands the imperative of personal and collective responsibility, how far those ties can extend, and the negative outcomes that follow ambivalence about the impact of one’s own digital choices in the material world.

Vint Cerf: Whatever you do and say can have a very material impact on a human being. You have to think about that; you should feel responsible and think about what you’re doing and what the impact is. Now unfortunately, there are other people who have no ethics at all, and in fact they deliberately put things up on the net that are potentially harmful. And the difficulty we have in that environment is figuring out where the perpetrators are, where the victims are, and the worst problem is the one where there is an international boundary between the two of them. What cooperative agreements exist between two countries to deal with the harmful actions that have been taken? And we don’t have tools yet for dealing with that. We don’t have common treaties, either bilateral or multilateral, and that’s what we need.

I had wished that this online world would highlight for everyone how co-dependent we are on each other’s actions. And I think that that message is still struggling to get through, and it’s becoming, it’s dissipated in part by the rise of nationalistic views.

Tavia Gilbert: Cerf still has high hopes that technology will connect us, rather than tear us apart.

Vint Cerf: Well, I would like to think that as human beings on a common planet whose existence we depend upon—we don’t have any other place to go, so this is the only planet we have—and what I had hoped the Internet could do, and I still hope it can do, is to draw people’s attention to the consequences of their actions on others, not just in their own countries but on the planet. It would be a pity to squander the collective ability to recognize problems and work together to solve them that the Internet offers, instead of fragmenting it into a bunch of very self-serving environments.

Figuring out how people decide to adopt moral obligation is way beyond technology. It’s a question that our society needs to answer. This is a sociological question, it’s not a technological one. The Internet may be helpful, but as I’ve already pointed out, it may be harmful, too, because of the misinformation that flies around in the net.

It’s not going to induce the kinds of ethical and moral obligation that I hope people will feel. It only can be helpful if they feel it, and give them an avenue for exercising those obligations. But it doesn’t force them to do it at all.

All we can hope is that the people who end up using these technologies choose to do things that are beneficial for themselves and others and not harmful. But we know, because we all read Shakespeare’s plays that are 400 years old, that people have motivations that are not necessarily always beneficial. And so we have to be prepared for that.

Tavia Gilbert: Technology has its place, but understanding its limits is key, Cerf says, both to appropriately work with technology as it exists today, and to work with the technology of the future, which will involve a dramatically increasing reliance on the algorithms of artificial intelligence.

Vint Cerf: Well, it is easy to imagine humans relying on computer-based analysis and decision, in the belief that that’s more accurate than any human being could do. We can fool ourselves very, very quickly to assume that it must be right because it’s the computer, and they never make mistakes. The important lesson is they do make mistakes and we need to understand how to detect that.

I’m actually enthusiastic about the use of these technologies, because they have been remarkably adept at dealing with diagnosis and dealing with the correlation and machine translation and the like. But I’m equally nervous about our lack of knowledge of the ways in which these things could fail. And so we need to be very thoughtful about detecting these kinds of failures.

To do things that I could not do as a human being, just at the scale at which these machines can do work, to analyze billions of texts or images and what have you, I couldn’t do that on my own. And so I see this as a tool. What I don’t want is to rely blindly on the output of these various algorithms without having a deeper appreciation for the ways in which those algorithms could be misleading. And so this is another moral obligation that we have, is to not allow ourselves to be dictated to by machines.

I think some of these algorithms can be useful as advice givers, they can talk about correlations. But they often don’t necessarily speak to causality, and that’s why for really important life-determining decisions I would prefer to have human beings in the loop.

If we allow ourselves to become overly dependent on machine-driven decision-making, then we may foreclose a whole bunch of possible options that we should have and could have considered. And so once again I think that it’s important for us not to allow algorithms to dictate what we decide to do, they should inform, but they shouldn’t be the sole determinants of decisions that we make.

Tavia Gilbert: In addition to imperfect decision-making, there’s the ability of virtually anyone to publish anything, from anywhere, at any time online. While that freedom has wonderful aspects, it has also had a troubling impact.

Vint Cerf: This leads to at least two very significant problems. The first one is that you now have the onus on your shoulders for trying to figure out what’s good quality information and what isn’t. It’s, it’s critical thinking, you have to actually ask yourself, where did this information come from? Is there any corroborating evidence for the claims and assertions that are being made? Might there have been some motivation for putting this information up to persuade you to do something or you change your mind about something?

And certainly the 2016 elections and other things, not in the US, but in Europe and elsewhere, demonstrate that the openness and the reduced barriers for generation of content pose this big hazard for those of us who are trying to consume it. So we’re now forced to apply critical thinking, which is work, it’s real work, and not everyone is prepared to put the work into qualifying the content that they’re encountering. But now it’s much harder, because there are so many different potential sources.

So that’s one problem. The second problem is that because of the reduced barriers to the injection of content into the system, you now invite a whole lot of malicious activity—people who just want to inject noise into the system, for, like, vandalism. The opportunity to do that is so cost-free that people do that. And so we’ve exacerbated the problem, because we’ve made it so easy to put information into the system that you and I might encounter.

The problem we have is that someone can use a botnet or other mechanical mechanism to make it look as if lots of people believe that this is important information even though it’s just an algorithm running a whole bunch of computers that somebody has gotten control over. So we, in our zeal to lower the barrier for access to information and to sharing of information, we’ve also introduced Pandora’s Box. So now we have to learn how to deal with that.

Not only do the providers of the information, the media through which the information arrives have to think about this problem, but as the recipients of that information, you and I need to think about that problem too. So being a citizen is actually harder work in cyberspace than it might have been in our earlier incarnations.

Tavia Gilbert: Another problem—the internet’s most powerful content generators, like Twitter, Facebook, Instagram, and other social media sites, are built to reward content that moves farther and farther toward the fringe.

Vint Cerf: What’s interesting about the dynamic is that it’s often extreme content that generates the biggest reactions. It may very well be that these social media feed extremism if the metric that you’re looking for is the one that garners the biggest metric number.

So we may have inadvertently built into some of the online media, social media, feedback loops that drive extreme behavior. And I think it’s incumbent on us to understand that and to recognize it and to cope with it. And whether “us” is just you and me recognizing that we’re reacting to this feedback loop, or whether we’re the providers of the service recognizing that we’ve created those feedback loops, I think the important part of the moral landscape here is to recognize some of those phenomena and to in fact deal with them.

Tavia Gilbert: Like Vint Cerf, our next guest, Nuala O’Connor, sees both benefit and peril in our current relationship with technology, as well as our responsibility for moving with urgency toward digital literacy and awareness, individually and collectively. Empowering citizens is at the heart of O’Connor’s work. As the president and CEO of the Center for Democracy and Technology, she fights for privacy, freedom of expression, freedom of association, and freedom from government surveillance, and works on any number of online rights issues like net neutrality and copyright protection. And she needs citizens to fight alongside her. The involvement of citizens in the creation of their own institutions is at the heart of O’Connor’s definition of citizenship:

Nuala O’Connor: Citizenship to me is the active participation of an individual in the organizing construct of social or political institutions in their community. And in the Internet space, we are in a major conflict about how the Internet will be governed and managed and whose values will prevail. Will they be ones of democratic openness, or will they be a more top-down government directive? And I’m not quite sure we have won that battle yet.

The Internet has been the great democratizer of individual voice and of many forms of freedoms. With freedom comes responsibility for all of us as individuals and as leaders of companies or organizations. The potential, I think, for Internet and Internet-enabled technologies to further democratic institutions of governance and values and principles is still unparalleled in other technologies. I mean, not since the printing press has there been a voice-enabling and -amplifying mechanism that so profoundly reshaped the relationship between the individual and the information they seek to disseminate or that they seek to receive. But with that destabilization have come all sorts of unintended consequences.

The conversation we are having right now is about conversation online, about who is responsible for the creation of community and community norms, and norms around how information is elevated or disseminated, and frankly about what responsibilities and roles and rights the very algorithms, and the very institutions that run the spaces online, which are largely still private sector spaces, have over the information that they share with their individual citizens or users or end users.

The Internet has profoundly reshaped the relationship of individual to state, individual to company, individual to information, I think in largely positive ways. But the unintended consequences of the greater access to information and greater ability to reach more listeners means every piece of information is given equal weight on the Internet.

And so I think there’s a really great conversation happening right now about journalism and the Internet, and what are the new dissemination norms, and who should bear both responsibility and profit from those models?

I think at a minimum we are privileged to live in a country where people have the right to vote and should exercise it. But even more, that we now have information at our fingertips about how agencies at the federal, state, local, municipal level are running, and we all need to exercise our duty of care.

We’re having a national conversation about democracy—are you for it or against it? And I think it’s time to take a stand. I’m certainly for it. I think there’s no better organizing function or principle to how humans agree to disagree. The Internet unfortunately makes that faster and louder, and I think it may in some cases not always elevate, when I say the best, I don’t necessarily mean the best speech in terms of how it is presented or even if I agree with it or not, but simply productive to furthering constructive dialogue and positive outcomes for the democratic institutions that I believe we all hold dear.

Tavia Gilbert: What are the positive aspects of the hyperconnectivity of the modern digital era?

Nuala O’Connor: I do like that the Internet and Internet portals make people feel and actually be more connected to decision makers and lower the barriers, lower the differential between people in positions of power and people who live far, far away from Washington, DC, for example.

What it also means though is that people can get really incorrect information that, what’s the old, is it Twain who said, you know, “a lie can go around the world before the truth puts its pants on”? The challenge I think for all of us is to be both better consumers of information and also maybe discerning readers. And so I think we’ve got a huge road ahead of us in terms of media literacy and digital literacy for an informed electorate.

At the Center for Democracy and Technology we are organized around the principles that individual human beings deserve dignity and agency and autonomy.

Tavia Gilbert: But isn’t the maturity of our democracy enough to ensure the continuation of individual rights, whether or not we introduce algorithms into our decision-making?

Nuala O’Connor: I don’t think that 200-plus years into this experiment we are actually all that mature a democracy. I think also every generation needs to regenerate its idea and its construct of what democracy means and looks like. And I think that technology, in this case, the sweeping changes of the current version of the Internet which is an Internet-enabled everything, and the future versions of artificial intelligence and really sophisticated and embedded technologies in the dashboards of your car, in the walls of your house, the walls of your child’s school room—these are consequential changes in how we relate to the world around us and how we relate to the institutions that we believe serve us.

Embedded in not only the algorithm but the architecture and the infrastructure of these systems and the institutions that are a part of the Internet ecosystem are eventually the biases of the creators themselves, the individuals who program the computers or set up the systems or whatever.

I think we need to stress test these algorithms, these systems, these architectures as they become so opaque and so embedded in our lives that we take for granted that these devices we are creating and embedding in our lives, are they serving all of our needs, and are they serving equality and democratic principles? Or are they embedding and reinforcing biases that we all have clung to for many years of our lives?

I think the more diverse human beings and experiences we have at the table, including political viewpoints and economic viewpoints and points of view about where in the world and what values need to be served, enhances and enriches the creations that we are offering in the technological world. What I worry about is, again, as these devices or decisions become embedded and opaque to the end user, or so fast and so automated, which is a benefit of the technology but also a peril, that we don’t even have the time to question, or that we’re not even aware that what we are being fed or what we are receiving, whether content or action, is profoundly different than what someone of a different race or gender or ethnicity receives.

I think we really are having a conversation now, at least in the United States, and in many parts of the world, about the role that intermediaries on the Internet play, whether they are intentional or unintentional intermediaries of information.

The challenge, I think, for private sector actors in this space is, your first order interest is to serve customers and shareholders and create value in a capitalist system. But there are second and third order interests of your customers. And if one of the second or third order interests is creation of democratic values and furtherance of true and accurate or productive discourse or values that align with democratic values and institutions either in the United States or elsewhere, then a real lens and a real kind of microscope has to be put to the output as well as the inputs of the algorithmic decision-making of the AI.

There have been people calling for greater scrutiny and greater accountability, starting with transparency, over how systems and major networks are programmed. And I think social and human rights constructs and values need to be embedded not only after the fact but really from the very beginning of software development life cycles.

I think there is some social responsibility, some corporate civics responsibility for Internet companies, just as there has been social and corporate and moral responsibility for industrial companies in the Industrial Age.

Tavia Gilbert: What’s the impact in the digital era on national boundaries, or governance across different national ideologies?

Nuala O’Connor: The Internet has broken down geographic barriers, has broken down traditional constructs of government and nationality in potentially very positive ways. It’s also scary for people who run governments and who want to continue to maintain control over their geographic boundaries and citizens.

And that’s, I think, part of the real tension we’re seeing right now with data localization laws, or mandates that the Internet must stop at the borders of a particular country. And then there’s the really existential crisis between the U.S. and China on whose norms and governance will govern the Internet and how people talk and communicate. And again I don’t think that’s a settled answer that the US will prevail, or that there will continue to be one Internet.

That’s the idealized goal that may somewhat be in peril as more governments feel threatened by the extraterritorial reach, not only of largely US-based multinational companies but also of their own citizens to get information and reach sources of power outside of traditional government structures. I think it’s a huge source of tension right now. I don’t think it’s a, it’s a settled question.

I saw a statistic and I’m gonna get it wrong, but it was citing major world religions that were much larger in terms of population than any one national country. And I think there were, you know, Christianity and Buddhism and Judaism and Facebook. Right? So Facebook alone had 2.5 billion users, I think, in the last year. That profoundly destabilizes traditional notions of who is governing.

It also begs the question of, who’s governing? Because the private sector actor in this case, the platform creator, gets to set the rules, and rightly so, it’s a private sector actor. But when you are setting rules and norms about how 2.5 billion people communicate with each other, that is a very, very heavy burden to bear. And also you are crossing countries and languages and norms and ways people relate to each other that may be very different than our own, coming from the US, and may not be fully understood.

I think it’s no longer satisfactory to say, “Well I just built it, I don’t have to think about how it’s going to be used.” I think that’s an inadequate answer for anyone who creates anything, whether it’s a house or a family or a device or a car or thing. I think you are responsible for what you put out into the world, to the extent that you are able to at least articulate that it is there to serve some greater good.

While I would put some responsibility on the purveyors or platforms or others who benefit from the dissemination of news and information to be editors, to be better editors and better stewards of information, I hope we are not living in a post-truth world, and I do think that facts are knowable.

So I think we are really going to have to reshape what look like journalistic ethics and morals and standards and, and think about what our job is not only as the receiver of news and information but those companies and institutions that are disseminators of information as well.

Tavia Gilbert: O’Connor also cautions us against assuming we can take for granted our public institutions and norms.

Nuala O’Connor: People are always surprised to see how recent kind of public education and public norms about an informed electorate are, and I think we’re at a point, we’re at an inflection point in the United States about national narrative.

I think it’s a worthwhile, although difficult, conversation about, how do we have a national narrative about not just the founding but who we are as a country, wherever, whatever country you’re in, that is at once respectful but also not bound by its history and inclusive of diversity? And then how do we inculcate that respect for diversity and also pride of place in a national education system, while also respecting that there are, there are great differences in people’s attitudes about the subjects that we think of as core curriculum?

But I think all citizens need to be educated and canny consumers, not only of the information they see online, but of the actual technology itself, simply even knowing that the algorithm could have bias or that this device that you’re putting in your house might be collecting information about you and used to make judgments about you and your next purchase or your next interaction with it.

I think we need more awareness. And when I say media and digital literacy I don’t just mean about the content, I mean about the consequences of what we are adopting in our daily lives.

While, yes, there’s polarization, there is also a great deal of energy I think on both sides of the political spectrum and hopefully maybe in the middle as well, people are realizing that I think we have taken some of our fundamental institutions of democracy for granted. I think there is an energy around hopefully not only federal but state and local governance as well and a breaking down of a perceived barrier that it’s always the province of the rich or the wealthy or the white or whatever or the male. And I think we’ve seen some of that in the most recent round of elections in Congress, but hopefully we’re seeing it at the local school board level as well. And I am short-term pessimistic and long-term optimistic that we will get this right. But I think it’s taking all hands on deck from whatever vantage point you’re in whether it’s government, private sector, individual consumer of information or just simply a citizen.

There is no question in my mind, democracy as a construct is in peril, not only in the United States but around the world. I’m hopeful that we agree that an open and representative democracy is a healthy and productive form of governance that brings the opportunity for equality for all people.

Tavia Gilbert: Michael Wear, our final guest in today’s episode, would call virtues like equality and opportunity part of human flourishing. Wear is a leading strategist, speaker, and practitioner at the intersection of faith, politics, and public life. He’s the founder of Public Square Strategies, LLC, and the former Religious Affairs Director for President Obama’s reelection campaign. This scholar sees the realignment of the relationship between humans and technology, and between fellow citizens, as a vital part of supporting human flourishing.

Michael Wear: I think there are many components to human flourishing, but I really think predominantly of two. The first is the ability to live with integrity, the ability to be an integrated and honest person, and then second would be the ability to create towards the good of others, the ability to actually create in a way that helps others flourish. And those two together I think help make up a pretty good model of human flourishing.

Tavia Gilbert: If that’s the model for human flourishing, then the negative feelings and behaviors stirred up by manipulated and targeted media, or by politicians, for example, aren’t just stressful for modern society; they undermine our ability to flourish.

Michael Wear: Certainly a lack of suffering is a component of flourishing. Feelings of scarcity, feelings of fear are great pressures on being able to live with integrity and to be honest and truthful.

Tavia Gilbert: So if inciting fear, competition, the sense that life is a zero-sum game, are all more possible because of modern technology, what does the antidote of healthy modern citizenship offer?

Michael Wear: A spirit of neighborliness, the ability to listen. The ability to reach outside of self or parochial interests and tie your fate to those that you’re in community with. I think an ability to promote the affirmation of human dignity and advance justice, and not to see politics and the public realm as merely a place to go to seek self-affirmation and self-realization. It sounds obvious to say, but citizenship is not an individualistic endeavor, and it’s not supposed to be only internally focused.

Now certainly as we enter public, we have even a responsibility I’d say to represent our own self-interest, but we must conceive of our self-interests within the broader whole. Entering the public in a way that is solely about the acquisition of power for yourself or for your team is highly destructive. A lot of the forces of 21st century American life lead toward exactly that, lead towards a sort of insulated, self-interested form of civic engagement. But the ideal citizen in this age will be able to build up the sort of internal resources to transcend that.

The Apostle Paul’s letter to the Galatians—it’s a book in the New Testament. In that book Paul is writing to essentially a polarized community. He says, he goes through, the community was dealing with false teachers, the people in the community were kind of pursuing their own ends and arguing with one another, and he says this amazing thing that just sort of goes against every impulse that I think we have in this age. I mean right now, think about the advice you give to the American public that are bitterly divided, tribalistic, and a lot of the advice you hear is, separate them, they need to kind of find out how to just not kill one another. What Paul writes of the Galatians is that, he says they ought to bear one another’s burdens and that in doing so they’ll show the love of Christ.

We have a social contract here in America. We have, just by nature of liberal democracy on either side of the Atlantic, there is a common obligation we have to one another, and it’s not just to live together, it’s not just to ideally not just even work through the political process to acquire the most power we can and try and get the most we can for ourselves, but it’s actually to see in people who politically disagree with us people who also deserve to be heard, people whose interests deserve to be respected by the political process.

And so the ideal citizen will find a way to invite the best expression of even interests that they disagree with into the political conversation because they realize that politics is not just about them, it’s not just about what they need. It’s about the community together.

In a representative democracy, you don’t choose to have political influence. You’re invested in it just by virtue of being a citizen. You have political responsibility, and the only choice you have is whether to steward it well or not. And so in the modern era there is both increased distance and the ability to withdraw from the political process; the political process, I think, seems removed from broad swaths of the public. At the same time, the sort of responsibility is as direct as ever given the sort of innovations of democracy that allow for really clear ways of input from the public into the political process.

I’m even more concerned, though, about that question of integrity. The way that our networked age allows a sort of feigned knowledge, a sort of feigned community. The ability to hint at personal understanding without the actual relationship being there. To be able to break down people into a number of decision points but never really being in the flow of their lives. We see this in politics. We see this in our social media lives, in ways that have impacted our most personal relationships. And it leads people to strive for that connection, but through a form that will never lead us to the peaks of human relationship. And so there is a real tension there, and that undermines the integrity of the person, to be seeking something through a forum and through a medium that in the end isn’t going to be able to facilitate that absolute connection that we desire.

There is a way in which this age has helped people to find communities they would have never found before, to find people that they view as like them in ways that have never been possible. I think there are forces and interests that see those very communities building and are finding ways to manipulate them for their own purposes.

And so this thing that feels very personal, in politics, with false information being put out to sort of promote tribal controversies that take on a life of their own even though they’re not grounded in fact. But facts don’t seem to matter when it confirms your own identity and the deepest things that you believe to be true about the world. And even when they’re corrected it’s like, well, that instance wasn’t true, but the point it was aiming towards was right. So even if the facts don’t match that, the aim of the lie was correct.

Now, I don’t want to deny the power—there are voices that could have never been heard 20 years ago that now have an ability to impact our democracy, have the ability to impact public conversations, culture, entertainment, media, news sources, in a way that would have never been possible.

There’s a certain power, a certain democratic sort of aspect to technology, it’s just opened it up for a whole bunch of people, but again those very same tools that provide democratic access are also used by interests and organizations with interests to manipulate people’s affections.

Tavia Gilbert: The tools we have available to manipulate people’s affections and to rapidly spread information are dangerous, and Wear looks to history for modern-day perspective:

Michael Wear: Gatekeepers had to be weakened and undermined in Germany, but in a networked age you can just simply go around the gatekeepers in a way that wasn’t as possible then; you had to be the gatekeeper in order to manipulate people in the way that the Nazi regime did.

Now you don’t—so in political communications there’s often conversation about sort of going around the messenger. So why even communicate through these gatekeepers, in the political case they’re mostly talking about journalists, when you can reach the voter directly with a message that’s directly tailored toward them? Based on what? Well, not personal knowledge, often. Not personal inter-relational knowledge, but a series of decision points they’ve made in their lives: what magazines they subscribe to, what their income is, what their racial and ethnic background is. That, again, feigns the kind of familiarity that is then leveraged not just for relational understanding, but towards a specific end.

Through the use of technology there are ways for the citizen to access decision-makers, and ways for decision-makers to get some sort of sense about who they’re trying to reach, about who they’re representing, in a way that just wasn’t possible before. It’s no longer an age of just doing focus groups with 20 people and extrapolating that onto an entire public. You can now get individual information that’s based in an interpretation of reality, an interpretation of the facts, that allows a responsiveness that just wasn’t possible before. That’s something that pushes us towards sort of a greater democracy. And I think the downside is the inauthenticity of it all. There can be a dehumanization that takes place when people are sort of boiled down to the data points that are available to decision-makers, which isn’t all of them. So we’re actually sort of creating profiles of people, and I’m not just talking about politics, I’m talking about advertising, I’m talking about marketing, I mean this has reached into medicine. This is modern life.

There’s a dehumanization that takes place that can just sort of throw things off, that leads people to not be dealt with with integrity, and so it’s very difficult to respond with integrity when so often the interactions that we receive aren’t dealing with us as persons but as a set of data points.

To the extent that the algorithms are correct, it leads campaigns to talk to the voters in a way that’s sensitive to their needs, voters that they may have never found before, that may have never heard from a campaign before, that would have never been invited to participate in the democratic process in the same way. But where this can lead, and where it is in many aspects, is a dehumanization of the decision-making process.

And I’d say we need to have other metrics involved in the mix than just the utilitarian, “does this help us reach the short term goal that’s directly in front of us?” Technology can help us see out farther; it can help us see people that we would have never seen before. It can also close in our vision. It can also put the top on our conversations and lead to enclosed sort of thinking that doesn’t allow us to look up and think of higher things. Think of our better angels.

Tavia Gilbert: In Wear’s perspective, a moral conscience isn’t just nice or preferable; it’s imperative for just communities and healthy, sustainable futures:

Michael Wear: A citizen with a moral conscience will say no to things that are against their personal interest, or yes to things that are against their personal interest, because it is in line with a moral code, because it is in line with what is beneficial for the community, in line with the common good. To be a citizen with a moral conscience means that you don’t just go to politics seeking the maximum material gain in the shortest term possible.

And so I’ve come to the conclusion that there are a whole range of structural, sort of technocratic things that we could do to help nudge the system in a better direction. But the structures are actually responsive to the American civic character, and without a reformation, a sort of strengthening of that character and the civic character of individual Americans, little tweaks will not be able to withstand the actual desires of the American people.

So for instance, we know, through data, through AI, we know that A, the American people say that they don’t like negative advertising and B, that negative advertising is far more effective in political campaigns than positive ads. And so it seems as though the American people have a crisis of conscience in that area and so many others, where they know what they ought to desire, and yet they can’t help but be swayed by those baser things.

The ways in which people’s sensitivities and proclivities and preferences could be manipulated to sort of activate their conscience for insincere reasons. I think the short term effect of that is either a sort of echo chamber mentality, where you shut yourself off, it’s just too disturbing to think that your conscience could be so easily manipulated, and so you ignore sort of contrary facts. Or a sort of apathy: I get fired up about something that seemed like it was true and real and important, and eight hours later I find out it was a hoax. For an individual that can happen three times in a week, and at some point, the desire to be civically engaged, the desire to care about those around you, you either have to choose what side you’re on and close yourself off, even to the contrary facts, or all the contradicting information and the selling to the conscience, the American conscience, leads to apathy.

There is a growing sense that the very things that we thought we could use to strengthen us are actually undermining us, and now we’re trying to in some ways find a happy medium, in some ways there’s a completely alternative reaction which is how do we wean ourselves and isolate ourselves from the network as much as possible?

Tavia Gilbert: Despite the instability and chaos of the modern, networked age, Wear hears a call not to isolate himself, but to persist in working toward the common good.

Michael Wear: To move from the sort of material to the spirit of the endeavor, of the enterprise, is that we’ve ceased even looking towards the common good. So you know there’s an idea in Reformed theology about proximate justice. Sort of the idea that, well, we ought to accept that through politics we’re never going to achieve perfect justice, but the Christian’s aim is towards getting as close as we can. That just because we can’t achieve perfect justice, proximate justice, to get as close as we can with as faithful means as possible, is the aim.

What’s really important to understand is that politics feeds into culture, and culture feeds into politics. They’re all pulling on each other. And so many of the forces are leading us to think in a self-interested way first, in a way that’s detached from community. In so many areas of our life the self-interested and individualistic are being affirmed, and that leads to real problems when you then try and take those same people into a political realm that has to be focused on more than just the individual in order to function properly.

Morality demands more of us than legalism, demands more of us than merely following the law. It’s about the orientation of our hearts, and AI certainly isn’t going to do that for us. So technology can be wielded for the good, and is right now. How it will end up really depends on what appetites the people have, what they’re most eager to say yes to, and what they’re willing to say no to. That’ll determine fate. But yes, there are bright pockets of technology doing wondrous things to bring people together and make people feel heard and to orient people towards the common good in ways that just weren’t possible before.

Tavia Gilbert: Vint Cerf also leaves us with a hopeful offering: No matter how good machines get at analyzing data, he has yet to see evidence that machines can compete with our greatest creative strengths.

Vint Cerf: Humans are astonishingly good at abstracting the real world, building models, and then reasoning about how those models work and what would happen if they changed the model. By which I mean change the real world to match a different model, predicting what might happen. Speculation, imagination, invention, innovation all come from this ability to generalize, analyze, and change. No machine to my knowledge has that capability.

Tavia Gilbert: And Nuala O’Connor is working on ways to serve humanity in big and small initiatives, reorienting toward the common good. In her own family, that means redefining the relationship between humans and technology, and finding times to step away from it.

Nuala O’Connor: On and off, I’ve tried for something called the digital Sabbath which I think is a remarkable construct. And when we have held true to it, it has allowed for peace and space and dialogue in my household. That’s bringing back conversation and respect.

Tavia Gilbert: I think we’re in some trouble. But it’s comforting to think that wise people all over the world are working on the question of, “What is good digital citizenship?” And, at least in the United States, in the next ten days, we’ll see Americans’ civic participation, as we place our votes.

I was raised by parents who taught me about the importance of public institutions and liberal democracy. They took me with them whenever they went to vote, so that key behavior was learned early. The values of public education, science, and community participation have always been cornerstones of my personal value system.

But until hearing today from Vint Cerf, Nuala O’Connor, and Michael Wear, I admit that I had not considered that, despite my ambivalence about much available technology, part of my modern civic responsibility is to understand it, and to work toward technology’s productive use as a tool for the common good, and against its power to undermine democracy, equality, and human rights. I’m glad you’re here with me, coming into a deeper understanding about the rights and responsibilities we have as citizens.

We’ll be back in two weeks for another timely episode on Citizenship in a Networked Age, with Sir Paul Collier, professor of economics and public policy at the Blavatnik School of Government, University of Oxford, and the author of several books, including The Future of Capitalism: Facing the New Anxieties. Here’s Paul, talking about the need to come together as a community to cope with those anxieties:

Paul Collier: What we certainly can draw a straight line between is this ability to see ourselves as a “we,” not as one group looking at others and saying, you’re “they.” And so the ability to see everybody as a common “we,” and then, yes, an ability repeatedly to forge new common purposes and then work towards achieving them. And as you do that, you start to become confident that yes, we can do this sort of thing.

Tavia Gilbert: We’ll bring you more from that conversation with Paul Collier next time. In the meantime, if you liked today’s Story of Impact, we’d be grateful if you’d take a moment to subscribe to the podcast, rate and review us, and if you’d share or recommend this program to someone you know. That support helps us reach new audiences. For more stories and videos, please visit storiesofimpact.org.

This has been the Stories of Impact podcast, with Richard Sergay and Tavia Gilbert. This episode written and produced by Talkbox and Tavia Gilbert. Assistant producer Katie Flood. Music by Aleksander Filipiak. Mix and master by Kayla Elrod. Executive Producer Michele Cobb.

The Stories of Impact podcast is generously supported by Templeton World Charity Foundation.

Author(s)

  • Richard Sergay is an award-winning veteran network television journalist and senior media executive who spent much of his career at ABC News. He reported on major domestic and international stories for World News, Nightline, Good Morning America, and ABC Radio. Richard completed a six-year assignment as Bureau Chief and Correspondent based in South Africa, covering the end of White rule and Apartheid, as well as the release of Nelson Mandela from prison and the ensuing peace negotiations. After the South Africa assignment, Richard began a new beat for ABC News, the first for any major network, focused on the digital revolution unfolding in the U.S.
  • Writer and producer of several nonfiction podcasts with a global audience, Tavia Gilbert is the acclaimed narrator of more than 650 full-cast and multi-voice audiobooks, Booklist’s Audiobook Narrator of the Year, and a multi–Audie Award-winner, including for Best Female Narrator. She is also the creator of The Abels, a scripted podcast in collaboration with the BBC. Tavia holds a BFA in Acting from Cornish College of the Arts and an MFA in Creative Nonfiction from Vermont College of Fine Arts, where she is in the Writing & Publishing faculty.