Editor’s note: This content was originally published for the Christian Economic Forum’s 2019 Global Event. This content is shared with Denver Institute for Faith & Work with consent from CEF.
Artificial intelligence (AI) stands in the midst of a renaissance. Once underfunded and haunted by failed, grandiose claims of all-powerful machines,1 AI has transformed into the hot technology du jour thanks to prodigious funding, hype, and success.2 Be it smarter autocorrect, autonomous vehicles, or algorithmically generated cooking recipes, the past few years have made AI attractive in the eyes of consumers.3 Today AI brings forth images of Alexa and Siri, not the Terminator and HAL. Many applications of AI are indeed beneficial and even benign, yet the advertised improvements to quality of life obfuscate an important risk of this century: the challenge to privacy posed by algorithmic mimicry.
Algorithmic mimicry is the concept that a statistical model can predict what you will see, think, feel, say, and do given some stimulus. Imagine that inside a Google data center lies a black box. This box, given some input, accurately predicts the response you might have. The box learns and reflects you more accurately when given more data. When you visit BibleGateway.com, the box learns to respond more like an evangelical. Read Breitbart or watch MSNBC? This box learns how you vote. Already corporations like Google, Amazon, and Facebook have these boxes, in the form of AI models, powered by their oceans of user data.4
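At its core, the "black box" is just a statistical model trained on behavioral data. As a deliberately simplified sketch (the class name, behaviors, and labels here are invented for illustration; real systems use far richer features and far more sophisticated models), a frequency-based predictor might look like:

```python
from collections import Counter, defaultdict

# Toy "black box": learns to predict a likely response from observed
# behavior. All data below is invented for illustration only.
class MimicryModel:
    def __init__(self):
        # Maps each observed behavior to a tally of responses seen with it.
        self.counts = defaultdict(Counter)

    def observe(self, behavior, response):
        # Each observation strengthens the link behavior -> response.
        self.counts[behavior][response] += 1

    def predict(self, behavior):
        # Return the most frequently seen response for this behavior,
        # or None if the behavior has never been observed.
        if not self.counts[behavior]:
            return None
        return self.counts[behavior].most_common(1)[0][0]

model = MimicryModel()
model.observe("visits:biblegateway.com", "responds:evangelical-framing")
model.observe("visits:biblegateway.com", "responds:evangelical-framing")
model.observe("reads:breitbart.com", "votes:right")
print(model.predict("visits:biblegateway.com"))  # responds:evangelical-framing
```

The point of the sketch is the scaling behavior the article describes: every additional observation makes the prediction more confident, which is why "oceans of user data" matter so much.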
More frightening still, what should we as citizens expect the future to look like when corporations and governments test language, pricing, advertising, and whatever else they wish against mathematical facsimiles of ourselves to understand how we might respond? We should expect we might act however they want. Algorithmic mimicry gives malicious and beneficial entities alike the ability to optimize and fine-tune propaganda until desired outcomes become statistically likely, all without informed consent. Nowhere is this horror of AI more apparent than in the People’s Republic of China, where the new “social credit score” demonstrates a new frontier in controlling a populace.5
The Orwellian scenarios brought about by algorithmic mimicry grow in severity in direct proportion to the amount of data available to the AI model.6 5G represents the final missing piece for collecting enough data on users to make algorithmic mimicry unassailable.
5G, the next generation of wireless Internet, promises data rates up to 100 times higher and transmission up to 70 times faster, all with less power consumption; this will lead to a proliferation of connected devices everywhere.7 This network of devices, sometimes called the Internet of Things, means everyday objects like your thermostat will be reliably Internet connected in the future. By way of example, your thermostat of the future might be aware of your GPS coordinates, allowing your house to cool when you leave and warm by the time you return from work. What can we expect if these connected devices have privacy policies that allow them to monitor the home? We can expect them to share what they hear with law enforcement.8 We can even expect advertisements for a divorce attorney when a device notices higher levels of yelling in the kitchen, complete with a statistically “fair” set of divorce terms generated by algorithm.9
If past is prologue, we can only assume entities will apply collected data to their models of citizens for economic and political advantage. Even worse, these soon-to-be-ubiquitous devices, when combined with AI, might be optimized so that citizens do not even notice their subtle influence. In short, the combination of omnipresent, distributed, and Internet-connected data-gathering devices with centralized, fast, and accurate AI modeling represents the coming deathblow to privacy.
In March 2019, Thomas Friedman claimed, “The world has gone from flat, to fast, to deep.”10 AI and 5G, when combined, are the catalytic mechanism by which his so-called “deep technology” age will accelerate. Deep surveillance, deep propaganda, deep learning: this new, deep world will model, predict, and shape its citizens, customers, and users. What does the deep technology age already look like? It looks like Target knowing your daughter is pregnant before you do.11 It looks like computer-generated pornography where any person you desire will have sex with any other person you desire, including yourself.12 It looks like refusing to sell you a train ticket because of the friends you keep.13
As believers, how might we respond to this dystopian future? There are three means we should leverage to defend against these negative outcomes: our ballot, wallet, and wisdom.
First, believers should show up and vote for politicians who vigorously defend privacy. Historically, politicians have had little incentive to do so.14 Governments often have the most to gain by violating the privacy of their citizens, and they remain reluctant to limit institutions that give them control over and understanding of their citizenry.15 For example, microtargeting of propaganda on Facebook for political gain likely represents the greatest advance in political technology since the attack ad. By contrast, Christians will likely lose when governments can quickly identify effective evangelical messages.16
Second, believers should choose products, corporations, and services that understand the true economics of data. The assumption that data is an asset is false; privacy flourishes only when data is seen as a liability on the balance sheet. When data is viewed as an asset, corporations do everything within their power to retain, augment, catalogue, and share it with one another. Ubiquitous, connected devices represent the collection of the raw material, while AI and statistical modeling represent the refining process. When data is instead viewed as a liability, companies store only what is absolutely necessary and destroy the rest as soon as possible. No company keeps toxic waste around for nostalgia. By trading our data for “free” services, we have established a terrifying and fallen economic incentive: companies must extract revenue from the only resource given to them, the data of their users.17 The exponential decline in data storage costs only exacerbates this effect, as nothing ever becomes too expensive to keep.18 Believers must look to the future and demand ephemerality. You may have nothing to hide today, but it would be wise to remember that what you do now might need to be hidden tomorrow.

Finally, the most powerful tool believers have in the struggle for privacy is wisdom, applied knowledge from the Word. While Solomon teaches that “there is nothing new under the sun,”19 humbling words for any technologist to hear, Scripture has much to say about discretion. In John 2, Jesus Himself seemed to know when to withhold information from others: “But Jesus on his part did not entrust himself to them, because he knew all people.”20 We must be shrewd in discerning the moments we might be gifting data to those opposed to peace for the least, last, and lost. Believers must pray for insight and Christ’s discretion when engaging with the technology the world brings forth.
This post was published December 16, 2019
Becker is the CEO at PKC. Before starting PKC, Becker worked at Microsoft Research, Bridgewater, and the Columbia University Intrusion Detection Systems Lab, and built secure communication systems in war zones. He has published research in cybersecurity and machine learning and graduated with a Computer Science degree from Princeton University. Becker sits on the board of Dreams InDeed International and was formerly Chair of the Computer Science Advisory Council at CalState Dominguez Hills.