
Hope for an AI doomer: 'Laudato Si'' responds to today's technology with promise

Illustration of a robot interacting with columns of numbers and letters. (Pixabay/geralt)

by Scott Hurd


Editor's note: This is the second of a two-part series on Laudato Si' and artificial intelligence. You can read the first part here.

Today's sensational headlines about the threats of AI never fail to fill me with alarm, or even dread. But in Pope Francis' 2015 encyclical Laudato Si', I encounter a kindred spirit. 

Francis may not be a card-carrying "AI doomer" as I am, but he has strong reservations about the effects of the "technocratic paradigm" upon human beings, our cultures and the planet we share. His words offer not just a prophetic perspective on what he sees unfolding, but also timely prescriptions on how to move forward, both for tech titans and ordinary tech users — which is pretty much everybody, including doomers like me. 

Francis pleads for the tech world to slow down, not by returning "to the Stone Age," but by considering the impact of what is being developed. "How can a society plan and protect its future," he asks, "amid constantly developing technological innovations?" 

Silicon Valley "disruptors" and "accelerationists" may "move fast and break things," in the (in)famous words of Facebook's Mark Zuckerberg, to crush the competition and woo investors, but they can fail to consider what, or who, might be disrupted or broken in the process. 

Sam Altman, CEO of OpenAI, released ChatGPT before some thought it was safe or ready, and he's now CEO of a company whose valuation exceeds the GDP of over half the world's nations. As Laudato Si' warns, "The alliance between the economy and technology ends up sidelining anything unrelated to its immediate interests" — which typically center on money and power. 

Technological products are not neutral. They can shape society, our self-understanding and how we navigate through the world. At worst, Laudato Si' warns, they condition lifestyles "along the lines dictated by the interests of certain powerful groups." 

President Joe Biden hosts a meeting on artificial intelligence June 20, 2023, at The Fairmont hotel in San Francisco. (Flickr/Official White House Photo/Adam Schultz) 

Hard questions need to be asked when developing them: "What will it accomplish?" and "What are the risks?" Technologists cannot answer these alone, for "the specialization … makes it difficult to see the larger picture." They might insist that they seek to "save humanity" and to construct an AI that's "aligned" with humanity's goals. But just what are those goals, and what are we being saved from, or for? 

What's needed, Francis implores, is "a broad, responsible, scientific and social debate … capable of considering all the available information." Because "a technology severed from ethics will not easily be able to limit its own power." 

That's why Laudato Si' calls for tech power to be restrained. Unless limits are imposed, it warns, "the techno-economic paradigm may overwhelm not only our politics but also freedom and justice." It can even "undermine the sovereignty of individual nations." 

Yet the tech industry fights such restraints: in 2022, it spent $69 million lobbying Capitol Hill lawmakers, lending credence to the encyclical's assertion that "our politics are subject to technology and finance." 

The ChatGPT app is seen on a phone placed atop a keyboard in this photo taken in Rome March 8. (CNS/Lola Gomez) 

In the face of this, Francis exhorts, public pressure needs to force "decisive political action," and global regulatory norms are required "to impose obligations and prevent unacceptable actions." We can't bury our heads in the sand, "pretending that nothing will happen," he says. If we do, unbridled technology "ends up considering any practice whatsoever as licit."

One possibility Francis suggests is "boycotting certain products." In this vein, I refuse to use OpenAI's ChatGPT — a product of theft, having been trained on massive amounts of copyrighted published material without permission or compensation. Not to mention the conditions endured by Kenyan workers who helped build it, being subjected to, as Laudato Si' puts it, "what they would never do in developed countries or the so-called first world." 

But now that Microsoft has invested billions into OpenAI, must I stop using Outlook and Teams and LinkedIn (owned by Microsoft)? I can't function professionally without them. And what about Google and Facebook, both of which I've used for years? I've never had an account on X/Twitter, but Francis himself posts there frequently. 

It seems that some degree of enmeshment is unavoidable for many, as both organizations and individuals fear being left behind by the AI avalanche — something I feel acutely as I near 60. 

A woman wears virtual glasses using artificial intelligence. (CNS/Reuters/Yves Herman) 

Resistance might involve refusing to own stock in irresponsible AI developers, as some divest from fossil fuels: a costly, principled stance, given that tech firms are among today's best stock market performers. To walk away from AI is to walk away from cash. Others might choose to remain invested, not to increase their wealth, but to use their shareholder power to push for change. 

And might there be protests at AI data centers by those who care for creation, given the colossal amounts of energy they consume? The bottom line, insists Laudato Si', is the "great need for a sense of social responsibility on the part of consumers."

The need for social responsibility, implores the encyclical, is part of a greater need: that "we look for solutions not only in technology but in a change of humanity." Tech saturation inhibits us "from learning how to live wisely, to think deeply and to love generously," and we cannot expect tech to benefit us if "humanity loses its compass." 

"The existence of laws and regulations is insufficient" to help us shape a more hopeful future. What's needed is a "bold cultural revolution" that involves discarding our "misguided lifestyle" and changing how we relate to tech, which we "have the freedom needed to limit and direct" toward positive ends that are "healthier, more human, more social, more integral." Because if we don't, we'll suffer further "anxiety" and "loss of the purpose of life," leaving us to seek "new forms of escapism to help us endure the emptiness."

"Our technical prowess has brought us to a crossroads," concludes Laudato Si'. "(W)hat is at stake is our own dignity" and "the ultimate meaning of our earthly sojourn." Yet there is "always a way out," thanks to the "authentic humanity" that "seems to dwell in the midst of our technological culture, almost unnoticed." 

"No system" — including AI — "can completely suppress our openness to what is good, true and beautiful, or our God-given ability to respond to his grace at work deep in our hearts." 

Such a vision gives hope to this AI doomer, who finds it prescient that years before ChatGPT awakened many to the power and perils of AI, Laudato Si' prophetically taught that care of our common home requires urgent concern about the impacts of tech. And this warning came none too soon. Because, as Sam Altman has promised, what's coming next will make ChatGPT look "very quaint." 
