Technocrats Want Us to Pray to Machines

If God is dead, praying to machines is permitted—perhaps even necessary. And if God is not dead? Well, you can pray to machines, anyway. That appears to be the technocratic plan as we move into the future. As with many such dreams, it overlooks our need for other human beings, with all the drama and messiness that entails. That need is innate, as is our yearning for transcendence.

As progress marches on, people are being severed from their organic communities and the traditional rites that hold them together. Don’t worry, though. There’s an app for that.

Last April, as Covid restrictions were being lifted, Robert Jones at PRRI discovered that Facebook had quietly rolled out a “prayer posting” feature for its religious users. As Gizmodo reported on June 3, the platform now provides a “pray” button to click whenever a prayer request is posted to a faith-based group. It’s analogous to the vapid “like” icon, except the “pray” button is supposedly directed heavenward.

The Gizmodo writer, Shoshana Wodinsky, correctly notes that “prayer posts” allow this data-hungry corporation to dig deeper into human souls—the grieving mother, the repentant adulterer, the doubting Thomas. One obvious reason is to bombard the faithful with targeted ads. The spiritual data is also being harvested to add to detailed dossiers on millions of people. Along with many other tech platforms, Facebook uses these abstract digital doubles to predict and direct future behavior.

Once you know exactly what the faithful are after, it’s possible to create the perfect artificial god, like a carefully carved puzzle piece sliding into place.

A Facebook spokesperson explained, “During the COVID-19 pandemic we’ve seen many faith and spirituality communities using our services to connect, so we’re starting to explore new tools to support them.” A more accurate statement would be “We’re exploring new tools to probe and manipulate our users.”

However one interprets the Covid lockdowns, their effect has been to separate us from each other, as well as from our communal traditions. The unbroken continuity of the ancient rites—Christian, Jewish, Muslim, Hindu, Buddhist, Sikh—was severed in an instant. Across the planet, communion with the divine was forced online, digitized, and sifted for content.

The spiritual effects of this policy are unclear, but the psychological impact is well known. It’s a grim amplification of cultural trends already underway. For decades, tech companies have positioned themselves between human beings and the objects of our deepest longing. As we’re peeled apart and isolated, digital devices are provided to fill that void.

A recent Associated Press poll found that nearly a fifth of adults in America—totaling 46 million—say they have “just one person or nobody they can trust for help in their personal lives.” Among young people, a recently published longitudinal study of 217 students at Dartmouth College found that over the past year their depression and anxiety rates have shot through the roof.

Since last fall, two Dartmouth students have committed suicide. Two others perished from unknown causes. Of course, none have died from Covid.

The methodology of the Dartmouth study is of particular interest. Each student installed the StudentLife app on his or her smartphone to collect “sensing data” lifted from GPS trackers, accelerometers, and lock/unlock status. This data was used to analyze the students’ stress levels and sleep patterns, and to infer mood.

To no one’s surprise, the researchers concluded the Covid crisis wreaked havoc on the kids’ mental health. You could ask any of their mothers and she’d probably tell you the same, but who needs maternal intuition when scientists have “smartphone sensing data”? The fact that the initial lockdown policies were largely informed by the flawed Imperial College computer simulation only increases the irony.

As we survey the resulting antisocial environment, an important question remains: how can anyone help unstable souls through troubled waters when they’re forced into isolation—or worse, when they choose to remain isolated?

In the Old Normal, a caring friend or concerned adult might sit down and talk a person through it. Primitive techniques such as eye contact, empathy, and hugging might be employed. No need for that now, though. There are plenty of apps to simulate interpersonal connection.

The Woebot is the most successful to date, having recently been granted a Breakthrough Device designation by the FDA and boosted by the New York Times. Patients cuddle up with their smartphones and text their innermost troubles to this touchscreen therapist. Over time, its AI algorithms come to know that person inside out. According to corporate promotional materials, “Woebot’s breakthrough is its ability to form a therapeutic bond with users...we’re defining what it means to connect positively with technology in the modern world.”

According to a recent study—published in the same journal as the Dartmouth paper—researchers determined that Woebot can achieve a “human-level bond” within 3-5 days. They claim this is on par with a human therapist. Apparently, this “relational agent...could mark a foundational step toward purely digital solutions’ ability [sic] to meet surging demand for mental health care.”

A number of therapeutic apps are already available. As atomization persists alongside ad-driven normalization, the demand will only grow. It’s just a matter of time before similar machine learning programs are incorporated into humanoid robots that listen to existential crises and dispense automated gems of wisdom. In fact, that precedent is already established.

Over the past few years, robotic priests have popped up in various parts of the world. One of them stands in a 400-year-old Buddhist temple in Kyoto, Japan. This million-dollar monstrosity, named Mindar, is a silicone incarnation of the enlightened goddess Kannon. The temple’s human priest defends its existence by correctly noting that secularized Buddhists have abandoned the eight-fold path for more worldly endeavors.

“This robot will never die,” the monk told an enthusiastic Vox reporter, “it will just keep updating itself and evolving. With AI, we hope it will grow in wisdom to help people overcome even the most difficult troubles. It’s changing Buddhism.”

This bizarre shift extends across many faiths, including the Protestant retro-bot BlessU-2; the talking Catholic icon SanTO (Sanctified Theomorphic Operator); a mechanical Ganesh performing aarti in India; and Xian’er in China—a cartoonish Buddhist monk holding a touchscreen. Its stated purpose is to “reach out to people who are more connected to their smartphones than their inner being.”

Call me old-fashioned, but these digital demi-gods strike me as utterly profane. They’re the product of cynical minds with no sense of the sacred. And yet, their technophilic boosters pose a serious question: if divinity transcends the physical realm, what’s the difference between the corporeal human form and an articulate machine?

As a free-thinking Christian immersed in various spiritual communities, I find the answer obvious. These creatures have no soul. Look into their plastic eyes, and a yawning abyss gazes back. It’s like a smartphone incarnate with nobody on the line. Or maybe it’s just me.

A new generation is being conditioned to see robots as sentient, and organic beings as mere bio-machines. As they come to accept the profane as sacred, the spiritualization of social media interaction, therapeutic apps, and holy androids will be the norm. In a literal sense, the iconic arrow may become the object of worship.

Like it or not, the future is coming on fast. But if this dystopian landscape leaves you feeling depressed, don’t despair. You can always find comfort in a Woebot.

Joe Allen covers technology for the War Room: Pandemic. His work has appeared in The Federalist, ColdType, This View of Life, The American Spectator, IBCSR: Science on Religion, Disinformation, and elsewhere. Follow him @JOEBOTxyz.

IMAGE: Mindar, the Buddhist teacher in Kyoto. YouTube screen grab.

