Technocrats Want Us to Pray to Machines

If God is dead, praying to machines is permitted—perhaps even necessary. And if God is not dead? Well, you can pray to machines, anyway. That appears to be the technocratic plan as we move into the future. As with many such dreams, it overlooks our need for other human beings, with all the drama and messiness that entails. That need is innate, as is our yearning for transcendence.

As progress marches on, people are being severed from their organic communities and the traditional rites that hold them together. Don’t worry, though. There’s an app for that.

Last April, as Covid restrictions were being lifted, Robert Jones at PRRI discovered Facebook had quietly rolled out a “prayer posting” feature for its religious users. As Gizmodo reported on June 3, the platform now provides a “pray” button to click whenever a prayer request is posted to a faith-based group. It’s analogous to the vapid “like” icon, except the “pray” button is supposedly directed heavenward.

The Gizmodo writer, Shoshana Wodinsky, correctly notes that “prayer posts” allow this data-hungry corporation to dig deeper into human souls—the grieving mother, the repentant adulterer, the doubting Thomas. One obvious reason is to bombard the faithful with targeted ads. The spiritual data is also being harvested to add to detailed dossiers on millions of people. Along with many other tech platforms, Facebook uses these abstract digital doubles to predict and direct future behavior.

Once you know exactly what the faithful are after, it’s possible to create the perfect artificial god, like a carefully carved puzzle piece sliding into place.

A Facebook spokesperson explained, “During the COVID-19 pandemic we’ve seen many faith and spirituality communities using our services to connect, so we’re starting to explore new tools to support them.” A more accurate statement would be “We’re exploring new tools to probe and manipulate our users.”

However one interprets the Covid lockdowns, their effect has been to separate us from each other, as well as from our communal traditions. The unbroken continuity of the ancient rites—Christian, Jewish, Muslim, Hindu, Buddhist, Sikh—was severed in an instant. Across the planet, communion with the divine was forced online, digitized, and sifted for content.

The spiritual effects of this policy are unclear, but the psychological impact is well known. It’s a grim amplification of cultural trends already underway. For decades, tech companies have positioned themselves between human beings and the objects of our deepest longing. As we’re peeled apart and isolated, digital devices are provided to fill the resulting void.

A recent Associated Press poll found that nearly a fifth of adults in America, some 46 million people, say they have “just one person or nobody they can trust for help in their personal lives.” Among young people, a recently published longitudinal study of 217 students at Dartmouth College found that their rates of depression and anxiety have shot through the roof over the past year.

Since last fall, two Dartmouth students have committed suicide. Two others perished from unknown causes. Of course, none have died from Covid.

The methodology of the Dartmouth study is of particular interest. Each student installed the StudentLife app on his or her smartphone to collect “sensing data” drawn from GPS, accelerometers, and the phone’s lock/unlock status. This data was used to analyze the students’ stress levels and sleep patterns, and to infer mood.

To no one’s surprise, the researchers concluded the Covid crisis wreaked havoc on the kids’ mental health. You could ask any of their mothers and she’d probably tell you the same, but who needs maternal intuition when scientists have “smartphone sensing data”? The fact that the initial lockdown policies were largely informed by the flawed Imperial College computer simulation only increases the irony.

As we survey the resulting antisocial environment, an important question remains: how can anyone help unstable souls through troubled waters when they’re forced into isolation—or worse, when they choose to remain isolated?

In the Old Normal, a caring friend or concerned adult might sit down and talk a person through it. Primitive techniques such as eye contact, empathy, and hugging might be employed. No need for that now, though. There are plenty of apps to simulate interpersonal connection.

Woebot is the most successful to date, recently granted Breakthrough Device designation by the FDA and boosted by the New York Times. Patients cuddle up with their smartphones and text their innermost troubles to this touchscreen therapist. Over time, its AI algorithms come to know that person inside out. According to corporate promotional materials, “Woebot’s breakthrough is its ability to form a therapeutic bond with users…we’re defining what it means to connect positively with technology in the modern world.”

In a recent study, published in the same journal as the Dartmouth paper, researchers determined that Woebot can achieve a “human-level bond” within three to five days, which they claim is on par with a human therapist. Apparently, this “relational agent…could mark a foundational step toward purely digital solutions’ ability [sic] to meet surging demand for mental health care.”

By Joe Allen

Read Full Article on AmericanThinker.com