When AI Automates Relationships

Publish date: 2024-10-02

As we assess the risks of AI, we are overlooking a crucial threat. Critics commonly highlight three primary hazards: job disruption, bias, and surveillance. We hear that AI will cost many people their jobs, from dermatologists to truck drivers to marketers. We hear how AI turns historical correlations into predictions that enforce inequality, so that sentencing algorithms predict more recidivism for Black men than for white ones. We hear that apps help employers and authorities watch people, such as Amazon tracking which drivers look away from the road.

What we are not talking about, however, is just as vital: What happens to a human relationship when one side of it is mechanized?

The conventional story of AI’s dangers is blinding us to its role in a cresting “depersonalization crisis.” If we are concerned about increasing loneliness and social fragmentation, then we should pay closer attention to the kind of human connections that we enable or impede. And those connections are being transformed by an influx of technology.

As a researcher of the impact of technology on relationships, I spent five years observing and talking to more than 100 people employed in humane interpersonal work like counseling or teaching, as well as the engineers automating that work and the administrators overseeing it. I found that injecting technology into relationships renders that work invisible, forces workers to prove they are not robots, and encourages firms to overload them, compressing their labor into ever smaller increments of time and space. Most importantly, no matter how good the AI, there is no human relationship when one half of the encounter is a machine.

At the heart of this work is bearing witness. “I think each kid needs to be seen, like really seen,” Bert, a teacher and private school principal, told me. (All names in this article have been changed to protect privacy.) “I don’t think a kid really gets it on a deep level. I don’t think they are really bitten by the information or the content until they feel seen by the person they’re learning from.”

Many people depend on seeing the other clearly to make their contribution: clients healing, students learning, employees staying motivated and engaged, customers being satisfied. I came to call this witnessing work “connective labor,” and it both creates value and, for many, is profoundly meaningful. Pamela, an African-American teacher in the Bay Area, recalled how her own middle school teacher took the time to find out that her selective mutism was a response to her family moving incessantly. “I thought, ‘I want to be that teacher for my kids in this city. I want to be the teacher that I wanted, and that I needed, and that I finally got.’”

Yet this labor is nonetheless threatened by automation and AI. Even therapy, one of the professions most dependent on emotional connection between people, has seen inroads from automated bots, from Woebot to MARCo. As Michael Barbaro noted on The Daily when ChatGPT responded to his query about being too critical: “Ooh, I’m feeling seen—really seen!”


Technologists argue that socioemotional AI addresses problems of human performance, access, and availability, which is a bit like the old joke about guests at a Catskills resort complaining that the food is terrible, and such small portions! It is certainly true that human connective labor is fraught, full of the risk of judgment and misrecognition, the kind Pamela repeatedly faced until she met the middle school teacher who finally listened to her. Yet the working conditions of connective labor shape people’s capacity to see the other.

“I don’t invite people to open up because I don’t have time,” said Jenna, a pediatrician. “And that is such a disservice to the patients. My hand is on the doorknob, I’m typing, I’m like, ‘Let’s get you the meds and get you out the door because I have a ton of other patients to see this morning.’”

Veronica’s experience demonstrates some of the costs of socioemotional AI. A young white woman in San Francisco, she was hired as an “online coach” at a therapy-app startup, to help people interact with the app. She was prohibited from giving advice, but the clients seemed happy to think of the coaches as private counselors. “I loved feeling like I had an impact,” she said.

Yet, despite both the personal significance and emotional wallop of the work, Veronica’s own language joined in minimizing her effect. She “loved feeling like I had an impact,” but quickly followed that with “Even though I wasn’t really doing anything. I was just cheering them on and helping them work through some hard things sometimes.” Just as AI obscures the invisible armies of humans who label data or transcribe audio, it erases the connective labor of the very human workers it relies on.

Veronica also found herself facing a new existential task: proving that she was human. “A lot of people were like, ‘Are you a robot?’” she told me. I asked her how she countered that impression. “I basically just tried to small talk with them, ask another question, maybe share a little bit about myself if it was appropriate.” In essence, Veronica’s connective labor, normally the quintessential human activity, was not enough to convey her humanness, which she had to verify for a clientele accustomed to machines.

Finally, Veronica may have found the work moving, humbling, and powerful, but she left because the firm increased the client roster to untenable levels. “Toward the end they were trying to model everything using algorithms, and it’s just like, you can’t account for the actual emotional burden of the job in those moments.” Already convinced that the coaches were nothing but handmaidens to the app, the firm piled on new clients heedlessly.

In the midst of a depersonalization crisis, “being seen” is already in too short supply. The sense of being invisible is widespread, animating working-class rage in the U.S. and abroad, and rife within the social crisis of “deaths of despair,” the suicides and overdoses that have radically lowered life expectancy.

While many remain close to family and friends, there is one kind of relationship that has changed: the “weak ties” of civic life and commerce. Yet research shows that these ties help to knit together our communities and contribute to our health. A 2014 Canadian study entitled “Is Efficiency Overrated?” found that people who chatted with their barista derived greater well-being benefits than those who breezed right past.

The solution that Big Tech offers to our depersonalization crisis is what it calls personalization, as in personalized education or personalized health. These advances seek to counter the alienating invisibility of standardization, so that we are “seen” by machines. But what if it is important, for us and for our social fabric, not just to be seen, but to be seen by other people?

In that case, the working conditions of jobs like those of Bert, Jenna, and Veronica are consequential. Policies to limit client or student rosters and hours worked would help reduce overload for many groups, from medical residents to public school teachers to domestic workers, as would the National Domestic Workers Bill of Rights recently proposed in Congress.

We should also rein in some of the pervasive enthusiasm for data analytics, as its data-entry requirements routinely fall on the very people charged with forging connections. Just as important is the looming imposition of new technologies taking aim at connective labor. At the very least, socioemotional AI should be labeled as such, so that we know when we are talking to a robot and can recognize, and choose, human-to-human connections. Ultimately, however, we all need to take responsibility for protecting the human bonds in our midst, because these are the unheralded costs of the AI spring.

AI is often sold as a way of “freeing up” humans for other, more meaningful work. Yet connective labor is among the most profoundly meaningful work humans do, and technologists are nonetheless gunning for it. Humans are imperfect and judgmental, to be sure, but we also know that human attention and care are a source of purpose and dignity, the seeds of belongingness and the bedrock of our communities; yet we tuck that knowledge away in service to an industry that contributes to our growing depersonalization. What is at risk is more than an individual or their job: it is our social cohesion, the connections that are a mutual achievement between and among humans.