With a recent explosion of AI tools that claim to do everything from writing news articles to composing eulogies, tech companies seem to be desperately trying to convince people that AI-generated content is the future. Human beings, however, seem to disagree, as they have with other technology trends once described as inevitable.
According to a new paper from researchers at Ohio State University, using AI generators to write personal correspondence like letters and text messages actually results in profoundly negative reactions from the friends and colleagues who receive them. In the study, 208 adults were presented with one of three scenarios involving a fictional “good friend” named “Taylor.” The participants were instructed to reach out to Taylor about their situation (needing emotional support, a conflict with a colleague, or an upcoming birthday) and then rate how they felt about Taylor’s reply.
In cases where Taylor used an AI tool to edit the reply, participants responded far more negatively than when they were told the reply was written by Taylor alone. People who received an AI-assisted response reported much lower opinions of Taylor, and were far less likely to agree that Taylor “meets my needs as a close friend” or “likes me as a close friend.” Overall, the study’s authors wrote, “using AI assistance led participants to perceive the friend expended less effort, reducing participants’ relationship satisfaction and increasing uncertainty.”
“After they get an AI-assisted message, people feel less satisfied with their relationship with their friend and feel more uncertain about where they stand,” said Bingjie Liu, the study’s lead author and an assistant professor at Ohio State University, in a news release accompanying the paper.
This research is part of a larger body of work on how technology affects people’s perceptions of their relationships. Previous studies have analyzed use of the Facebook “like” button, finding that receiving likes in lieu of a thoughtfully crafted response causes people to see their relationship with the liker more negatively. In both those cases and those in Liu’s paper, the reactions seem to be tied to our perception of effort: if someone responds with likes or AI-generated text, it signals a larger inequity or imbalance in the relationship that leaves the receiving party feeling distrustful and hurt.
To be fair, the participants also responded negatively when told that Taylor’s response had received non-AI forms of assistance, such as getting a friend to help draft the message. But AI tools like ChatGPT have made it far easier to produce semi-believable machine-generated content, and the fallout can be seen not only in creative industries and the internet, but in human relationships too.
“In conclusion, we found that in relational maintenance, people distinguish internal and external effort in technology-mediated communication,” wrote the paper’s authors. “It is internal effort which is valued by people and contributes to their well-being in relationships. AI augmentation, although it saves users’ effort, from the perspective of their partners may reduce perceived effort and thereby compromise relationship satisfaction and increase partner uncertainty.”
The post Your Friends Will Hate You If You Use AI to Write Texts, Science Confirms appeared first on VICE.