A study recently published in the Journal of Social and Personal Relationships found that using artificial intelligence applications to help craft a message to a friend is not a good idea, especially if your friend finds out about the use of AI. I’m sure this applies to using AI applications to help craft a message to an ex as well, and even to using a coach like me to draft your texts to your ex.
Researchers found that people believe that a friend who used AI assistance to write them a message didn’t put forth as much effort as a friend who wrote a message themselves.
The study involved 208 adults who participated online. Participants were told that they had been good friends with someone named Taylor for years. They were given one of three scenarios: They were experiencing burnout and needed support, they were having a conflict with a colleague and needed advice, or their birthday was coming up.
Participants were then told to write a short message to Taylor describing their current situation in a textbox on their computer screen.
All participants were told Taylor sent them a reply. In the scenarios, Taylor wrote an initial draft. Some participants were told Taylor had an AI system help revise the message to achieve the proper tone, others were told a member of a writing community helped make revisions, and a third group was told Taylor made all edits to the message.
In every case, people in the study were told the same thing about Taylor’s reply, including that it was “thoughtful.”
Still, participants in the study had different views about the message they had supposedly received from Taylor.
Those who received a reply helped by AI rated what Taylor did as less appropriate and more improper than did those who received the reply that was written only by Taylor.
AI replies also led participants to express less satisfaction with their relationship, such as rating Taylor lower on meeting “my needs as a close friend.”
This perception may be understandable, but the effect goes beyond the message itself, said Bingjie Liu, lead author of the study and assistant professor of communication at The Ohio State University.
“After they get an AI-assisted message, people feel less satisfied with their relationship with their friend and feel more uncertain about where they stand,” Liu said.
It wasn’t just the use of technology that turned people off
One possible reason people may not like the AI-aided response could be that they think using technology to craft personal messages like these is inappropriate, and that it is inferior to a human doing so. But results showed that people responded just as negatively when Taylor had another human — a member of an online writing community — help with the message.
“What we found is that people don’t think a friend should use any third party — AI or another human — to help maintain their relationship,” Liu said.
The lower participants rated Taylor’s effort because of the AI or human help, the less satisfied they were with their relationship and the more uncertainty they felt about the friendship.
Effort is very important in a relationship
People want their partners, friends, or exes to put forth the effort to come up with their own message without help from AI or other people. They want to know how much you are willing to invest in the relationship, and if they feel you are taking shortcuts by using AI, or relying on another person to help craft a message, that’s not good.
In the study, people were more uncertain about their relationship with Taylor if they received the AI-aided response, being less certain about the statement “Taylor likes me as a close friend.”
Of course, most people won’t tell a friend that they used AI to help craft a message, Liu said. But she noted that as ChatGPT and other services become more popular, people may start doing a Turing Test in their minds as they read messages from friends and others.
The phrase “Turing Test” is sometimes used to refer to people wondering if they can tell whether an action was taken by a computer or a person.
“It could be that people will secretly do this Turing Test in their mind, trying to figure out if messages have some AI component,” Liu said. “It may hurt relationships.”
The answer is to do your own work in relationships
“Don’t use technology just because it is convenient. Sincerity and authenticity still matter a lot in relationships,” Liu said.
I 100% agree that sincerity and authenticity matter a lot in relationships. This is why I don’t encourage copying and pasting scripts (including my own) that can be found all over the internet, and why I don’t draft texts to an ex for my clients. Instead, I ask my clients to draft the texts they want to send to their ex themselves, in their own words and style, and I only review them, editing for effectiveness only if necessary.