Can Human-AI Collaboration Enhance Empathy?
New research finds AI could help people provide support to others in need.
Can artificial intelligence help people be more supportive of one another in online conversations?
A new study has found that an AI system helped people feel they could engage more empathically in asynchronous, text-based peer-to-peer support conversations online. Artificial intelligence has been used to collaborate with humans in many industries, including customer service, healthcare, and mental health. Human-AI collaboration is more common for low-risk tasks, such as composing emails with Gmail's Smart Compose real-time assisted writing or checking spelling and grammar. Augmentation with AI, rather than replacement by AI, is an important approach, especially in areas that require human judgment and oversight, such as mental health care and high-risk military decision making.
The AI system in the study, named HAILEY (Human-AI coLlaboration approach for EmpathY), is an "AI-in-the-loop agent that provides people with just-in-time feedback" on words to use to provide support empathically. This sets it apart from AI-only systems that generate text responses from scratch, without collaboration with a person.
The collaborative AI system offers real-time suggestions on how to express empathy in conversations, based on words the user has provided. The user can then accept or reject the suggestion, or reword their response based on the AI feedback, retaining ultimate authorship over the final wording. The goal is to augment the interaction through human-AI collaboration rather than replace human interaction with AI-generated responses. The study examined the impact of providing this AI system as a consultant to peer members of TalkLife, an online support platform on which peers support one another through asynchronous posts rather than real-time chat or text messaging.
The research team found that the human-AI collaborative approach led to a nearly 20% increase in feeling able to express empathy, as self-rated by the peer supporters who consulted the AI system, compared with those who did not have access to it. Peer supporters who said they usually find it difficult to provide support reported an even larger increase (39%) in feeling able to express empathy. Of note, the study did not measure "perceived empathy." In other words, the people receiving support did not rate the responses for empathy or compare human-AI collaborative responses to human-only ones; empathy was rated by those writing the responses with the AI system's help. Future research on perceived empathy will be needed to examine the potential positive impact on other online forums and social networks.
Overall, this study represents promising, innovative research demonstrating how human-AI collaboration can help people feel more confident about providing support. Empathy, for many, is a skill learned through modeling and practice. A scalable empathy-coaching tool that can be integrated into online communication could be a game changer for empathic design: an example of human-AI collaboration facilitating more positive human connection.
Marlynn Wei, M.D., J.D., is a board-certified Harvard and Yale-trained psychiatrist and therapist in New York City.