Your therapist, the chatbot

How are people using AI for therapy?

A growing number are sharing their anxieties, frustrations, and darkest thoughts with AI chatbots, seeking advice, comfort, and validation from a sympathetic digital helper. There are hundreds of phone apps that pitch themselves as mental health tools. Wysa, which features a cartoon penguin that promises to be a friend “that’s empathetic, helpful, and will never judge,” has 5 million users in more than 30 countries. Youper, which has more than 3 million users, bills itself as “your emotional health assistant.” But many people use generalist chatbots like OpenAI’s ChatGPT as stand-in therapists, or AI companion platforms like Character.AI and Replika, which offer chatbots that appear as humanlike virtual friends and confidants. A recent study found that 12% of American teens had sought “emotional or mental health support” from an AI companion. Proponents say AI therapy could help fill gaps in a health-care system where talk therapy is expensive and often inaccessible. Replika founder Eugenia Kuyda said she’s received lots of emails from users “saying that Replika was there when they just wanted to end it all and kind of walked them off the ledge.” But mental health experts warn that chatbots are a poor substitute for a human therapist and have the potential to cause real harm. “They’re products,” said UC Berkeley psychiatrist Jodi Halpern, “not professionals.”

How do people engage with the chatbots? 

It might be as simple as asking a bot for advice on how to handle stressful situations at work or with a loved one. Kevin Lynch, 71, fed ChatGPT examples of conversations with his wife that hadn't gone well and asked what he could have done differently. The bot sometimes responded with frustration—like his wife. But when he slowed down and softened his tone, the bot's replies softened as well. He's since used that approach in real life. "It's just a low-pressure way to rehearse and experiment," Lynch told NPR. Other people use AI bots as on-call therapists they can talk to at any time of day. Taylee Johnson, 14, told Troodi—the mental health chatbot in her child-focused Troomi phone—her worries about moving to a new neighborhood and an upcoming science test. "It's understandable that these changes and responsibilities could cause stress," replied Troodi. Taylee told The Wall Street Journal that she sometimes forgets Troodi "is not a real person." Kristen Johansson, 32, has relied on ChatGPT since her therapist stopped taking insurance, pushing the cost of a session from $30 to $275. "If I wake up from a bad dream at night, she is right there to comfort me," Johansson said of the chatbot. "You can't get that from a human."

What are the dangers? 
