Record numbers of people are turning to AI chatbots for therapy, reports Anthony Cuthbertson. But recent incidents have uncovered some deeply worrying blind spots in a technology that is out of control
I was a physiotherapist, and the AI recommendations for physical/mechanical health feel like someone grabbed a diagnosis from a lucky dip of options. They sound very professional but don’t specifically diagnose the client’s issues.
The AI therapist question is a very good one: is it better to have an AI therapist than none at all?
The evidence so far shows that the answer is a resounding “no”. LLM bots have suggested means of suicide to people in crisis and encouraged unhealthy behavior in people with eating disorders. They are dangerous in such roles and should never be used in place of a therapist.
No therapy is better than a “therapist” that tries to murder you.