This story also ran on Los Angeles Times.

The numbers explain why: Pandemic stresses led to millions more Americans seeking treatment. At the same time, there has long been a shortage of mental health professionals in the United States; more than half of all counties lack psychiatrists. Given the Affordable Care Act’s mandate that insurers offer parity between mental and physical health coverage, there is a gaping chasm between demand and supply.

For entrepreneurs, that presents a market bonanza. At the South by Southwest conference in March, where health startups displayed their products, there was a near-religious conviction that AI could rebuild health care, offering apps and machines that could diagnose and treat all kinds of illnesses, replacing doctors and nurses.

Unfortunately, in the mental health space, evidence of effectiveness is lacking. Few of the many apps on the market have independent outcomes research showing they help; most haven’t been scrutinized at all by the FDA. Though marketed to treat conditions such as anxiety, attention-deficit/hyperactivity disorder, and depression, or to predict suicidal tendencies, many warn users (in small print) that they are “not intended to be medical, behavioral health or other healthcare service” or “not an FDA cleared product.”

There are good reasons to be cautious in the face of this marketing juggernaut.

Decades ago, Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology who is considered one of the fathers of artificial intelligence, predicted AI would never make a good therapist, though it could be made to sound like one. In fact, his original AI program, created in the 1960s, was a psychotherapist named ELIZA, which used word and pattern recognition combined with natural language programming to sound like a therapist:

Woman: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?
Woman: He says I’m depressed much of the time.
ELIZA: I am sorry to hear that you are depressed.
ELIZA: Do you think coming here will help you not to be unhappy?

Though hailed as an AI triumph, ELIZA’s “success” terrified Weizenbaum, whom I once interviewed. Students would interact with the machine as if ELIZA were an actual therapist, when what he’d created was “a party trick,” he said.

He foresaw the evolution of far more sophisticated programs like ChatGPT. But “the experiences a computer might gain under such circumstances are not human experiences,” he told me. “The computer will not, for example, experience loneliness in any sense that we understand it.”

The same goes for anxiety or ecstasy, emotions so neurologically complex that scientists have not been able to pinpoint their neural origins. Can a chatbot achieve transference, the empathic flow between patient and doctor that is central to many types of therapy?

“The core tenet of medicine is that it’s a relationship between human and human - and AI can’t love,” said Bon Ku, director of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation. “I have a human therapist, and that will never be replaced by AI.” Ku said he’d like to see AI used instead to reduce practitioners’ tasks like record-keeping and data entry to “free up more time for humans to connect.”

While some mental health apps may ultimately prove worthy, there is evidence that some can do harm.
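For readers curious how a program like ELIZA could pass for a therapist, the word-and-pattern-recognition technique can be sketched in a few lines. This is an illustrative modern Python sketch, not Weizenbaum’s original code; the rules and fallback response are invented for the example, loosely modeled on the transcript above.

```python
import re

# Each rule pairs a pattern with a template that reflects the
# user's own words back as a question or sympathetic statement.
RULES = [
    (r"my (.+) made me come here", "Your {0} made you come here?"),
    (r"i'?m (depressed|sad|unhappy)", "I am sorry to hear that you are {0}."),
    (r"i need (.+)", "Why do you need {0}?"),
]

def eliza_reply(text: str) -> str:
    """Return the first rule's response that matches, else a stock prompt."""
    lowered = text.lower()
    for pattern, template in RULES:
        match = re.search(pattern, lowered)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."

print(eliza_reply("Well, my boyfriend made me come here."))
```

The trick, as Weizenbaum noted, is that no understanding is involved: the program only recognizes surface patterns and echoes fragments of the input back.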