We had the pleasure of interviewing Dr. Jose Hamilton Vargas--the co-creator of Youper, an artificially intelligent personal assistant that helps manage emotional and behavioral health--about the benefits and risks of using AI chatbots to assist therapy and mental health care.

Q: Many are under the misconception that the field of psychiatry and cutting-edge tech are like oil and water. You're a highly qualified and experienced psychiatrist; how do you see the relationship between psychiatry and new technologies?

A: It's interesting that despite medicine being a field that is always evolving and adopting new technologies, I must admit that doctors are not the best examples of tech early adopters. It's no different with psychiatrists. Sometimes, we need a little incentive from our patients.

The development of new technologies is matched by great public demand. A recent poll showed that about 70% of Americans are interested in using mobile apps to self-monitor and self-manage their mental health.

The most recent advance is the rise of AI-powered personal assistants. Ten years ago, AI assistants were the stuff of science fiction. Today, you can find dozens of them in the app store. In the medical field, AI assistants are poised to become first responders to medical problems. They are smart, focused agents who are available to help you anytime, anywhere.

I believe it's safe to say that AI has the potential to improve mental health care. It won't replace doctors and therapists, but it will help them get better at taking care of our bodies and minds.

Q: Where does AI fit in therapy today, and how far can you see it going in the future?

A: In the United States, 43.8 million people live with emotional and behavioral health issues, like anxiety and depression. Of those, 60% did not receive treatment for their condition in the last year. The main reason people go without treatment for their mental illness is that they don't have access to medical care.
Often, this is because they can't afford medical treatment or, equally sadly, because they are afraid of the stigma attached to mental illness. AI can help doctors lower both of these barriers by providing mental health support to people who might otherwise go without care.

For a traditional consultation with a psychiatrist, you can expect to pay something in the ballpark of $100–300 per hour. AI assistants, on the other hand, might cost you a fraction of that. Even if it doesn't completely eliminate the cost of treatment for a mental condition, it does go a long way towards reaching people who couldn't afford treatment otherwise.

At the same time, AI assistants are less intimidating than medical professionals. Rather than worrying about being seen walking into a therapist's or psychiatrist's office, you can use an AI app in the privacy of your own home. If you feel like you can't admit your true feelings to another human being, even a medical professional, you might find relief in talking to an AI assistant.

Q: Personalization is integral to the evolution of mental health care. What is the relationship between AI for mental and behavioral health and personalization?

A: I've been a psychiatrist for 12 years, helping people who suffer from a wide range of mental health issues. Depression and anxiety-related problems are the most frequent conditions among the population. Whatever the case, the most common thing I've heard from patients is: "It took me years until I dared to reach for help." This is in line with studies showing that it takes an average of ten years for someone with mental health issues to see a healthcare professional.[1]

Until people finally see a healthcare professional, they try to overcome the problem by going through all kinds of self-help solutions. However, the challenge is finding a solution that fits your needs and style.
In fact, research shows that several effective therapy approaches can be successfully applied as guided self-help or internet-based treatments for anxiety, depression, and other issues, but without personalization it's impossible to achieve success in mental health care.

Q: Part of the future of mental health and wellness tech (and the global mental health movement) is the importance of taking into account cultural differences. How can AI for mental health take this into consideration?

A: Personalization and taking into account individual differences, including cultural differences, is key for self-help approaches and professional mental health care alike. Meaningful and actionable information is what you need to make both approaches more personalized. We're biologically constrained by a limit to the volume, speed, and complexity of information that we can understand. Scientists find that when we remove these human limitations and employ machine intelligence to process data at a superhuman scale, a comprehensive picture of each individual emerges.

AI-powered health tools can empower individuals with the information they need to better understand their condition. These tools can also give doctors a more timely, holistic picture of the patient's health, helping them make more informed decisions and allowing them to spend more time on patient care and prevention.

Q: Let's talk clinical. Youper can detect symptoms of depression, anxiety, and social anxiety. How far away are we from using AI for diagnosis? Do you think AI should be used like this?

A: Detecting symptoms is a first step, but it's not a substitute for medical diagnosis. Healthcare is extremely complex and, of course, there may be very serious consequences if anything goes wrong. Unlike other areas of tech, you cannot launch a minimally viable product in healthcare--you have to be sure that your technology is safe and accurate.
It's vital that AI is deployed in such a way that it does not replace human contact and judgement, especially where complex needs have to be assessed or decisions made about treatment plans. In my opinion, AI will first supplement and help doctors with diagnostic decision making, and support patients by monitoring their conditions between visits to doctors' offices.

Q: Self-guided approaches are important to taking control over one's mental health. Is there a risk of over-dependence on AI, as has been described in human-human therapeutic interactions?

A: One of the biggest concerns about taking medication for a mental health condition is the danger of becoming dependent on it. Comparing effect sizes in pharmacotherapy and psychotherapy trials is difficult, but trials comparing drug treatment versus cognitive behavioural therapy in some disorders suggest that drugs can have faster effects, while the effects of cognitive behavioural therapy may last longer.[2] Findings from studies suggest that most people who respond to an SSRI will relapse within a few months if the drug is discontinued after acute treatment, and about 25% of people who respond to SSRI treatment and continue drug treatment will relapse within six months. By contrast, the effects of psychological interventions are generally well maintained at follow-up, and participants can continue to apply new skills and make further gains after the end of acute treatment. For this reason, and because of the lower risk of side effects, psychological interventions should be preferred over pharmacological interventions for initial treatment.[3]

Psychological interventions delivered using AI follow the same logic, with the difference that AI will be used as a day-to-day tool for practicing newly developed mental health skills. The risk of over-dependence on AI is the same as for any technology.
Q: We are making huge strides in understanding how to enhance engagement with digital health, which is arguably even more important in mental health care given the importance of the digital therapeutic relationship. What is your take on this, and how is it approached with Youper?

A: Despite demonstrated efficacy, digital interventions are characterized by relatively poor adoption and adherence. A hypothesized reason for this lack of adherence is the loss of the human interactional quality that in-person therapy retains.

That is a lesson we learned at Youper. We worked for several months without rest to build the first version, assembling the most effective psychological techniques, initially to help people overcome social anxiety. Within the first week of launching, our team received a great surprise: people all over the world were using Youper and leaving impressive, passionate feedback about their improvement and how the app was having a life-changing impact. Our whole team celebrated, and we even had a small party to cement that achievement in our memories.

After coming down from the excitement of the positive feedback, we turned to the data. People were indeed having significant symptom reduction, at least those who were committed to using the app. I got a kick in the stomach when I looked at our behavior analytics dashboard and saw that only a small percentage of users were really engaged. What was happening? Where had we gone wrong?

The psychological protocols and techniques were in place, so we needed insights from users. Our data told us who was engaged and who wasn't, and both groups had opinions that could help us understand what was working well and what wasn't. Through the user interviews, a light appeared at the end of the tunnel. We realized the right solution wouldn't be an app or an online program. The feedback showed us that people were looking for something more personal, like a confidant or a friend to share sensitive things with and get support from.
At that moment, the Youper Emotional Health Assistant was born. To build this assistant, we researched extensively, went deep into technologies that could make it a reality, like artificial intelligence, and pushed ourselves to the limit to create a fantastic user experience. Our hard work proved worthwhile: user engagement more than tripled, and Youper became the most beloved assistant for emotional and behavioral health in the App Store and on Google Play.

Q: Youper is the world's most beloved assistant for emotional and behavioral health. As the field develops, how does Youper plan to stay ahead of the game in a rapidly evolving field?

A: We are on a mission to help everyone on the planet become the best version of themselves, like being a super you. In fact, Youper is You + Super. To keep us on track and ahead of the game, we'll keep doing what made Youper the most beloved assistant on the planet: listening to our users. We believe that crafting amazing user experiences and delivering real health outcomes are only possible when we put users first and use technology, like artificial intelligence, to push forward the human potential. We have great launches ahead, including partnerships with universities and health centers to help our users take control of their emotional health.

Dr. Jose Hamilton Vargas, CEO and Co-Founder of Youper

Jose Hamilton Vargas is a psychiatrist with over ten years of experience in clinical practice, the author of two books, and the co-founder and CEO of Youper, a personal assistant that helps manage your emotional and behavioral health. He cares most about using technology, particularly artificial intelligence, for good.

References
1. Wang, P., Berglund, P., Olfson, M., & Kessler, R. (2004). Delays in initial treatment contact after first onset of a mental disorder. Health Services Research, 39(2), 393-416. doi: 10.1111/j.1475-6773.2004.00234.x
2. Heimberg, R., Liebowitz, M., Hope, D., Schneier, F., Holt, C., Welkowitz, L., et al. (1998). Cognitive behavioral group therapy vs phenelzine therapy for social phobia. Archives of General Psychiatry, 55(12), 1133. doi: 10.1001/archpsyc.55.12.1133
3. Mörtberg, E., Clark, D., & Bejerot, S. (2011). Intensive group cognitive therapy and individual cognitive therapy for social phobia: Sustained improvement at 5-year follow-up. Journal of Anxiety Disorders, 25(8), 994-1000. doi: 10.1016/j.janxdis.2011.06.007