AI apps are supplemental, not solutions to mental health support

Diverging Reports Breakdown

Mental health crisis: The urgent need for human care

The UK is facing a deepening mental health crisis, with rising demand, long waiting times, and growing concerns over the use of AI as a stopgap solution. Waits of up to two years to see a mental health professional are common (Rethink Mental Illness), and people’s mental health often deteriorates rapidly while they wait. The NHS is addressing this critical gap by piloting AI assistants that offer therapy to people until they can be seen by a human health professional. This temporary quick fix sits uneasily: an AI bot offering advice and therapy on people’s devices encourages still more reliance on technology. AI is better placed to improve efficiency in the back office and to support diagnostics in the lab than to support a group of fragile and vulnerable people, especially as it could exacerbate their conditions. The Government needs to invest more in recruiting additional mental health nurses and offering attractive salaries, even though small-scale trials have reported that fully automated DMHI chatbots can reduce anxiety and depression.

The UK is facing a deepening mental health crisis, with rising demand, long waiting times, and growing concerns over the use of AI as a stopgap solution

New figures from the NHS show that 3,790,826 people were in contact with mental health services during 2023/24, compared with 2,726,721 in 2018/19.

Despite the Government spending £11.31 billion on mental health in 2023/24 and £11.79 billion under the Mental Health Investment Standard (MHIS) in 2024/25, this is not enough to address the crisis. Waits of up to two years to see a mental health professional are common (Rethink Mental Illness), and during that time people’s mental health deteriorates fast.

The NHS is addressing this critical gap by piloting AI assistants that offer therapy to people while they wait to be seen by a human health professional.

This temporary quick fix sits uneasily: an AI bot offering advice and therapy on people’s devices only encourages more reliance on technology. Surely AI is better placed to improve efficiency in the back office and to support diagnostics in the lab than to support a group of fragile and vulnerable people, especially as it could exacerbate their conditions. The answer is for the Government to invest more in recruiting additional mental health nurses and offering attractive salaries.

Mental health cases escalating

Mental health caseloads have escalated since the COVID-19 pandemic, lockdowns and the cost-of-living crisis. The speed of modern life and the overuse of, and reliance on, technology and social media, particularly among children, are contributing factors.

A staggering figure: more than one million children are seeking NHS support for their mental health, with 16-year-olds the most likely to do so. Ofcom reports that 46% of adolescents are online ‘almost constantly’, and that 97% of children have a smartphone by age 12.

Recruiting mental health nurses is challenging, and demand is outstripping capacity. A Royal College of Nursing survey last year revealed widespread dissatisfaction among nurses, with nearly half of respondents (45%) saying they were planning to quit or considering leaving the profession.

AI therapists

Several NHS Trusts are piloting AI apps, known as Digital Mental Health Interventions (DMHIs), to help people with mental health challenges.

Unlike the everyday apps used to support well-being, these apps are subject to standards and regulations. They must integrate into existing care pathways and delivery infrastructure, and staff must be upskilled to use them.

It is also suggested that they should supplement rather than replace face-to-face delivery. Small-scale trials have reported that fully automated DMHI chatbots can reduce anxiety and depression, and academics have suggested they could support people while they wait to be seen by a professional.

Naturally, this poses many risks, such as data privacy and safety. There is also the danger that human professionals stop overseeing what is happening with the patient, leaving the patient alone with the AI bot and ceding decision-making to it.

However, a new Dartmouth study used a generative-AI-powered chatbot to provide mental health treatment in a randomised controlled trial. The bot was fine-tuned on expert-written mental health conversations grounded in cognitive behavioural therapy (CBT). Participants who used the personalised Therabot saw significant improvements in their symptoms over eight weeks: depression symptoms fell by an average of 51%, anxiety symptoms by 31%, and eating disorder symptoms by 19%.

It is still very early days: these chatbots and apps have been trialled only briefly, with limited data on which to base such critical decisions. Can therapy work, or be effective, without human connection?

AI cannot replace the human therapist

People are using ChatGPT for therapy because it is free and instant. One benefit of speaking to an AI bot is that it does not judge, so people do not worry about what they are asking or talking about in the way they would with a human.

They can ask for advice on social or work situations, and children use it to ask about friendship issues. The answers are friendly, creative and helpful, but they come from a machine driven by an algorithm, so they are not personalised. Let’s not forget that it is a machine, and one not trained to deal with an individual’s mental health needs.

A human therapist knows the patient as a person; they know their back story, what they like and dislike, and they can read their facial expressions and body language. They challenge them with questions, forcing them to think outside the box or from another person’s perspective to help their healing journey.

An AI bot can do none of this; it lacks empathy, nuance, reflection and understanding. As humans, we are all different and do not fit neatly into an algorithm, least of all someone with complex mental health issues.

A major concern is that whatever you put into the algorithm is recorded and may be used elsewhere, raising serious security questions about people’s personal, sensitive and highly confidential data.

AI for adolescents

Any parent who has watched Adolescence, the harrowing global hit TV show, has almost certainly reduced screen time and monitored usage.

Are parents really going to be comfortable with the suggestion that, while their children wait 18 months to two years to see a mental health professional, they can use an AI bot in the interim? I would have thought not!

AI has its place

AI is the perfect tool for streamlining back-office processes and relieving mental health professionals of unnecessary burdens, enabling them to focus on patient care rather than administrative tasks and reporting. It can also improve triaging, prioritising and the screening of emails and calls in the contact centre, and highlight emergency calls.

It can also automatically send personalised appointment reminders in advance and chase patients who have missed appointments, which are very costly to the NHS. This can lead to shorter waiting times, improved patient care, and better retention of staff relieved of the overburden of admin work.

A balanced approach

Using AI to improve processes and efficiency in the back office is a no-brainer, and it can also be used in labs for precise diagnosis, monitoring and risk assessment.

But using it to plug the gap for fragile and vulnerable people, especially children waiting anywhere from 18 months to two years to see a mental health professional, is high risk.

It is an easy, quick distraction from the real issue that needs addressing: increasing investment to recruit more mental health nurses and offering them the higher pay they deserve. If we do not, the mental health crisis will keep escalating, robots or not!

Source: Openaccessgovernment.org

AI Therapy? How Teens Are Using Chatbots for Eating Disorder Recovery

A growing number of young adults are using AI to seek therapy and circumvent barriers to care, and experts say these tools can motivate people who might otherwise have abandoned mental health care. The first AI therapist hit the scene in 1966, when MIT professor Joseph Weizenbaum invented ELIZA, the first-ever chatbot. More and more people are flocking to these types of mental health tools, which experts say may be worthwhile, according to Psychiatry Online. But are they equipped to offer mental health advice, or even to stand in as digital therapists? The issue isn’t so black and white. “So long as [the tool] is built by providers who are well-informed, I think it could absolutely be helpful,” said Kelli Rugless, a licensed psychologist and clinical advisor at Project Heal, a nonprofit aimed at breaking down systemic, financial, and healthcare barriers to eating disorder treatment.

Beatriz Santos writes in her journal most days, referring to it as her “lifeline.” As a sophomore at Loyola University, she uses her journal to write down academic goals (“I want to be more motivated”), pose philosophical questions (“Why do I feel incomplete?”), and share feelings of insecurity (“I’m not smart enough”).

Santos likes to take photos of her journal entries and then upload them to ChatGPT in search of advice. Recently, when she uploaded a photo of her entry about not feeling smart enough, ChatGPT encouraged her to challenge the belief with evidence, reframe it in more empowering terms, and look for experiences where the belief proved to be false. “If you think, ‘I’m not smart enough,’” ChatGPT told her, “recall times when you learned new skills or solved different problems.”

“I realize that it’s a chatbot and not an actual person,” Santos says about this practice, noting that she doesn’t feel an emotional connection to ChatGPT. “But even for a chatbot to be able to identify a problem within my journal and be like — ‘These are feelings that you were having in response to your environment; that’s OK’ — is extremely validating, especially when you don’t really have someone to talk to about these things.” Santos has struggled with disordered eating and would like to find a therapist to help her while she’s in college and away from her support system. But a recent change in her insurance provider made the process more complicated than expected. ChatGPT, she said, has ultimately been a more accessible, affordable, and efficient option.

Santos is one of a growing number of young adults who are using artificial intelligence to seek therapy and to circumvent barriers to care — including cost, insurance challenges, and a shortage of mental health providers nationwide. These barriers are especially prevalent when it comes to eating disorders, which are commonly misunderstood and often go undiagnosed, partly due to an acute lack of medical training on how to screen for or treat these disorders. AI tools can provide additional support, but are they equipped to offer mental health advice, or even to stand in as digital therapists? According to experts, the issue isn’t so black and white.

“For a chatbot to be able to identify a problem within my journal and be like — ‘These are feelings that you were having in response to your environment; that’s OK’ — is extremely validating.”

Using AI for therapy is as old as chatbots themselves. The first AI therapist hit the scene in 1966, when MIT professor Joseph Weizenbaum invented ELIZA, the first-ever chatbot. ELIZA was a conversation chatbot, programmed to act as a psychotherapist for users by scanning for keywords and mirroring them back to the user in response, according to CBC.
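To make ELIZA’s mechanism concrete, here is a minimal Python sketch of the keyword-scan-and-mirror pattern described above. The rules are invented for illustration and are far simpler than Weizenbaum’s original script.

```python
import re

# Minimal ELIZA-style responder: scan for a keyword pattern, then mirror
# the user's own words back inside a canned template.
# These rules are illustrative, not Weizenbaum's original script.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (mother|father|family)\b", re.I), "Tell me more about your {0}."),
]

def respond(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            # Strip trailing punctuation so the mirrored text reads naturally.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."  # default when no keyword matches

print(respond("I feel invisible at school."))
# -> Why do you feel invisible at school?
```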

Since ELIZA, these tools have gotten much more sophisticated, particularly in the last few years. Now, there are far more chatbots specifically designed for mental health support — including Woebot, Wysa, and Therabot, which all rely on evidence-based treatments and have been shown to reduce users’ depression symptoms.

According to Psychiatry Online, more and more people are flocking to these types of mental health tools, which experts say may be worthwhile.


“So long as [the tool] is built by providers who are well-informed, I think it could absolutely be helpful, particularly because nationally we are very, very, very understaffed in terms of the amount of providers who specialize in eating disorders,” said Kelli Rugless, a licensed psychologist and clinical advisor at Project Heal, a nonprofit aimed at breaking down systemic, financial, and healthcare barriers to eating disorder treatment. “Every day we’re encountering all of the barriers that exist for people to get access to healing, and so the idea of an AI chatbot is actually very promising in that it could give folks who would never have the ability to access services some level of support.”

AI can be especially appealing to teens who grew up in environments where mental health was stigmatized and who don’t feel comfortable asking for help or seeking their parents’ permission to get therapy, or to people who want to take their time in talking about mental health, because a bot won’t make them feel rushed.

These tools can also motivate people who may otherwise have abandoned hope of getting care. In March, clinical psychologist Gemma Sharp published a study looking at how a chatbot provided single-session interventions for people on long waitlists for eating disorder treatment. Sharp co-designed the chatbot with other psychologists, as well as people who had recovered from an eating disorder.

Source: Teenvogue.com

Mental Health Tech: Developing Apps for Accessible Behavioral Health Solutions

Mental health apps provide tools for managing conditions like anxiety, depression, PTSD, and stress. These apps range from guided meditation platforms like Calm to cognitive behavioral therapy (CBT)-based tools like Woebot. In 2024, the global mental health app market was valued at over $4 billion, with millions downloading apps to access mental health care conveniently. Apps must comply with HIPAA (in the U.S.) or GDPR (in Europe) to protect user information. AI-powered apps analyze user inputs (e.g., journal entries) to tailor interventions, and apps promote daily mental wellness practices, like mindfulness or gratitude journaling, that help users build resilience before crises arise. They should complement, not replace, professional care, balancing AI with human oversight to avoid over-reliance on algorithms.

Introduction

Mental health care is a critical yet underserved area of healthcare, with nearly 1 in 5 adults globally experiencing a mental health condition each year. Stigma, cost, and limited access to therapists often prevent people from seeking help. Mental health tech, particularly mobile apps, is breaking these barriers by offering accessible, scalable behavioral health solutions. This blog explores how developers are creating apps to revolutionize mental health care, making support available anytime, anywhere.

The Rise of Mental Health Apps in Mental Health Care

Mental health apps provide tools for managing conditions like anxiety, depression, PTSD, and stress. These apps range from guided meditation platforms like Calm to cognitive behavioral therapy (CBT)-based tools like Woebot. By leveraging smartphones, they deliver mental health care to users’ pockets, offering:

Self-help resources: Guided exercises, mood tracking, and coping strategies.

Therapist connections: Teletherapy platforms like BetterHelp link users with licensed professionals.

Community support: Forums or peer-support features foster connection.

Data-driven insights: Analytics to monitor progress, like mood trends over time.

In 2024, the global mental health app market was valued at over $4 billion, with millions downloading apps to access mental health care conveniently.

Why Mental Health Apps Matter

1. Expanding Access to Mental Health Care

Many people, especially in rural or underserved areas, lack access to therapists. Apps bridge this gap by providing 24/7 mental health care resources. For example, Talkspace connects users with therapists via text or video, eliminating the need for in-person visits.

2. Reducing Stigma

Apps offer discreet support, allowing users to seek mental health care privately. This is crucial for those hesitant to visit a clinic due to social stigma.

3. Affordability

Traditional therapy can cost $100-$200 per session. Apps like Headspace offer subscriptions for as low as $12.99/month, making mental health care more affordable.

4. Personalized Support

AI-powered apps analyze user inputs (e.g., journal entries) to tailor interventions. Woebot, for instance, uses chatbot technology to deliver CBT exercises based on a user’s emotional state.

5. Preventive Care

Apps promote daily mental wellness practices, like mindfulness or gratitude journaling, helping users build resilience before crises arise.

Key Considerations for Developing Mental Health Apps

Creating an effective mental health app requires careful planning to ensure usability, efficacy, and trust. Developers should focus on:

1. User-Centric Design

Intuitive UX: Simple navigation and calming visuals encourage regular use.

Accessibility: Features like text-to-speech or multilingual support cater to diverse users.

Engagement: Gamification, such as streaks for daily meditation, boosts retention.

2. Evidence-Based Content

Incorporate proven methodologies like CBT, dialectical behavior therapy (DBT), or mindfulness. Collaborate with psychologists to ensure interventions are clinically sound. For example, Sanvello offers CBT exercises validated by mental health experts.

3. Data Privacy and Security

Mental health data is sensitive. Apps must comply with HIPAA (in the U.S.) or GDPR (in Europe) to protect user information. Use end-to-end encryption and transparent privacy policies to build trust.
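As a minimal sketch of the “encrypt before it leaves the device” principle, the Python snippet below uses the third-party cryptography package (assumed installed via pip install cryptography). A full end-to-end design would also need key exchange and secure on-device key storage; this only shows that the server need ever see only ciphertext.

```python
# Client-side encryption of a journal entry using the "cryptography" package.
# Fernet is symmetric authenticated encryption; key management is out of scope.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, generated and stored on the device
cipher = Fernet(key)

entry = "Felt anxious before the exam; the breathing exercise helped."
token = cipher.encrypt(entry.encode())   # only this ciphertext is synced

assert cipher.decrypt(token).decode() == entry  # readable only with the key
```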

4. AI and Personalization

Integrate AI to analyze user behavior and customize content. For instance, an app could suggest breathing exercises if a user reports high anxiety. However, balance AI with human oversight to avoid over-reliance on algorithms.
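A minimal sketch of what such rule-based personalization with human oversight might look like, assuming self-reported 1-10 check-in scores; the thresholds and suggestions are illustrative assumptions, not clinical guidance.

```python
from dataclasses import dataclass

@dataclass
class CheckIn:
    mood: int      # self-reported, 1 (very low) to 10 (very good)
    anxiety: int   # self-reported, 1 (calm) to 10 (severe)

def suggest(entry: CheckIn) -> str:
    """Rule-based personalization with a human-oversight escape hatch."""
    if entry.anxiety >= 9 or entry.mood <= 2:
        # Severe scores are routed to a person, never left to the algorithm.
        return "escalate: flag for clinician review"
    if entry.anxiety >= 6:
        return "suggest: guided breathing exercise"
    if entry.mood <= 4:
        return "suggest: gratitude journaling prompt"
    return "suggest: keep up the daily check-ins"

print(suggest(CheckIn(mood=5, anxiety=7)))
# -> suggest: guided breathing exercise
```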

5. Integration with Healthcare Systems

Enable seamless data sharing with electronic health records (EHRs) or telehealth platforms to support coordinated mental health care. This ensures therapists can access app-generated insights, like mood logs.
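For illustration, a mood log shared with an EHR might be shaped like a FHIR Observation resource. The sketch below is a rough approximation: the field layout follows FHIR R4 conventions, but the plain-text code and patient ID are placeholders, and a production integration would need proper coding systems and consent handling.

```python
import json
from datetime import date

def mood_observation(patient_id: str, score: int, day: date) -> dict:
    # Shaped after a FHIR R4 Observation; values here are illustrative.
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": "Self-reported mood score (1-10)"},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": day.isoformat(),
        "valueInteger": score,
    }

print(json.dumps(mood_observation("example-123", 6, date(2025, 6, 1)), indent=2))
```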

6. Scalability

Design apps to handle growing user bases and diverse needs, from mild stress to severe depression. Cloud-based infrastructure and modular features support scalability.

Challenges in Mental Health App Development

Despite their potential, mental health apps face hurdles:

Efficacy Concerns: Not all apps are backed by rigorous research, leading to skepticism about their impact.

User Retention: Many users abandon apps after initial use. Engaging content and reminders are critical.

Regulatory Compliance: Navigating HIPAA or FDA regulations can be complex and costly.

Equity: Low-income users may lack access to smartphones or reliable internet, limiting reach.

Over-Reliance Risk: Apps should complement, not replace, professional mental health care.

Developers can address these by prioritizing research, user feedback, and partnerships with healthcare providers.

The Future of Mental Health Tech

The future of mental health apps is bright, with emerging trends shaping their evolution:

AI Advancements: More sophisticated AI will enable predictive analytics, like detecting early signs of a depressive episode (see the sketch after this list).

Wearable Integration: Combining apps with wearables (e.g., Fitbit) will track physiological data, like heart rate variability, to inform mental health care.

VR and AR: Virtual reality therapy apps could simulate calming environments or exposure therapy for phobias.

Policy Support: Governments are expanding telehealth reimbursement, boosting app adoption.

Global Reach: Localization for non-English languages and culturally relevant content will broaden access to mental health care.
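Returning to the “AI Advancements” item above, here is a toy Python sketch of one predictive idea: flag when a user’s recent average mood drifts well below their own baseline. The window size and threshold are made up for illustration and are not validated clinical signals.

```python
from statistics import mean

def early_warning(mood_history: list[int], window: int = 7) -> bool:
    """Flag a sustained drop in recent mood relative to the user's own baseline."""
    if len(mood_history) < 2 * window:
        return False  # not enough history to establish a baseline
    baseline = mean(mood_history[:-window])  # everything before the recent window
    recent = mean(mood_history[-window:])    # the most recent week
    return recent < baseline - 2  # sustained two-point drop on a 1-10 scale

scores = [7, 7, 6, 7, 8, 7, 7, 7, 6, 5, 4, 4, 4, 3, 4]
print(early_warning(scores))  # -> True
```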

Conclusion

Mental health tech, through innovative apps, is transforming how we approach behavioral health. By expanding access, reducing stigma, and offering personalized mental health care, these apps empower millions to manage their well-being. Developers play a pivotal role by creating user-friendly, evidence-based, and secure solutions. As technology and policies evolve, mental health apps will become integral to global mental health care, ensuring support is just a tap away.

Source: Community.nasscom.in

Mental Health Doesn’t Have to Cost a Lot. 4 Free Ways to Get Yours Back on Track

Therapy is a critical part of treatment for mental health concerns like depression or anxiety. Online therapy services like BetterHelp and Talkspace make it more affordable, at around $60 to $90 a session, but that is still out of budget for many people. Mental health apps like Moodfit and Sanvello, self-directed cognitive behavioral therapy strategies, staying connected to others, and mindfulness and meditation are four free ways to get your mental wellness back on track.

Therapy is a critical part of treatment for mental health concerns like depression or anxiety. It can also be expensive. Online therapy services like BetterHelp and Talkspace make it more affordable, at around $60 to $90 a session, but that’s still not in the budget for many people. So, what do you do if you can’t afford it?

Therapy will always be the gold standard, but if it’s temporarily out of reach, that doesn’t mean your mental health has to suffer. These simple wellness strategies can help you nourish your mental wellness without spending any money.


1. Use mental health apps to track daily progress

Mental health apps offer resources to people who otherwise couldn’t get them. While they’re not a substitute for therapy and can’t diagnose conditions, mental health apps like Moodfit and Sanvello are great tools to use on your mental wellness journey. The best mental health apps will help you relieve stress and anxiety and teach you how to manage symptoms in the future.

There’s a lot of variety in what these apps offer and the features that are built in. Many offer a great catalog of educational resources to help you learn about conditions and adapt coping strategies to manage them daily.

Mental health apps can also be a reminder to check in on yourself. Most send push notifications throughout the day, which can be used as an indicator to stop and assess how you’re feeling.


2. Implement cognitive behavioral therapy strategies on your own

Cognitive behavioral therapy is commonly used to treat depression, anxiety and addiction. CBT strategies and tools are intended to be taken outside of therapy sessions and used in daily life.

It’s called self-directed therapy. Again, it isn’t a replacement for traditional therapy with a professional, but it can supplement your mental health efforts when you don’t have access to talk therapy. This self-help strategy is best reserved for those with moderate symptoms that don’t affect daily tasks.

A systematic review of 33 studies found that self-help treatments can decrease anxiety and depression. Self-directed therapy results were “moderate,” according to the review. So people didn’t feel 100% better, but they reported feeling less anxious or depressed. If you’re interested in self-directed therapy strategies to improve your mental well-being, we recommend checking out the Association for Behavioral and Cognitive Therapies’ list of books. The books on the list have received a “seal of merit.”

Common self-directed therapy techniques:

Journaling: Writing down your thoughts and feelings and reflecting on them can help you identify negative thoughts and behavior patterns. Once you’re aware, you can take meaningful steps toward making changes.

Guided courses: With self-directed therapy, you have to start somewhere. Guided courses can help you learn methods and tactics for daily management. You can consult the National Alliance on Mental Illness for its mental health education directory.

Mental health apps: Many mental health apps use cognitive behavioral therapy techniques to reduce anxiety and help manage symptoms.


3. Stay connected to others

It’s important to connect with other people, especially those experiencing similar things. Studies show that connecting to others can provide a sense of meaning and purpose and decrease loneliness. Group therapy or support groups are typically led by a mental health professional or group leader and can be low-cost or free. Whether it be friends, family or strangers, sharing your feelings and experiences is essential.

You also can use the Substance Abuse and Mental Health Services Administration website to locate community resources near you.

Connections with people aren’t the only ones that can help improve your mental health. Pets and animals can reduce stress and anxiety levels. Take some intentional time to hang out with your pet — play with your dog, hug your cat. If you don’t have a pet, you can volunteer at a local animal shelter or humane society. Fostering or pet-sitting animals is also an option.

4. Practice mindfulness and meditation

Meditation has a history that stretches back thousands of years, but it’s become an extremely popular stress-relieving practice in the last few. Mindfulness helps you become more attuned to what you’re feeling and thinking, which helps you manage your thoughts and emotions more effectively, rather than becoming overwhelmed by them. Mindfulness uses techniques like meditation and breathwork to improve your mental health.

Mindfulness can help you manage symptoms of anxiety and other mental health disorders by helping you understand and cope with what you’re feeling. Studies show that meditation can help reduce stress, alleviate symptoms of depression or anxiety and help you sleep. The focus is on mind and body integration, which can help you enhance your mental well-being.

You can also use meditation apps to reduce stress and help maintain your mindfulness regimen. These free or low-cost apps are great for beginners.



When should I see a therapist?

Self-directed therapy and well-being tactics are extremely useful, but they’re not the be-all and end-all of mental health. Face-to-face time with a licensed therapist is essential for those with severe conditions and symptoms.

The first thing you should do is check your insurance. Employer-provided insurance and Medicaid may cover screenings, psychotherapy and counseling. Your insurance coverage will depend on your state and your health plan, but many plans include mental health coverage for in-network therapists.


Your finances shouldn’t stop you from getting the help you need. It may take some research into therapists and programs, but there are low-cost options.

Sliding scale payments: Some therapists offer sliding scale fees — you pay what you can afford. The cost will be based on your income. Not all therapists offer this, but many do.

Low-cost or free services: Some therapists offer low-cost or free counseling for individual and group sessions. If you live near a college or university, the graduate department may offer free or discounted therapy sessions.

Community health centers: Community mental health centers assist those in surrounding areas.

Local and online support groups: Local organizations and volunteers in many areas offer support groups for things like grief and addiction. Use Mental Health America’s list of support groups to find one that best fits your needs. You can participate in a peer-led support group through the National Alliance on Mental Illness (NAMI).


Source: Cnet.com

Q&A: Should pediatricians recommend therapy chatbots to patients?

A lot of research is still needed about use, outcomes and privacy. Most of these apps use large language models to respond to users in a human-like way, and children typically do not make decisions about their health and health care in the same way that a fully competent, capacitated adult does. There is an emerging evidence base, but the data are still mixed, says Bryanna Moore, PhD, a clinical ethicist at the University of Rochester in New York. “I feel like I learn about new platforms or new apps every week,” Moore says. “They have been developed in the adult setting, and are starting to be picked up and expanded into the pediatric realm.” The apps “should not be a replacement for other forms of mental health support,” she adds, and if a child is experiencing an acute mental health crisis, it is not the time to be relying on apps, many of which do not have safety features built into them. “It’s really important to be honest with patients and families about what we know and what we don’t know.”


Key takeaways:

Developers have created AI-based apps to treat adult and pediatric mental illness.

A lot of research is still needed about use, outcomes and privacy.

The use of AI in medicine is expanding to aid providers and patients with a variety of tasks, and one area that is growing rapidly is applications for treating mental illness.

“I feel like I learn about new platforms or new apps every week,” Bryanna Moore, PhD, a clinical ethicist and assistant professor in the department of health humanities and bioethics at the University of Rochester in New York, told Healio.

Most of these apps use large language models to respond to users in a human-like way, and many have been designed to utilize cognitive behavioral therapy-based tools, Moore said. Although some research has assessed the benefits and risks of using chatbots for mental health treatment, few studies have focused on pediatric populations.

“They have been developed in the adult setting, and are starting to be picked up and expanded into the pediatric realm,” but there is still so much to learn about them, Moore said.

We talked to Moore about how therapy chatbots work and what pediatricians should know before recommending them to patients.

Healio: How do therapy chatbots work? What are they designed to do?

Moore: They range from really simplistic bots — where each conversation is new, and it does not remember you over time — to more sophisticated apps that have a memory of you as a person and adapt their approach based on what they learn from you in particular.

What most of them have in common is that they rely on large language models — an algorithm that scoops up natural language and uses that to make predictions about a string of text and respond in a way that emulates a human therapist. You open the app, you share how you are feeling and it writes back in a way that either validates those feelings, helps you reframe a problem or uses other tools from cognitive behavioral therapy to help you move through whatever it is you are experiencing.
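As a rough sketch of the loop Moore describes, the snippet below wraps a placeholder model call with a CBT-flavored system prompt and a crisis check that hands off to human help rather than the bot. The call_llm function, prompt text, and crisis terms are all hypothetical stand-ins, not any real product’s API.

```python
# Everything here is a hypothetical stand-in: call_llm is a placeholder for
# whatever model API a real app would use.
SYSTEM_PROMPT = (
    "Respond like a CBT-informed coach: validate the feeling, then help the "
    "user reframe the thought or practice a coping skill."
)
CRISIS_TERMS = ("hurt myself", "suicide", "end my life")

def call_llm(system: str, user: str) -> str:
    # Placeholder response; a real app would call its model provider here.
    return "That sounds hard. What evidence supports or challenges that thought?"

def therapy_turn(user_message: str) -> str:
    if any(term in user_message.lower() for term in CRISIS_TERMS):
        # Safety feature: an acute crisis is handed to humans, not the bot.
        return "This sounds serious. Please contact a crisis line or a trusted adult now."
    return call_llm(SYSTEM_PROMPT, user_message)

print(therapy_turn("I keep thinking I'm going to fail everything."))
```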

Healio: Are there special considerations that need to be made when designing therapy bots for kids?

Moore: I work in both the pediatric and adult settings as a clinical ethicist, and working with kids is just different. They are a vulnerable population who are typically dependent on adults and/or society for providing basic goods that they need to survive and flourish. Children typically do not make decisions about their health and health care in the same way that a fully competent, capacitated adult does.

A really important consideration when thinking about, developing and using these apps with children is that their minds are still developing. They are still learning about themselves, their identity and coping. Childhood is such a broad umbrella, and kids are at all different stages of cognitive and social development. Having apps that are sensitive to that and responsive to different social and developmental needs and speeds is really important.

Healio: Are there situations where these apps could be helpful?

Moore: These technologies can be most helpful when there is a concrete skill that kids might need to practice, including social skills, coping skills or mindfulness. Most of the anecdotal evidence says they are most helpful when a person is having increased anxiety. They jump on and follow a guided exercise that helps them feel validated or has them practice their breathing. There is a sense of anonymity that can also be helpful particularly for adolescents or even younger children who are struggling with something but are not ready to talk to someone about it yet.

Healio: What are the downsides of using these apps?

Moore: They should not be a replacement for other forms of mental health support. This is the situation I worry about. There have been some reports, mostly in adult populations, of what happens when these bots go wrong — when there are hallucinations in the AI, or someone develops a strong overreliance on the technology that negatively impacts other aspects of their life.

If a child is experiencing an acute mental health crisis, it is absolutely not the time to be relying on apps, many of which do not have any safety features built into them. Most app developers and platforms are very clear that they are not intended to be used in those sorts of situations.

Healio: How should pediatric providers talk to patients and families about therapy bots?

Moore: My honest answer to this question is that I do not think we know enough about the evidence base — the benefits and harms — to have good conversations and good informed consent processes for these technologies. There is an emerging evidence base, but the data are still really mixed. One of the big things that we can do is be really honest with patients and families about what we know and what we don’t know.

Importantly, pediatric providers should speak with patients and parents about these apps in a way where they are framed as a supplement — not a replacement — for children seeking help and receiving support from others in their life.

Healio: Are there any concerns about privacy or confidentiality?

Moore: AI is the Wild West at the moment. We do not have regulations in place to ensure privacy, to clarify who is responsible for the data or how they will be stored. Some platforms and developers do a much better job of being clear about how they are trying to mitigate that risk. But for others, there is not a lot of information available. So yes, it absolutely is a concern.

There is some discussion of digital phenotyping in the existing literature and how the sensitive information that people might share if they are using these apps [will be used]. What if that is going to be sold or used for surveillance or other purposes, outside of the stated goals of the therapy bot? That is a huge ethical concern — one that we could potentially mitigate. But we just need more information, more transparency and clearer regulations around it.

Healio: Should pediatricians recommend therapy bots?

Moore: My honest answer for that is also I do not know. Apps like Alongside and Troodi are great examples. They have built-in safety features and are developed in a way that is very sensitive to the unique needs and interests of children and adolescents. I think there are some apps that pediatricians might consider using and recommending to children and families, but it needs to be on a case-by-case basis. It should not be, “Here is your app, off you go,” but more, “Let’s try this, see if it helps you with symptom management and other therapeutic goals, then come back together and discuss.”

There is absolutely a place to potentially recommend them, but we still need so much more research. We need to talk to children, adolescents and parents about what pressures they feel, how they think about using the apps and what would be most beneficial.

Healio: Anything else?

Moore: It is really important for us to talk about health disparities in this space. I think we need to develop [these apps] and recommend and utilize them with a real sensitivity to existing health disparities when it comes to mental health and their potential impact on those disparities. Maybe these apps could close the gap, but they could also potentially exacerbate it.

One of the worst things we could do is push people toward using the apps because they are a relatively low-wait-time, low-cost option without having a much stronger evidence base for both how that will impact individuals, but also how it will impact communities who are most likely to be pushed toward or away from using the apps.

For more information:

Bryanna Moore, PhD, can be reached at bryanna_moore@urmc.rochester.edu or on Bluesky at @bryannamore.bsky.social.

Source: Healio.com

Source: https://www.wilx.com/2025/06/08/ai-apps-are-supplemental-not-solutions-mental-health-support/
