The news: Lyra Health is debuting a generative AI (genAI) chatbot for mild to moderate mental health challenges like burnout, sleep issues, or stress. The Lyra AI "clinical-grade" chatbot includes a risk-flagging system to identify mental health situations that need immediate attention and connect users to a 24/7 care team.
How we got here: The rise of AI chatbot use, particularly among Gen Z and Gen Alpha consumers, has exposed significant risks of psychological harm.
This rapid adoption is now linked to multiple lawsuits filed against AI companies concerning psychological injuries and even suicides involving children and teens.
Why it matters: Budget-constrained and tech-savvy consumers are already using genAI chatbots for mental wellness and ad hoc therapy.
Our take: AI chatbots for mental health are expected to expand as rising demand for therapy services continues to limit affordable access. Lyra’s rollout shows how AI therapy could be deployed responsibly: limiting chatbot use to lower-risk mental health issues and emphasizing safety.
In contrast, general-purpose chatbots repurposed for mental health support, which the American Psychological Association warns against, highlight the risks of unregulated AI therapy.
To ensure AI-powered mental health tools build trust, health tech marketers should: