AI · Healthcare

AI Mental Health Chatbots Face New Regulation After Teenager Suicide Linked to AI Companion

By Health Desk · April 14, 2026 · 5 min read

Regulators in the US and EU are preparing new rules for AI mental health chatbots following the death of a 14-year-old who was reportedly using an AI companion for emotional support.

The tragedy has prompted urgent regulatory action. In the US, the FTC has opened an investigation into AI companion companies, while the EU has classified emotional AI chatbots as "high-risk" systems under the AI Act.

The AI companion market has grown rapidly, reaching $2.3 billion in 2026. Companies such as Replika, Character.ai, and several newcomers offer AI companions that provide emotional support, conversation, and companionship.

Mental health professionals are divided. Some see AI companions as a useful supplement to therapy, especially for people who cannot access human therapists. Others worry that the technology fosters unhealthy emotional dependencies and may delay users from seeking appropriate professional help.

The Sovereign Post