
Science | Europe

AI Chatbots and Mental Health: A New Medical Study Says Doctors Need to Start Asking Their Patients a New Question

4 min read | By EuroBulletin24 briefing
Photo: Markus Winkler / Pexels


The Question That Doctors Aren't Asking

When a patient comes to a doctor or psychiatrist for a mental health assessment, the intake process typically includes questions about medications, prior diagnoses, family history, substance use, and current stressors. A new paper published in JAMA Psychiatry in April 2026 argues that there is a question missing from this standard set: are you using AI chatbots as a form of mental health support, and if so, how?

The paper's argument is not that AI chatbots are inherently harmful to people experiencing mental health challenges — the evidence on that question is genuinely mixed and context-dependent. Rather, the argument is that a significant and growing portion of people with mental health conditions are using AI chatbots in ways that are relevant to their clinical care, and that clinicians who are not asking about this use are operating with an incomplete picture of their patients' support systems, coping strategies, and information sources.

The behaviors documented in the research involve patients using large language model AI assistants to discuss symptoms, process emotional experiences, research diagnostic criteria, evaluate medication options, and in some cases to decide whether to seek professional help at all. Each of these behaviors has clinical relevance: a patient who has discussed symptoms extensively with an AI chatbot may arrive at an appointment with preconceptions about their diagnosis that shape how they describe their experiences; a patient who relies on AI conversations as their primary emotional support may have different needs than one with an extensive human support network; and a patient who has researched medication options through AI may bring expectations or concerns that are best addressed directly in the clinical conversation.

What the Research Found About How People Are Using AI for Mental Health

The JAMA Psychiatry paper draws on survey and observational data to characterize the specific ways in which people with diagnosed mental health conditions are integrating AI chatbot use into their daily support routines. The findings suggest that the behavior is more prevalent than clinical practice has acknowledged, and that it takes forms that are clinically significant rather than trivially peripheral.

A substantial minority of respondents with diagnosed anxiety disorders reported using AI chatbots to manage acute anxiety episodes — describing their symptoms to the AI and using the responses to contextualize and regulate their distress. Whether this use is beneficial or harmful appears to depend heavily on the specific nature of the AI's responses: chatbots that respond to anxiety with grounding techniques and perspective-offering appear to function as a useful bridge between episodes and clinical care, while chatbots that engage with anxiety content in ways that elaborate or amplify it appear to worsen outcomes.

The research also identified a category of use that raises specific concerns: individuals who had not yet sought professional diagnosis or treatment and who were using AI chatbots as a substitute for clinical assessment rather than a supplement to it. The problem with this pattern is not that people are seeking information and support — both are reasonable responses to mental health challenges — but that AI chatbots are not equipped to perform clinical assessment, cannot detect the diagnostic nuances that require trained clinical observation, and cannot provide the structured therapeutic interventions whose efficacy is supported by clinical evidence.

What Doctors Are Being Asked to Do Differently

The paper's recommendations for clinical practice are specific and actionable. Physicians and mental health professionals should add AI chatbot use to standard intake and ongoing care assessments, treating it as a clinically relevant behavior alongside other self-help strategies, information-seeking patterns, and support system characteristics. The specific questions recommended address both the fact of use (are you using AI chatbots in connection with your mental health?) and the character of use (what kinds of conversations are you having, how often, and what role do they play in how you manage your condition?).

This information serves several clinical purposes. It helps clinicians understand the information environment their patients are operating in, including potential sources of misinformation that may have shaped their understanding of their condition. It identifies patients who are using AI as a primary support mechanism in ways that may reflect gaps in their human support networks or barriers to accessing professional care. And it provides an opportunity for clinicians to guide patients toward more effective uses of available AI tools and away from patterns of use that appear to worsen outcomes.

The broader policy implication of the paper — that healthcare systems should develop specific guidelines for clinician-patient conversations about AI use in mental health contexts — is one that regulatory bodies and professional medical associations are not yet equipped to act on but will need to address as AI capability continues to expand.
