OpenAI updates ChatGPT to better support user mental health

Words by Team MyndStories

Published August 26, 2025 · 2 min read

As more people turn to artificial intelligence for emotional support, OpenAI has released significant updates to ChatGPT aimed at addressing mental health concerns and prioritizing user safety. These updates are designed to make the chatbot less likely to foster unhealthy dependencies or provide inappropriate advice, and are informed by consultations with clinicians and technology experts.

Key improvements in ChatGPT’s mental health support

  • ChatGPT now detects possible signs of emotional stress or mental health struggles in user conversations. It responds by suggesting evidence-based resources instead of trying to resolve crises directly.
  • The chatbot includes gentle reminders for users to take breaks when conversations are lengthy or emotionally intense. This is to promote healthy digital habits and discourage overuse.
  • ChatGPT has stopped giving direct, decisive advice on personal or sensitive topics. Instead, it asks open-ended questions and offers balanced perspectives, supporting thoughtful decision-making rather than pushing users toward a single choice.
  • OpenAI has reversed earlier changes that made the chatbot overly agreeable, after feedback showed this could reinforce harmful beliefs. Ongoing adjustments now prioritize ethical AI use and user well-being.
  • Advisory groups, made up of psychiatrists, youth mental health professionals, and human-computer interaction experts, review system behavior and help guide further improvements.

Why these updates matter

More people are looking to AI for advice on sensitive topics, from anxiety and loneliness to relationship breakups. Extended conversations with AI chatbots have even been linked to what some observers call "AI psychosis," a phenomenon in which users lose touch with reality. Experts have cautioned that while chatbots can help people through tough times, they may also miss signs of acute distress or offer guidance that is overly simplistic or even harmful.

The new updates reinforce the point that AI should offer support and resources but not replace human professionals when it comes to mental health.

OpenAI stresses that ChatGPT is not a substitute for therapy or counseling. Instead, the focus is on giving users a tool that is safe, responsible, and able to connect people to appropriate resources when needed.

Looking ahead

OpenAI plans to continue improving ChatGPT’s safeguards, drawing on input from users and experts. With more than 700 million people now using ChatGPT each week, these changes reflect the need for responsible AI that helps people without replacing the value of real human support.