Using AI for Mental Wellness

❓ Can I use AI tools (like ChatGPT or mental health apps) for support? 

Yes—but with limits. AI can give you self-care ideas, mindfulness exercises, or journaling prompts. It can be a helpful supplement, but it is not a substitute for a counselor, therapist, or doctor. 

 ✅ Safe Ways to Use AI 

  • Use AI for grounding exercises (e.g., “give me 3 breathing techniques”). 
  • Ask for study tips or stress management strategies. 
  • Use AI to help you journal or reflect when you’re having a hard time putting feelings into words. 
  • Treat it like a wellness tool, similar to a meditation app—not like a therapist. 
     

 🚫 What NOT to Do with AI 

  • Do not rely on AI during a crisis (suicidal thoughts, self-harm urges, panic attack). AI is not equipped to help in emergencies. 
  • Do not share private medical or personal details. AI tools are not fully confidential, and anything you enter could be stored or used in ways you do not control. 
  • Do not expect AI to understand your full situation or provide accurate diagnoses. Avoid using AI as your only source of emotional support—it cannot replace human care. 
  • AI tools are designed to keep you engaged for as long as possible. Even when you start with a specific purpose, it is easy to lose track of that intention the longer you use them. If you are spending a lot of time using AI tools for mental health support, that is a clue it's time to check in with a professional therapist. 
  • AI interactions are always accessible and "frictionless": they avoid the awkwardness and discomfort that come with human connection. At the same time, humans are made for connection with other humans, and we don't want to lose those skills. Loneliness is one of the top things we hear students struggling with, and relying on AI alone can deepen that isolation. 
  • AI retains bias. Because it pulls from human-generated content, AI is not objective. Take in AI-generated content the way you would content from any human source: with a grain of salt. 
 

Do's and Don'ts

✅ Do This 

  • Use AI for coping strategies, journaling, or mindfulness
  • Check in with your counselor about what AI suggested 
  • Keep personal info private 
  • See AI as a tool, not a therapist  

🚫 Don’t Do This 

  • Use AI instead of professional counseling
  • Rely on AI in an emergency or crisis 
  • Overshare sensitive details you want to keep confidential, such as your health status or information about family members
  • Depend on AI for deep emotional support 

❓ What if I’m in crisis? 

If you ever feel unsafe, overwhelmed, or are thinking of hurting yourself: 

  • Call 988 (Suicide & Crisis Lifeline) in the U.S. 
  • Use campus emergency services or call 911 if there is immediate danger. 
  • Reach out to Student Health and Counseling Services at (510) 885-3735 for urgent support. After hours, this number can connect you to crisis phone counseling; choose option 2. 
     

🌱 Bottom Line 

AI can help you with small steps like breathing exercises or journaling prompts, but real healing and support come from human connection. Your counselor is here to listen, understand, and walk with you through challenges.