Leveraging AI to Support Student Mental Health and Well-Being

By Armando Montero

October 16, 2024


The latest report on student mental health from Healthy Minds reveals encouraging data but also underscores the complex nature of this persistent problem and the ongoing need for comprehensive support.

The traditional campus resources—counseling centers and wellness programs—are stretched thin, and higher education leaders are beginning to explore innovative solutions to provide this critical support. While there’s no easy fix, the emergence of artificial intelligence (AI) is giving colleges new options to consider.

AI can’t replace human care, but it can help. AI tools are already being used to connect students with resources, manage everyday stresses, and even flag potential warning signs before a student is in crisis. In this post, we’ll look at how campuses are using AI for mental health and wellness, where it’s making a difference, and what challenges still need to be addressed.

AI on campus

With new advancements in natural language processing, one of the most common uses of AI on college campuses has been chatbots.

As a preventative measure, chatbots can disseminate information and check in with students regularly, easing the stress of the transition to college and of everyday student life. Arizona State University’s tool Hey Sunny is a chatbot that helps students adjust to college life; it also lets staff and other students see what questions are being asked so they can stay informed. Because students can ask about classes, housing, budgeting, mental health, and more, tools such as this can become a regular guide for students and help build a culture of positive mental health behaviors and habits.

Chatbots can also be trained to specifically target mental health and wellness concerns, including early warning signs and other less severe symptoms. This preventative approach allows counseling centers to focus more resources and time toward those students with moderate to severe symptoms and immediate needs.

One example of a chatbot that uses this approach is Wysa, an AI-driven app that provides immediate support for students through clinically validated, conversational AI that evaluates chat responses to offer initial steps of care, with human coaching available. Breathhh is an AI-powered Chrome extension that monitors and analyzes web activity and online behaviors to surface mental health exercises at appropriate moments. Similarly, researchers at Stanford University have developed a tool named Woebot that offers cognitive behavioral therapy techniques through conversation.

As research demonstrates the importance of personalized, individualized treatment plans, AI could enable institutions and their respective departments to tailor treatment to students’ unique characteristics. In the future, by monitoring and analyzing student data, AI could connect students with self-help tools and apps that track their progress, adapting specific interventions accordingly.

Additionally, compliance with treatment plans and regular updates to intervention services are difficult for human professionals to monitor. Whether by providing reminders to take prescribed medication or to follow up on specific tasks and prompts, AI can offer a level of individual support that current intervention services cannot. In the future, AI may be able to track and predict behaviors that signal likely noncompliance, particularly in high-stress environments, and pass that information along to counseling centers or case managers, who can then follow up with students and help them build routines to avoid these barriers.

Looking ahead, AI may play an even bigger role in intervention and treatment. A comprehensive 2019 review analyzed studies in which AI was used to interpret different data sources and concluded that machine learning algorithms could effectively predict and classify mental health conditions (e.g., suicidal ideation, depression, schizophrenia) with high accuracy and could predict suicide risk with 80 percent accuracy. The data sources reviewed included electronic health records, smartphone and video monitoring data, and social media information, all of which college-aged students use far more frequently than previous generations did.

Concerns and considerations

Despite the many potentially positive uses of AI to support student mental health, there are also some concerns.

Accelerated demand. With current and incoming cohorts of college students demonstrating increased need for mental health services, institutions should keep in mind the capacity of existing personnel and resources. Integrating AI tools, particularly those such as chatbots that reach many students, is likely to increase the number of students who seek out human professionals to verify or continue conversations started with those tools. These tools can reach a larger audience, but they still require human oversight and monitoring, which demands additional staff time.

Root cause determinations. Institutional leaders should use caution around how and when AI-powered tools are rolled out on campus to ensure they are not received as, or expected to be, a one-off solution to mental health challenges among the student population. Mental health challenges are complex, nuanced, and dependent on each individual’s circumstances; AI cannot identify the root causes of mental health issues. Moreover, these root causes may shift with new generations of students, as each cohort faces a unique set of experiences and social contexts.

Privacy and data concerns. One of the most prevalent concerns about AI-powered tools on college campuses involves student privacy and data usage. Institutions must take steps to ensure data privacy and protect sensitive health-related information. This effort does not stop at compliance; it must include building intentional, robust safeguards.

Bias in decision-making. A concern increasingly raised by professionals at all levels is AI’s susceptibility to bias based on the data and information it is given. Bias in these tools can lead to unfair or ineffective outcomes, particularly when dealing with sensitive issues such as mental health. This concern can be mitigated by ensuring that the data collected represents the student population each institution serves and by closely examining training data for potential bias at the outset. Additionally, institutions must ensure that these algorithms are sensitive to cultural differences in emotional expression and avoid making assumptions based on a narrow or dominant cultural lens.

Artificial intelligence has the potential to be a powerful tool to help institutions address rising mental health concerns among college students. As institutions consider ways to leverage AI, they must move beyond historic one-off approaches and build resources that are integrated into the college experience of each student. This effort includes addressing concerns around data privacy, transparency, and bias, as well as anticipating changes in demand for mental health services. As AI evolves, institutions of higher education can continue to find innovative ways to meet the needs of students and chip away at their rising mental health challenges.


If you have any questions or comments about this blog post, please contact us.

About the Author

Armando Montero
