Detecting Depression Early: How AI Reads Your Mood Before You Realize It

Image Credit: Abbat | Unsplash

Depression affects nearly 300 million people worldwide, posing significant challenges for timely detection and intervention. Traditional methods often rely on self-reporting, which can be unreliable, as many sufferers hesitate to disclose their struggles. Advances in artificial intelligence, however, are paving the way for solutions that harness the ubiquity of smartphones to identify depressive symptoms non-invasively.

The Challenge of Detecting Depression

Depression remains one of the most prevalent mental health issues globally, impacting approximately 4% of the population. Its subtle manifestations and the stigma surrounding mental health often hinder accurate diagnosis and timely support. Conventional detection methods typically involve clinical assessments and self-reporting, which can miss silent sufferers who do not openly communicate their emotional state.

AI Innovations from Stevens Professor Sang Won Bae

Stevens Institute of Technology is at the forefront of addressing this challenge through the pioneering work of Professor Sang Won Bae. Collaborating with doctoral candidate Rahul Islam, Professor Bae is developing sophisticated AI-driven smartphone applications designed to monitor and detect signs of depression by analyzing users’ physiological and behavioral data captured via their devices.

PupilSense: Tracking Eye Movements for Mood Insights

One of the flagship projects, PupilSense, focuses on monitoring the user’s pupil movements. By continuously capturing high-resolution snapshots of the user’s eyes during routine smartphone interactions, PupilSense measures pupil diameter and analyzes pupillary reflexes—biomarkers that have been linked to depressive episodes in extensive research. In initial trials involving 25 participants over four weeks, PupilSense processed around 16,000 phone interactions. The AI was trained to distinguish between normal and atypical pupil responses, achieving a 76% accuracy rate in identifying moments when users felt depressed. This performance surpasses existing smartphone-based depression detection systems, highlighting PupilSense’s potential as a reliable screening tool.
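
To make the classification step concrete, here is a minimal sketch of how a pupil-based mood classifier might be wired up. It is not the PupilSense implementation: the features (mean pupil diameter, diameter variability, reflex latency), the synthetic data, and the choice of logistic regression are all assumptions for illustration.

```python
# Illustrative sketch only -- NOT the PupilSense implementation.
# Assumes per-interaction pupil features have already been extracted
# from eye snapshots; feature names and data here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per phone interaction.
n = 1000
X = np.column_stack([
    rng.normal(4.0, 0.6, n),   # mean pupil diameter (mm)
    rng.normal(0.3, 0.1, n),   # diameter variability
    rng.normal(250, 40, n),    # light-reflex latency (ms)
])
# Hypothetical labels: 1 = self-reported depressed moment, 0 = not.
y = rng.integers(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
print(f"Accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

In the real system, the hard part is reliably extracting such features from eye snapshots under varying lighting and phone angles, which this sketch simply takes as given.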

FacePsy: Decoding Facial Expressions for Emotional States

Complementing PupilSense, the FacePsy system analyzes facial expressions to infer emotional well-being. By discreetly tracking facial landmarks, eye movements, and head gestures during smartphone use, FacePsy identifies patterns indicative of depression. Notably, the system observed that increased smiling could paradoxically signal underlying depressive moods, possibly reflecting a “brave face” façade. Early findings also revealed that reduced facial movement in the mornings, along with behaviors such as yawning, correlated with higher depressive symptoms. Additionally, more frequent eye-opening behaviors in the morning and evening were linked to potential depression, suggesting that outward signs of alertness might mask internal struggles.
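
The time-of-day patterns described above amount to aggregating per-frame observations into behavioral features. The sketch below shows one plausible way to do that with pandas; the log format, column names, and movement/smile scores are hypothetical, not FacePsy’s actual data schema.

```python
# Illustrative sketch only -- NOT the FacePsy implementation.
# Aggregates hypothetical per-frame facial-landmark logs into
# time-of-day behavioral features like those the article describes.
import pandas as pd

# Hypothetical log: one row per captured frame during phone use.
frames = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-05-01 08:10", "2024-05-01 08:15",
        "2024-05-01 20:30", "2024-05-02 08:05",
    ]),
    "movement_score": [0.12, 0.09, 0.35, 0.10],  # landmark displacement
    "smiling": [1, 0, 1, 1],                     # smile detected in frame
})

frames["period"] = frames["timestamp"].dt.hour.map(
    lambda h: "morning" if h < 12 else "evening")

# Reduced morning facial movement and elevated smiling rates are the
# kinds of aggregate signals a study could correlate with symptom scores.
features = frames.groupby("period").agg(
    mean_movement=("movement_score", "mean"),
    smile_rate=("smiling", "mean"),
)
print(features)
```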

Promising Results and Future Directions

The combined efforts of PupilSense and FacePsy demonstrate significant promise in leveraging everyday technology for mental health monitoring. The FacePsy pilot study reported an Area Under the Receiver Operating Characteristic curve (AUROC) of 0.81, indicating a strong ability to distinguish depressive episodes from non-depressive ones. Moreover, the regression model predicting PHQ-9 scores, a standard measure of depression severity scored from 0 to 27, achieved a Mean Absolute Error of 3.08, showcasing the system’s potential for reliable mood prediction.
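
Both reported figures are standard evaluation metrics, and it may help to see how they are computed. The sketch below uses scikit-learn on invented toy values (not the study’s data) to show the calculation of AUROC for episode detection and Mean Absolute Error for PHQ-9 regression.

```python
# How the two reported metrics are typically computed (illustrative
# toy values only; these are not the study's data).
from sklearn.metrics import roc_auc_score, mean_absolute_error

# Binary depressive-episode detection: true labels vs. predicted scores.
y_true = [0, 0, 1, 1, 0, 1]
y_score = [0.2, 0.4, 0.7, 0.9, 0.3, 0.6]
print(f"AUROC: {roc_auc_score(y_true, y_score):.2f}")

# PHQ-9 severity regression (scores range 0-27): true vs. predicted.
phq9_true = [5, 12, 18, 3, 9]
phq9_pred = [7, 10, 15, 4, 12]
print(f"MAE: {mean_absolute_error(phq9_true, phq9_pred):.2f}")
```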

Open-Source Accessibility and Ongoing Development

Emphasizing accessibility and collaboration, Professor Bae has made PupilSense available as an open-source project on GitHub, inviting researchers and developers to contribute to its refinement. The FacePsy findings are set to be presented at the upcoming ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia, underscoring the academic and practical significance of this research. Looking ahead, Professor Bae and her team aim to enhance these technologies, expanding their capabilities toward real-time mental health support and just-in-time interventions. Integrating these AI-powered tools into the smartphones that millions rely on daily offers a promising pathway to more proactive and personalized mental health care.

Source: Neuroscience

TheDayAfterAI News

We are your source for AI news and insights. Join us as we explore the future of AI and its impact on humanity, offering thoughtful analysis and fostering community dialogue.

https://thedayafterai.com