Facial Behavior-Based Depression Detection with Bi-LSTM and Evaluation via Universal and Hybrid Mode
Students & Supervisors
Student Authors
Supervisors
Abstract
Depression is a significant mental health disorder affecting millions globally, necessitating early and objective detection for timely intervention. Traditional diagnostic methods rely on subjective self-reports, leading to potential biases and delayed assessments. In this study, we propose an automated depression detection approach that analyzes facial behavior and head gestures using both Hybrid Learning Models (HLSTM, HXGBoost, HSVM, HRandom) and a Universal Learning Model (ULSTM). These models capture both spatial and temporal dependencies in facial expressions and head movements to enhance classification accuracy. The data for this research is extracted from FacePsy, an open-source affective mobile sensing system that supplies real-world behavioral data. Experimental results show that the models achieved good classification performance, with the Hybrid LSTM model providing the best balance between precision and recall. An AUC-ROC score of 0.89 reaffirms the effectiveness of these models. The results highlight the promise of combining deep learning-based facial analysis and head movement for non-intrusive, in-the-moment assessment of depression. This study contributes to affective computing and AI-based mental health diagnostics, in the interest of accessible, automated mental health evaluation.
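The abstract reports an AUC-ROC of 0.89 as the headline evaluation metric. As an illustrative aside (not the authors' code), AUC-ROC for a binary depressed/non-depressed classifier can be computed via the Mann-Whitney U statistic: the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one. A minimal pure-Python sketch, with hypothetical toy labels and scores:

```python
def auc_roc(labels, scores):
    """AUC-ROC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive gets a higher score than a random negative,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative label")
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Toy example: 1 = depressed, 0 = non-depressed; scores are model outputs.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(round(auc_roc(labels, scores), 3))  # prints 0.889
```

An AUC of 1.0 would mean the classifier ranks every positive above every negative; 0.5 is chance-level ranking, so the reported 0.89 indicates strong separability between the two classes.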
Keywords
Publication Details
- Type of Publication: Conference
- Conference Name: The 7th International Conference on Activity and Behavior Computing (ABC 2025)
- Date of Conference: 21/04/2025 - 21/04/2025
- Venue: Khalifa University, Abu Dhabi, UAE
- Organizer: Khalifa University, Abu Dhabi, UAE