This repository contains the code and resources for an emotion detection project using the YOLOv5 model. In this project, we utilized the pre-trained YOLOv5 model from the Ultralytics repository to detect and classify emotions on human faces. The goal was to create a practical application for real-time emotion detection that can be deployed on mobile devices.
Emotion detection is an essential aspect of human-computer interaction, with applications ranging from user-experience enhancement to mental-health monitoring; this project builds a real-time emotion detection system around the YOLOv5 object detection model.
To train the emotion detection model, we labeled our own dataset with the open-source `labelme` annotation tool. The dataset contains images of human faces, each annotated with one of two classes, "Happy" or "Neutral," and we curated a diverse set of images to help the model recognize emotions accurately.
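As an illustration, a conversion step along these lines could turn the `labelme` JSON files into the YOLO label format that YOLOv5 expects (one `class x_center y_center width height` line per box, normalized to image size). The directory names and class order below are assumptions for the sketch, not fixed by this repository:

```python
import json
from pathlib import Path

CLASSES = ["Happy", "Neutral"]          # index in this list = YOLO class id (assumed order)

def convert(json_path: Path, out_dir: Path) -> None:
    """Convert one labelme JSON file into a YOLO-format .txt label file."""
    ann = json.loads(json_path.read_text())
    w, h = ann["imageWidth"], ann["imageHeight"]
    lines = []
    for shape in ann["shapes"]:
        if shape["shape_type"] != "rectangle":
            continue                     # we only expect rectangle face boxes
        (x1, y1), (x2, y2) = shape["points"]
        cx, cy = (x1 + x2) / 2 / w, (y1 + y2) / 2 / h   # normalized box center
        bw, bh = abs(x2 - x1) / w, abs(y2 - y1) / h      # normalized box size
        cls = CLASSES.index(shape["label"])
        lines.append(f"{cls} {cx:.6f} {cy:.6f} {bw:.6f} {bh:.6f}")
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / json_path.with_suffix(".txt").name).write_text("\n".join(lines) + "\n")

# Hypothetical layout: labelme JSONs in annotations/, YOLO labels written to labels/
for jp in Path("annotations").glob("*.json"):
    convert(jp, Path("labels"))
```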
We fine-tuned the pre-trained YOLOv5 model on the labeled emotion dataset until it distinguished "Happy" from "Neutral" expressions with high accuracy. The training configuration and process are documented in this repository.
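For reference, a fine-tuning run could be launched from inside a clone of the `ultralytics/yolov5` repository roughly as follows; the dataset config name, run name, and hyperparameters here are illustrative, not the exact values we used:

```python
# Minimal fine-tuning sketch; must be run from a clone of ultralytics/yolov5,
# where train.py lives. "emotion.yaml" is an assumed dataset config, e.g.:
#   train: ../datasets/emotion/images/train
#   val:   ../datasets/emotion/images/val
#   nc: 2
#   names: ["Happy", "Neutral"]
import train  # yolov5/train.py

train.run(
    data="emotion.yaml",
    weights="yolov5s.pt",      # COCO-pretrained checkpoint as the starting point
    imgsz=640,
    epochs=100,
    batch_size=16,
    name="emotion_yolov5s",    # results go to runs/train/emotion_yolov5s/
)
```

The equivalent CLI form is `python train.py --img 640 --batch 16 --epochs 100 --data emotion.yaml --weights yolov5s.pt --name emotion_yolov5s`.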
The trained YOLOv5 model was deployed on an Android phone, demonstrating a practical application of object detection for real-time emotion recognition. We developed a custom Android application that runs the exported model to detect and classify emotions in real time.
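Before the model can run on a phone, the PyTorch weights have to be exported to a mobile-friendly format; YOLOv5's `export.py` supports several targets, including TensorFlow Lite. A sketch of that step, with an assumed weights path taken from the training run above, might look like:

```python
# Export sketch; run from inside a clone of ultralytics/yolov5, where
# export.py lives. The weights path is an assumption from our example run.
import export  # yolov5/export.py

export.run(
    weights="runs/train/emotion_yolov5s/weights/best.pt",
    include=("tflite",),   # TensorFlow Lite model for the Android app
    imgsz=(640, 640),
)
# Equivalent CLI: python export.py --weights <best.pt> --include tflite
```

The resulting `.tflite` file is what the Android application bundles and runs on each camera frame.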