The aim is to provide a computer vision solution to detect emotions in real-time and leverage IoT integration to trigger personalized responses.
The primary aim was to build an emotion-detection prototype that used IoT to distract users from doom-scrolling when it recognized a sad face. However, HU University rejected the idea, citing a lack of evidence that most people who doom-scroll have sad faces.
This prototype is built with p5.js and an Arduino and uses a servo motor to react to the detected emotion. For this project, I took many photos of my face in two states, sad and neutral, and used Teachable Machine to train a model with those two classes. p5.js provides a camera window that captures the user's facial expression. The Arduino was programmed to move a servo motor attached to a clock face, turning the clock hand. The hand pointing at number 1 indicates the neutral state, and number 2 indicates the sad state.
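The core of the pipeline is mapping a classifier label to a servo position. A minimal sketch of that mapping logic is below; the label names (`neutral`, `sad`) follow the two trained classes, while the angle values, the confidence threshold, and the helper name `labelToAngle` are illustrative assumptions, not the project's actual values.

```javascript
// Hypothetical mapping from a Teachable Machine label to a servo angle.
// The angles (pointing the clock hand at number 1 or number 2) are
// placeholder values chosen for illustration.
const ANGLES = { neutral: 30, sad: 60 };

function labelToAngle(label, confidence, threshold = 0.8) {
  // Only act on confident predictions; otherwise hold the current position.
  if (confidence < threshold || !(label in ANGLES)) return null;
  return ANGLES[label];
}

// In the p5.js sketch, this angle would be written to the Arduino over
// serial, and the Arduino would drive the servo to that position.
console.log(labelToAngle("sad", 0.95)); // → 60
```

Thresholding like this avoids the clock hand jittering between positions when the classifier is unsure.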
- Teachable Machine
- p5.js
- Arduino
- Servo motor