Emotion Recognition Based Personal Entertainment Robot Using ML & IP
Keywords:
Smart Emotion, Espeak and Pyttsx, Face API
Abstract
This project presents a method to automatically detect emotional duality and mixed emotional experience using a Linux-based system. Coordinates, distances, and movements of tracked points were used to build features from visual input capturing facial expressions, head and face gestures, and face movement. Spectral and prosodic features were also extracted from the recorded input. Espeak, Pyttsx, and the Face API were used to compute the features. A combined feature vector was created by feature-level fusion, and a cascade classifier was used for emotion detection. Live participants and actions are used to record simultaneous mixed emotional experiences. Based on the detected emotion, the system plays songs and displays a list of books.
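The pipeline above can be sketched in outline. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): geometric features are derived from a few tracked facial points, fused by concatenation into one combined vector (feature-level fusion), and a simple emotion-to-media mapping stands in for the cascade classifier's downstream action. All point values, emotion labels, and media names are invented for the example.

```python
import numpy as np

def geometric_features(points):
    """Pairwise distances between tracked facial points (hypothetical landmarks)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    return np.array([np.linalg.norm(pts[i] - pts[j])
                     for i in range(n) for j in range(i + 1, n)])

def fuse_features(*vectors):
    """Feature-level fusion: concatenate per-modality vectors into one vector."""
    return np.concatenate(vectors)

# Hypothetical mapping from a detected emotion to songs and a book list,
# standing in for the classifier output described in the abstract.
MEDIA = {
    "happy": (["upbeat_track.mp3"], ["Comedy Anthology"]),
    "sad":   (["calm_track.mp3"],   ["Uplifting Stories"]),
}

def recommend(emotion):
    """Return (songs, books) for the detected emotion; default to 'happy'."""
    return MEDIA.get(emotion, MEDIA["happy"])

# Example: 3 tracked points yield 3 pairwise distances, fused with a
# 2-D head-movement vector into a 5-dimensional combined feature vector.
visual = geometric_features([[0, 0], [3, 4], [0, 4]])
movement = np.array([0.5, -0.2])
combined = fuse_features(visual, movement)
```

In a real system the combined vector would be fed to the trained cascade classifier, whose predicted label then drives `recommend`.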
License
Copyright (c) IJSRCSEIT

This work is licensed under a Creative Commons Attribution 4.0 International License.