Emotion Recognition using Facial Expressions

Authors

  • Udalaya, Department of Computer Science and Engineering, Krishna Engineering College, Ghaziabad, Uttar Pradesh, India
  • Saket Gupta, Department of Computer Science and Engineering, Krishna Engineering College, Ghaziabad, Uttar Pradesh, India
  • Nikhil, Department of Computer Science and Engineering, Krishna Engineering College, Ghaziabad, Uttar Pradesh, India
  • Tushar Chauhan, Department of Computer Science and Engineering, Krishna Engineering College, Ghaziabad, Uttar Pradesh, India
  • Ms. Shipra Gautam, Department of Computer Science and Engineering, Krishna Engineering College, Ghaziabad, Uttar Pradesh, India

Keywords:

Emotion recognition, speech, vision, PCA, SVC, decision-level fusion, feature-level fusion, affective states, human-computer interaction (HCI).

Abstract

The interaction between human beings and computers will be more natural if computers are able to perceive and respond to human non-verbal communication such as emotions. Although several approaches have been proposed to recognize human emotions from facial expressions or speech, relatively limited work has been done to fuse these two, and other, modalities to improve the accuracy and robustness of the emotion recognition system. This paper analyses the strengths and limitations of systems based only on facial expressions or on acoustic information. It also discusses two approaches for fusing these two modalities: decision-level and feature-level integration. Using a database recorded from an actress, four emotions were classified: sadness, anger, happiness, and the neutral state. Markers placed on her face allowed detailed facial motions to be captured with a motion-capture system, in conjunction with simultaneous speech recordings. The results reveal that the system based on facial expressions performed better than the system based on acoustic information alone for the emotions considered. The results also show the complementarity of the two modalities: when they are fused, the performance and robustness of the emotion recognition system improve measurably.
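
As a minimal illustration of the two fusion strategies discussed above, the sketch below contrasts feature-level fusion (concatenating the facial and acoustic feature vectors and training a single classifier) with decision-level fusion (training one classifier per modality and combining their class posteriors), using PCA for dimensionality reduction and a support vector classifier (SVC) as the keywords suggest. This is not the authors' implementation: the feature matrices are random placeholders standing in for motion-capture and acoustic features, and the label encoding, feature dimensions, and combination rule are assumptions made for the example.

# Minimal sketch of feature-level vs. decision-level fusion with PCA + SVC.
# Placeholder data only; not the paper's actual feature sets or pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
X_face = rng.normal(size=(n, 60))    # stand-in for facial motion-capture features
X_speech = rng.normal(size=(n, 20))  # stand-in for acoustic/prosodic features
y = rng.integers(0, 4, size=n)       # 0=neutral, 1=sadness, 2=anger, 3=happiness

idx_train, idx_test = train_test_split(np.arange(n), test_size=0.25, random_state=0)

def svc_pipeline():
    # Standardize, reduce dimensionality with PCA, classify with an RBF-kernel SVC.
    return make_pipeline(StandardScaler(), PCA(n_components=10),
                         SVC(kernel="rbf", probability=True))

# Feature-level fusion: concatenate both feature vectors, train one classifier.
X_fused = np.hstack([X_face, X_speech])
clf_fused = svc_pipeline().fit(X_fused[idx_train], y[idx_train])
acc_feature = accuracy_score(y[idx_test], clf_fused.predict(X_fused[idx_test]))

# Decision-level fusion: train one classifier per modality and combine
# their class posteriors (here, by averaging the predicted probabilities).
clf_face = svc_pipeline().fit(X_face[idx_train], y[idx_train])
clf_speech = svc_pipeline().fit(X_speech[idx_train], y[idx_train])
proba = (clf_face.predict_proba(X_face[idx_test]) +
         clf_speech.predict_proba(X_speech[idx_test])) / 2
acc_decision = accuracy_score(y[idx_test], proba.argmax(axis=1))

print(f"feature-level fusion accuracy: {acc_feature:.2f}")
print(f"decision-level fusion accuracy: {acc_decision:.2f}")

Averaging the per-modality class probabilities is only one possible decision-level rule; product, weighted-sum, and majority-vote combinations are common alternatives.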

Published

2021-04-30

Section

Research Articles

How to Cite

[1]
Udalaya, Saket Gupta, Nikhil, Tushar Chauhan, Ms. Shipra Gautam, "Emotion Recognition using Facial Expressions", International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN: 2456-3307, Volume 7, Issue 2, pp. 172-183, March-April 2021.