Emotional Intelligence in Voice Assistants: Advancing Human-AI Interaction

Authors

  • Ankur Binwal, Indiana University, USA

DOI:

https://doi.org/10.32628/CSEIT241051039

Keywords:

Emotional Intelligence, Voice Assistants, Human-AI Interaction, Speech Emotion Recognition, Ethical Considerations

Abstract

This article explores the integration of emotional intelligence (EI) into AI voice assistants, examining techniques for emotion recognition from speech, adaptive response generation, and the impact on user experience. It discusses key components including acoustic feature analysis, machine learning approaches, and multimodal systems for emotion detection. The article also addresses ethical considerations such as privacy concerns, potential manipulation, and technical challenges in achieving robust, real-time, and culturally-adaptive EI systems. Applications in mental health support and customer service are examined, highlighting both the potential benefits and necessary precautions. Through a comprehensive review of recent research, this work aims to contribute to the responsible development of emotionally intelligent AI voice assistants that enhance human-AI interaction while prioritizing user well-being.
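To make the acoustic-feature pipeline referenced in the abstract concrete, the sketch below shows one common baseline for speech emotion recognition: summarizing an utterance's MFCC trajectory and training a conventional classifier on the resulting vectors. This is a minimal illustration, not any system reviewed in the article; it assumes the librosa and scikit-learn libraries, and the file names, labels, and hyperparameters are placeholders.

    # Minimal sketch of acoustic-feature-based speech emotion recognition.
    # Assumes librosa and scikit-learn; file names and labels are placeholders.
    import numpy as np
    import librosa
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def utterance_features(wav_path, sr=16000, n_mfcc=13):
        """Summarize an utterance as the mean and std of its MFCC trajectory."""
        audio, sr = librosa.load(wav_path, sr=sr)
        mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)     # shape: (n_mfcc, frames)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])   # shape: (2 * n_mfcc,)

    # Hypothetical labelled corpus of (recording, emotion) pairs.
    corpus = [("happy_01.wav", "happy"), ("angry_01.wav", "angry"), ("neutral_01.wav", "neutral")]

    X = np.vstack([utterance_features(path) for path, _ in corpus])
    y = [label for _, label in corpus]

    # Conventional classifier over the per-utterance feature vectors.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(X, y)

    # Classify a new (placeholder) recording.
    print(model.predict(utterance_features("new_utterance.wav").reshape(1, -1)))

The deep-learning and multimodal systems surveyed in the article replace this hand-crafted pipeline with learned representations, but the same extract-summarize-classify structure is the usual starting point.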

References

MarketsandMarkets, "Conversational AI Market by Component (Platform and Services), Type (IVA and Chatbots), Technology (ML and Deep Learning, NLP, ASR), Application, Deployment Mode (Cloud and On-premises), Vertical, and Region - Global Forecast to 2026," MarketsandMarkets, Jun. 2021. [Online]. Available: https://www.marketsandmarkets.com/Market-Reports/conversational-ai-market-49043506.html

A. Niculescu, H. van Dijk, A. Nijholt, H. Li, and S. L. See, "Making Social Robots More Attractive: The Effects of Voice Pitch, Humor and Empathy," International Journal of Social Robotics, vol. 5, pp. 171–191, 2013. [Online]. Available: https://link.springer.com/article/10.1007/s12369-012-0171-x

S. G. Koolagudi and K. S. Rao, "Emotion recognition from speech: a review," International Journal of Speech Technology, vol. 15, no. 2, pp. 99–117, Jun. 2012. [Online]. Available: https://doi.org/10.1007/s10772-011-9125-1

R. A. Khalil, E. Jones, M. I. Babar, T. Jan, M. H. Zafar, and T. Alhussain, "Speech Emotion Recognition Using Deep Learning Techniques: A Review," IEEE Access, vol. 7, pp. 117327–117345, 2019. [Online]. Available: https://doi.org/10.1109/ACCESS.2019.2936124

P. Zhong, D. Wang, and C. Miao, "An Affect-Rich Neural Conversational Model with Biased Attention and Weighted Cross-Entropy Loss," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, no. 01, pp. 7492–7500, Jul. 2019. [Online]. Available: https://doi.org/10.1609/aaai.v33i01.33017492

N. Majumder, S. Poria, D. Hazarika, R. Mihalcea, A. Gelbukh, and E. Cambria, "DialogueRNN: An Attentive RNN for Emotion Detection in Conversations," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, no. 01, pp. 6818–6825, Jul. 2019. [Online]. Available: https://doi.org/10.1609/aaai.v33i01.33016818

T. Hu, A. Xu, Z. Liu, Q. You, Y. Guo, V. Sinha, J. Luo, and R. Akkiraju, "Touch Your Heart: A Tone-aware Chatbot for Customer Care on Social Media," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018, pp. 1–12. [Online]. Available: https://doi.org/10.1145/3173574.3173989

A. P. Chaves and M. A. Gerosa, "How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design," International Journal of Human–Computer Interaction, vol. 37, no. 8, pp. 729–758, 2021. [Online]. Available: https://doi.org/10.1080/10447318.2020.1841438

S. B. Lovato and A. M. Piper, "Young Children and Voice Search: What We Know From Human-Computer Interaction Research," Frontiers in Psychology, vol. 10, p. 8, Jan. 2019. [Online]. Available: https://doi.org/10.3389/fpsyg.2019.00008

S. Mohseni, N. Zarei, and E. D. Ragan, "A Multidisciplinary Survey and Framework for Design and Evaluation of Explainable AI Systems," ACM Transactions on Interactive Intelligent Systems, vol. 11, no. 3-4, pp. 1–45, Nov. 2021. [Online]. Available: https://doi.org/10.1145/3387166

S. Latif, R. Rana, S. Younis, J. Qadir, and J. Epps, "Transfer Learning for Improving Speech Emotion Classification Accuracy," in Proc. Interspeech 2018, 2018. [Online]. Available: https://arxiv.org/abs/1801.06353

B. Zhang, E. M. Provost, and G. Essl, "Cross-corpus acoustic emotion recognition from singing and speaking: A multi-task learning approach," in Proceedings of the 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016. [Online]. Available: https://ieeexplore.ieee.org/document/7472790

F. Ringeval et al., "AVEC 2019 Workshop and Challenge: State-of-Mind, Detecting Depression with AI, and Cross-Cultural Affect Recognition," in Proceedings of the 9th International on Audio/Visual Emotion Challenge and Workshop (AVEC '19), 2019, pp. 3–12. [Online]. Available: https://doi.org/10.1145/3347320.3357688

A. Følstad and M. Skjuve, "Chatbots for customer service: User experience and motivation," in Proceedings of the 1st International Conference on Conversational User Interfaces (CUI '19), 2019. [Online]. Available: https://dl.acm.org/doi/10.1145/3342775.3342784

Published

09-10-2024

Issue

Vol. 10, No. 5 (2024)

Section

Research Articles

How to Cite

[1]
Ankur Binwal, “Emotional Intelligence in Voice Assistants: Advancing Human-AI Interaction”, Int. J. Sci. Res. Comput. Sci. Eng. Inf. Technol, vol. 10, no. 5, pp. 513–523, Oct. 2024, doi: 10.32628/CSEIT241051039.
