Emotion Based Crime Track

Authors

  • Saiprasad V, Department of Computer Science, Yuvakshetra Institute of Management Studies, Palakkad, Kerala, India
  • Sreyas J, Department of Computer Science, Yuvakshetra Institute of Management Studies, Palakkad, Kerala, India
  • Megha S, Department of Computer Science, Yuvakshetra Institute of Management Studies, Palakkad, Kerala, India
  • Akshay A, Assistant Professor, Department of Computer Science, Yuvakshetra Institute of Management Studies, Palakkad, Kerala, India
  • Jibin Joy, Assistant Professor, Department of Computer Science, Yuvakshetra Institute of Management Studies, Palakkad, Kerala, India

DOI:

https://doi.org/10.32628/CSEIT2410269

Keywords:

Facial Emotion Recognition, Convolutional Neural Networks, Deep Learning, Artificial Intelligence, Facial Expressions, Crime Tracking

Abstract

Human emotions are conveyed primarily through nonverbal communication such as facial expressions and body language, and their nuanced nature can make them challenging to decipher. An angry individual, for instance, might avoid eye contact, clench their fists, or display a furrowed brow, while a person experiencing joy may exhibit a wide smile, relaxed posture, and lively gestures. The same subtleties in facial expression and emotion can be identified in potential criminals using Artificial Intelligence (AI) techniques. The methodology involves several steps: the face is first captured, then detected and tested for various expressions, followed by facial recognition, and finally an analysis of facial ratios to identify criminals or specific individuals based on behavioral cues and patterns. The Python programming language, in conjunction with Convolutional Neural Network (CNN) algorithms, is employed to analyze a person's emotions with high accuracy and efficiency. This project introduces a technique called Facial Emotion Recognition using Convolutional Neural Networks (FERC). FERC operates in two phases within the CNN framework: the first removes the background from the image, while the second focuses on the details of the facial features, allowing the system to extract and analyze them with precision. The result is the detection of the subject's emotion, presented in a graphical report of the detected emotions over time. This report provides insight into the individual's emotional state and helps researchers, psychologists, and law enforcement agencies better understand human behavior and emotional responses in various contexts.
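
The two-phase pipeline described in the abstract can be illustrated with a minimal Python sketch. The sketch below assumes OpenCV's bundled Haar cascade for face detection (cropping the detected face serves as the background-removal phase) and a small Keras CNN for the emotion-classification phase; the network architecture, emotion labels, and file names are illustrative assumptions and do not reproduce the authors' FERC implementation.

# Illustrative sketch only: two-stage pipeline in the spirit of FERC --
# face detection / background cropping, then a small CNN emotion classifier.
# The cascade file ships with OpenCV; the CNN layers, emotion labels, and
# input file name are assumptions, not the authors' implementation.

import cv2
import numpy as np
from tensorflow.keras import layers, models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# Phase 1: remove the background by detecting and cropping the face region.
def extract_face(image_bgr, size=(48, 48)):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                      # first detected face
    face = cv2.resize(gray[y:y + h, x:x + w], size)
    return face.astype("float32") / 255.0      # normalised 48x48 crop

# Phase 2: a small CNN that maps the cropped face to an emotion label.
def build_emotion_cnn(input_shape=(48, 48, 1), n_classes=len(EMOTIONS)):
    return models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

def predict_emotion(model, image_bgr):
    face = extract_face(image_bgr)
    if face is None:
        return None
    probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))]

if __name__ == "__main__":
    model = build_emotion_cnn()            # train or load weights before use
    frame = cv2.imread("subject.jpg")      # hypothetical input image
    print(predict_emotion(model, frame))

In practice, the network would be trained on a labelled facial-expression dataset before the prediction step, and per-frame predictions could be aggregated over time to produce the kind of graphical report the abstract describes.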

References

https://sciendo.com/article/10.1515/aucts-2015-0089

https://ieeexplore.ieee.org/document/8776617

G. Tonguç and B. O. Ozkara, “Automatic recognition of student emotions from facial expressions during a lecture,” Computers & Education, vol. 148, Article ID 103797, 2020. DOI: https://doi.org/10.1016/j.compedu.2019.103797

https://www.sciencedirect.com/science/article/pii/S235291482030201X

https://analyticsindiamag.com/my-first-cnn-project-emotion-detection-using-convolutional-neural-network-with-tpu/

L. Shen, M. Wang, and R. Shen, “Affective e-Learning: using “emotional” data to improve learning in pervasive learning environment,” Educational Technology & Society, vol. 12, no. 2, pp. 176–189, 2009.

D. Yang, Abeer Alsadoon, and P. W. C. Prasad (School of Computing and Mathematics, Charles Sturt University, Sydney), A. K. Singh (Department of Computer Applications, National Institute of Technology, Haryana, India), and A. Elchouemi (Walden University, USA).

https://ieeexplore.ieee.org/document/8710406

M.-H. Yang, D. J. Kriegman, and N. Ahuja, “Detecting Faces in Images: A Survey,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002.

Published

07-03-2024

Issue

Vol. 10, No. 2 (March 2024)

Section

Research Articles

How to Cite

[1] Saiprasad V, Sreyas J, Megha S, Akshay A, and Jibin Joy, “Emotion Based Crime Track”, Int. J. Sci. Res. Comput. Sci. Eng. Inf. Technol., vol. 10, no. 2, pp. 439–449, Mar. 2024, doi: 10.32628/CSEIT2410269.
