Emotion Based Crime Track
DOI: https://doi.org/10.32628/CSEIT2410269

Abstract
Human emotions are conveyed primarily through nonverbal cues such as facial expressions and body language, and their nuanced nature makes them difficult to decipher. An angry individual may avoid eye contact, clench their fists, or furrow their brow, while a joyful person may show a wide smile, relaxed posture, and lively gestures. Artificial Intelligence (AI) techniques can identify the same subtle cues in the facial expressions of potential criminals. The pipeline involves several steps: capturing the face, detecting it and testing it for various expressions, performing facial recognition, and finally analyzing facial ratios to identify criminals or specific individuals from behavioral cues and patterns. The Python programming language, together with Convolutional Neural Network (CNN) algorithms, is used to analyze a person's emotions with high accuracy and efficiency. This project introduces Facial Emotion Recognition using Convolutional Neural Networks (FERC), which operates in two phases within the CNN framework: the first removes the background from the image, and the second extracts and analyzes the fine details of the facial features. The detected emotion is then presented in a graphical report that charts emotions over time, giving researchers, psychologists, and law enforcement agencies insight into a subject's emotional state and responses across contexts.
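The two-phase FERC pipeline described above can be sketched in miniature. This is only an illustrative Python/NumPy toy, not the paper's implementation: the intensity threshold standing in for phase one's learned background removal, and the fixed edge-detection kernel standing in for phase two's trained feature filters, are both assumptions made for demonstration.

```python
import numpy as np

def remove_background(face, threshold=0.2):
    """Phase 1 (sketch): suppress low-intensity background pixels.
    A trained CNN would learn this mask; a simple threshold is a
    stand-in assumption here."""
    return face * (face > threshold)

def conv2d(img, kernel):
    """Naive 'valid' 2-D convolution, the core CNN operation."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Phase 2 (sketch): a hand-picked edge kernel approximates the
# facial-feature filters a trained CNN would learn from data.
EDGE_KERNEL = np.array([[-1, -1, -1],
                        [-1,  8, -1],
                        [-1, -1, -1]], dtype=float)

def extract_features(face):
    foreground = remove_background(face)          # phase 1
    return np.maximum(conv2d(foreground, EDGE_KERNEL), 0)  # phase 2 + ReLU

rng = np.random.default_rng(0)
face = rng.random((48, 48))       # stand-in for a 48x48 grayscale face crop
features = extract_features(face)
print(features.shape)             # (46, 46)
```

In a real system the resulting feature maps would feed further convolutional and fully connected layers that classify the emotion, and the per-frame predictions would be aggregated into the graphical report described above.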
License
Copyright (c) 2024 International Journal of Scientific Research in Computer Science, Engineering and Information Technology
This work is licensed under a Creative Commons Attribution 4.0 International License.