Detection and Classification of Human Action Events from Captured Video Streams

Authors

  • Dr. Puttegowda D, Department of Computer Science Engineering, ATME College of Engineering, Mysuru, Karnataka, India
  • Suma A P, Department of Electrical and Electronics, Vidya Varadhaka College of Engineering, Mysuru, Karnataka, India

Keywords:

Background Subtraction, Edge Tracking, Classification, Spatio-Temporal Interest Points.

Abstract

Detecting humans and recognizing their actions from captured video streams is a difficult and challenging task in the field of image processing. Human action recognition is complex due to the variability in shape and articulation of the human body, motion in the background scene, lighting conditions, and occlusion. Human actions are recognized by tracking a selected object over consecutive frames of gray-scale image sequences. First, the background motion of the input video stream is subtracted and binary images are constructed; the object to be monitored is then selected by enclosing the required pixels within a bounding rectangle using spatio-temporal interest points (MoSIFT). The selected foreground pixels within the bounding rectangle are tracked with an edge-tracking algorithm over the consecutive gray-scale frames. Features such as horizontal stride (HS) and vertical distance (VD) are extracted during tracking, and the feature values of the current frame are subtracted from those of the previous frame to capture the motion. The resulting differences are then fed to a K-Nearest Neighbour classifier to recognize the human action using a linear prediction technique. This methodology finds application wherever monitoring of human actions is required, such as shop, city, and airport surveillance and other places where security is the prime concern.
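As a rough illustration of the pipeline described above, the sketch below (Python, assuming OpenCV 4 and scikit-learn are available) subtracts the background, binarises the mask, encloses the largest foreground region in a bounding rectangle, differences the horizontal stride (HS) and vertical distance (VD) of that rectangle between consecutive gray-scale frames, and feeds a per-clip descriptor to a K-Nearest Neighbour classifier. All function names, thresholds, and the averaged descriptor are assumptions of this sketch, not the authors' implementation.

    import cv2
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier


    def foreground_bounding_box(gray_frame, back_sub, min_area=500):
        """Subtract the background, binarise the mask, and return the bounding
        rectangle of the largest moving region (the object to be monitored)."""
        mask = back_sub.apply(gray_frame)                    # background subtraction
        _, binary = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        contours = [c for c in contours if cv2.contourArea(c) >= min_area]
        if not contours:
            return None
        return cv2.boundingRect(max(contours, key=cv2.contourArea))  # (x, y, w, h)


    def extract_sequence_features(video_path):
        """Track the foreground box over consecutive gray-scale frames and collect
        frame-to-frame differences of horizontal stride (HS) and vertical
        distance (VD), summarised as one fixed-length descriptor per clip."""
        cap = cv2.VideoCapture(video_path)
        back_sub = cv2.createBackgroundSubtractorMOG2()
        prev_box, diffs = None, []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # gray-scale frame
            box = foreground_bounding_box(gray, back_sub)
            if box is not None and prev_box is not None:
                hs = box[2] - prev_box[2]                    # change in horizontal stride
                vd = box[3] - prev_box[3]                    # change in vertical distance
                diffs.append((hs, vd))
            if box is not None:
                prev_box = box
        cap.release()
        diffs = np.asarray(diffs, dtype=float)
        return diffs.mean(axis=0) if len(diffs) else np.zeros(2)


    # Classification stage: a K-Nearest Neighbour model over the HS/VD descriptors.
    # `train_clips` and `train_labels` are hypothetical placeholders for a labelled
    # set of action clips such as the Weizmann sequences cited in the references.
    # X_train = np.vstack([extract_sequence_features(p) for p in train_clips])
    # knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, train_labels)
    # predicted_action = knn.predict([extract_sequence_features("query_clip.avi")])

The MoSIFT interest-point selection and the edge-tracking step are replaced here by simple contour-based boxes, so the sketch only illustrates the overall flow of detection, feature differencing, and classification rather than reproducing the reported method.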

References

  1. Puttegowda D and Dr. M C Padma, "A Framework for Event Classification from Video Sequences using Bayesian Neural Network", Communications on Applied Electronics (CAE), Volume 05, Issue 02, pp. 1-5, DOI: 10.5120/cae2016652229, ISSN: 2394-4714, May 2016.
  2. Murat Ekinci and Eyup Gedikli (2005), "Silhouette Based Human Motion and Action Detection and Analysis for Real-Time Automated Video Surveillance", Turk J Elec Engin, Volume 13, No. 2.
  3. Puttegowda D and Dr. M C Padma, "Human Motion Detection and Recognising their Actions from the Video Streams", International Conference on Informatics and Analytics (ICIA'16), Pondicherry, DOI: http://dx.doi.org/10.1145/2980258.2980290, August 2016.
  4. Nazli Ikizler and Pinar Duygulu (1999), "Human Action Recognition Using Distribution of Oriented Rectangular Patches", Computer Vision and Pattern Recognition (CVPR'05), Volume 1, pp. 886-893.
  5. Nazli Ikizler and Pinar Duygulu (2009), "Histograms of oriented rectangles: A new pose descriptor for human action recognition", Image and Vision Computing, Volume 27, Issue 10, pp. 1515-1526.
  6. Chunfeng Yuan, Weiming Hu, Xi Li, Stephen Maybank and Guan Luo (2009), "Human Action Recognition under Log-Euclidean Riemannian Metric", ACCV 2009, 9th Asian Conference on Computer Vision.
  7. Haiying Guan and Ronda Venkateswarlu, "Adaptive Multimodal Algorithm on Human Detection and Tracking".
  8. Weizmann dataset: http://www.wisdom.weizmann.ac.il/~vision/SpaceTimeActions.html.
  9. R. Venkatesh Babu and R. Hariharan (2009), "Image processing, video surveillance, and security related applications using parallel machines", NAL-PD-FS-0916, National Aerospace Laboratories.
  10. Puttegowda D, Deepak N. A and Rajesh Shukla, "Robust Image Transmission over Noisy Channel using Independent Component Analysis", International Journal of Advanced Networking & Applications, Volume 01, Issue 06, pp. 347-352, 2010.

Published

2018-02-28

Section

Research Articles

How to Cite

[1]
Dr. Puttegowda D and Suma A P, "Detection and Classification of Human Action Events from Captured Video Streams", International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN: 2456-3307, Volume 3, Issue 1, pp. 1870-1875, January-February 2018.