A Survey Paper on Smart Human Activity Detection Using YOLO

Prof. Aparna Thakre, Professor, Computer Engineering, Siddhant College of Engineering Technical Campus, Sudumbare, Pune, India
Atharav Deshpande, Student, Computer Engineering, Siddhant College of Engineering Technical Campus, Sudumbare, Pune, India
Rohan Kadam, Student, Computer Engineering, Siddhant College of Engineering Technical Campus, Sudumbare, Pune, India
Ajay Ujagare, Student, Computer Engineering, Siddhant College of Engineering Technical Campus, Sudumbare, Pune, India
Abhishek Jadhav, Student, Computer Engineering, Siddhant College of Engineering Technical Campus, Sudumbare, Pune, India

Vol 7 No (2023): Volume 7, Special Issue of ICRTET- 2023 | Pages: 236-241

International Research Journal of Innovations in Engineering and Technology

OPEN ACCESS | Research Article | Published Date: 16-07-2023

DOI: IRJIET.ICRTET49

Abstract

A simple operational model could allow a single person to monitor activity in the surrounding environment to ensure security and privacy while keeping management cost and performance under control. Real-time video monitoring of this kind can serve hospitals and nursing homes caring for the sick and elderly, as well as people working in critical areas such as airports. We decided to use the YOLOv4 (You Only Look Once) algorithm, the newest and fastest of this family of algorithms, for rapid analysis of actions and accurate results when dealing with complex human behaviour. The method indicates each detected action with a bounding box. We collected 4,674 samples from different hospitals and different scenarios, making this one of the largest datasets used in this type of project. We divide the actions into three classes: standing, sitting, and walking. The model can monitor and analyse the activities of many patients or other people simultaneously. After evaluating the three classes, the model achieved an average accuracy of 94.67%.
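The post-processing the abstract describes can be sketched in a few lines: YOLO-style detections (class name, confidence, bounding box) are filtered to the three activity classes above a confidence threshold, and per-class accuracies are averaged. The detection tuples, threshold, and per-class accuracy values below are hypothetical illustrations, not the paper's actual pipeline:

```python
# Hypothetical sketch of activity-detection post-processing.
# A detection is assumed to be (class_name, confidence, (x, y, w, h)).

ACTIVITY_CLASSES = ("standing", "sitting", "walking")

def filter_detections(detections, conf_threshold=0.5):
    """Keep detections of the three activity classes above the threshold."""
    return [d for d in detections
            if d[0] in ACTIVITY_CLASSES and d[1] >= conf_threshold]

def mean_accuracy(per_class_accuracy):
    """Unweighted mean of per-class accuracies (one way to get a single figure)."""
    return sum(per_class_accuracy.values()) / len(per_class_accuracy)

if __name__ == "__main__":
    dets = [("standing", 0.91, (10, 20, 50, 120)),
            ("walking", 0.42, (60, 20, 40, 110)),   # below threshold, dropped
            ("sitting", 0.77, (130, 40, 55, 80))]
    kept = filter_detections(dets)
    print(len(kept))  # 2

    # Illustrative per-class accuracies whose mean lands near the reported 94.67%.
    acc = mean_accuracy({"standing": 0.95, "sitting": 0.94, "walking": 0.95})
    print(round(acc * 100, 2))  # 94.67
```

An unweighted mean is only one way to aggregate per-class results; a class-frequency-weighted average would differ if the dataset is imbalanced across the three activities.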



Citation of this Article

Prof. Aparna Thakre, Atharav Deshpande, Rohan Kadam, Ajay Ujagare, Abhishek Jadhav, “A Survey Paper on Smart Human Activity Detection Using YOLO,” in proceedings of the International Conference on Recent Trends in Engineering & Technology (ICRTET-2023), organized by SCOE, Sudumbare, Pune, India, published in IRJIET, Volume 7, Special Issue of ICRTET-2023, pp. 236-241, June 2023.
