Look Cursoral AI: Machine Learning Enhanced Eye-Powered Interaction

D. Sowjanya, Student, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist), Andhra Pradesh, India
Dabbala Chenga Arun Reddy, Student, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist), Andhra Pradesh, India
Guduru Satheesh, Student, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist), Andhra Pradesh, India
G. Sireesha, Student, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist), Andhra Pradesh, India
A. Yamuna, Student, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist), Andhra Pradesh, India

Vol 9 No 25 (2025): Volume 9, Special Issue of INSPIRE’25 April 2025 | Pages: 273-277

International Research Journal of Innovations in Engineering and Technology

OPEN ACCESS | Research Article | Published Date: 24-04-2025

DOI: https://doi.org/10.47001/IRJIET/2025.INSPIRE44

Abstract

This project introduces LookCursorAi, a human-computer interaction system that harnesses facial recognition technology to provide a hands-free, intuitive, and responsive user experience. Designed to enhance accessibility and inclusivity, particularly for individuals with mobility impairments, the system leverages Python, OpenCV, and pyautogui to track facial movements, detect landmarks, and recognize gestures, thereby controlling the cursor. Key features include real-time facial detection, cross-platform compatibility, performance optimization, and a user-centric interface. By eliminating dependence on the traditional mouse and keyboard, LookCursorAi fosters a more inclusive computing environment, promoting user satisfaction, efficiency, and accessibility.
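The pipeline the abstract describes (landmark detection, then mapping the tracked position to cursor coordinates) can be sketched in Python. The function name, screen resolution, and smoothing factor below are illustrative assumptions, not the authors' implementation; in the full system a library such as OpenCV would supply normalized facial-landmark coordinates each frame, and pyautogui.moveTo() would drive the cursor.

```python
# Illustrative sketch of the landmark-to-cursor mapping stage only.
# A face-tracking loop (e.g. OpenCV) would call this once per frame.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen resolution
SMOOTHING = 0.3                   # assumed smoothing factor (0 = frozen, 1 = raw)

def landmark_to_screen(nx, ny, prev=None):
    """Map a normalized landmark position (0..1, 0..1) to screen pixels,
    blending toward the previous cursor position to reduce jitter."""
    x, y = nx * SCREEN_W, ny * SCREEN_H
    if prev is not None:
        px, py = prev
        x = px + SMOOTHING * (x - px)
        y = py + SMOOTHING * (y - py)
    return (x, y)
```

For example, a landmark at the center of the frame maps to the center of the screen, `landmark_to_screen(0.5, 0.5)` → `(960.0, 540.0)`, and successive frames are smoothed so the cursor does not jump with every small head movement.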

Keywords

Human-Computer Interaction, Facial Recognition Technology, Landmark Detection, Accessibility, Hands-Free Computing, Real-time Processing, Computer Vision, Gesture Detection


Citation of this Article

D. Sowjanya, Dabbala Chenga Arun Reddy, Guduru Satheesh, G. Sireesha, & A. Yamuna. (2025). Look Cursoral AI: Machine Learning Enhanced Eye-Powered Interaction. In proceedings of the International Conference on Sustainable Practices and Innovations in Research and Engineering (INSPIRE'25), published by IRJIET, Volume 9, Special Issue of INSPIRE’25, pp. 273-277. Article DOI: https://doi.org/10.47001/IRJIET/2025.INSPIRE44
