LookCursorAI: Machine Learning Enhanced Eye-Powered Interaction

Abstract

This project introduces LookCursorAI, a human-computer interaction system that harnesses facial recognition technology to provide a hands-free, intuitive, and responsive user experience. Designed to enhance accessibility and inclusivity, particularly for individuals with mobility impairments, the system leverages Python, OpenCV, and PyAutoGUI to track facial movements, detect landmarks, and recognize gestures that control the cursor. Key features include real-time facial detection, cross-platform compatibility, performance optimization, and a user-centric interface. By eliminating dependence on the traditional mouse and keyboard, LookCursorAI fosters a more inclusive computing environment, improving user satisfaction, efficiency, and accessibility.
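The abstract describes mapping detected facial landmarks to cursor positions. A minimal sketch of that mapping step is below; the landmark format (normalized coordinates in [0, 1]), the screen resolution, and the exponential-smoothing step are assumptions not specified in the paper, and the camera/landmark pipeline (e.g. MediaPipe Face Mesh feeding `pyautogui.moveTo`) is shown only as comments.

```python
def landmark_to_screen(x_norm, y_norm, screen_w=1920, screen_h=1080):
    """Map a normalized facial-landmark coordinate to screen pixels.

    Assumes landmarks arrive normalized to [0, 1]; values outside
    that range are clamped so the cursor stays on screen.
    """
    x = min(max(x_norm, 0.0), 1.0) * screen_w
    y = min(max(y_norm, 0.0), 1.0) * screen_h
    return int(x), int(y)


class CursorSmoother:
    """Exponential moving average to reduce cursor jitter between frames."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # higher alpha = more responsive, more jitter
        self.x = None
        self.y = None

    def update(self, x, y):
        if self.x is None:  # first frame: no history to smooth against
            self.x, self.y = float(x), float(y)
        else:
            self.x += self.alpha * (x - self.x)
            self.y += self.alpha * (y - self.y)
        return int(self.x), int(self.y)


# In the full system, each camera frame might be processed roughly as:
#   ok, frame = capture.read()                 # cv2.VideoCapture(0)
#   landmarks = face_mesh.process(frame)       # e.g. MediaPipe Face Mesh
#   px, py = landmark_to_screen(lm.x, lm.y)
#   pyautogui.moveTo(*smoother.update(px, py))
```

Smoothing before `moveTo` matters in practice: raw landmark positions fluctuate frame to frame, and moving the OS cursor with unfiltered values makes it shake.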

Country: India

1. D. Sowjanya, 2. Dabbala Chenga Arun Reddy, 3. Guduru Satheesh, 4. G. Sireesha, 5. A. Yamuna

  1-5. Students, Dept. of CSE, Mother Theresa Institute of Engineering and Technology, Chittoor (dist.), Andhra Pradesh, India

IRJIET, Volume 9, Special Issue of INSPIRE'25, April 2025, pp. 273-277

https://doi.org/10.47001/IRJIET/2025.INSPIRE44
