EXPLORING GAZE-GESTURE INTERACTION ON THE WEB: A COMPARISON WITH MOUSE INPUT FOR OBJECT MANIPULATION

Author(s)

DOI:

https://doi.org/10.32782/IT/2024-3-4

Keywords:

gaze-gesture interaction, mouse interaction, web-based interaction, object manipulation, human-computer interaction, gaze tracking, accessible technology, hand gesture recognition

Abstract

The purpose of this study is to implement and evaluate a web-based gaze-gesture interaction method for object manipulation using a standard web camera. The method combines gaze tracking for object selection with hand gestures for natural manipulation tasks such as rotating, scaling, and dragging. Unlike implementations of such interaction that require specialized hardware, this method relies on widely available technology, making advanced interaction techniques more accessible. The scientific novelty lies in developing a gaze-gesture interaction system that operates entirely on a web platform using standard hardware, removing the need for expensive, specialized equipment and enabling broader adoption. The methodology involved building a web-based system that uses computer vision algorithms for real-time gaze tracking and gesture recognition. A user study was conducted in which participants completed object manipulation tasks using both gaze-gesture input and traditional mouse input, with task completion times recorded and analyzed. Conclusion. The study shows that gaze-gesture interaction is particularly effective for tasks requiring simultaneous actions, such as rotating and scaling objects, and outperforms mouse input in these scenarios. While mouse input remains more efficient for simpler tasks, gaze-gesture interaction offers strong potential for enhancing complex task interactions on web platforms, contributing to more accessible and intuitive input methods.
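The selection-plus-manipulation pipeline described above can be sketched in plain JavaScript. This is a minimal illustration, not the authors' implementation: it assumes gaze coordinates and hand landmark positions are supplied by webcam-based trackers such as WebGazer and the MediaPipe Hand Landmarker (both cited in the references), and all function and field names are hypothetical.

```javascript
// Gaze selects an object via hit-testing; hand landmarks drive
// simultaneous rotate/scale deltas. Names are illustrative only.

// Return the first object whose bounding box contains the gaze point,
// or null if the gaze falls on empty space.
function selectByGaze(gaze, objects) {
  return objects.find(o =>
    gaze.x >= o.x && gaze.x <= o.x + o.w &&
    gaze.y >= o.y && gaze.y <= o.y + o.h) ?? null;
}

// Derive rotation (radians) and scale (ratio) from the vector between
// two tracked hand points (e.g., thumb and index fingertips) in
// consecutive frames, so both transforms update in a single step.
function manipulationDelta(prev, curr) {
  const v0 = { x: prev.b.x - prev.a.x, y: prev.b.y - prev.a.y };
  const v1 = { x: curr.b.x - curr.a.x, y: curr.b.y - curr.a.y };
  const len = v => Math.hypot(v.x, v.y);
  return {
    rotation: Math.atan2(v1.y, v1.x) - Math.atan2(v0.y, v0.x),
    scale: len(v1) / len(v0),
  };
}
```

Deriving rotation and scale from one two-point hand feature is what lets both transformations be applied at once, which is the kind of simultaneous action where the study found gaze-gesture input to outperform the mouse.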

References

Argelaguet F., Andujar C. A survey of 3D object selection techniques for virtual environments. Computers & Graphics, 2013. 37(3), 121–136. https://doi.org/10.1016/j.cag.2012.12.003

Chatterjee I., Xiao R., Harrison C. Gaze+gesture: Expressive, precise and targeted free-space interactions. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, 2015. 131–138. https://doi.org/10.1145/2818346.2820752

Google. (n.d.). MediaPipe hand landmarker. URL: https://ai.google.dev/edge/mediapipe/solutions/vision/hand_landmarker (accessed June 26, 2024)

Hales J. Interacting with objects in the environment by gaze and hand gestures. 2013. URL: https://api.semanticscholar.org/CorpusID:52206471

Lystbæk M. N., Rosenberg P., Pfeuffer K., Grønbæk J. E., Gellersen H. Gaze-hand alignment: Combining eye gaze and mid-air pointing for interacting with menus in augmented reality. Proceedings of the ACM on Human-Computer Interaction, 6(ETRA), 2022. 1–18. https://doi.org/10.1145/3530886

Papoutsaki A., Sangkloy P., Laskey J., Daskalova N., Huang J., Hays J. WebGazer: Scalable webcam eye tracking using user interactions. Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI), 2016. 3839–3845.

Pfeuffer K. Design principles & issues for gaze and pinch interaction. arXiv preprint arXiv:2401.10948, 2024. URL: https://api.semanticscholar.org/CorpusID:267069101

Pfeuffer K., Mayer B., Mardanbegi D., Gellersen H. Gaze + pinch interaction in virtual reality. Proceedings of the 5th Symposium on Spatial User Interaction, 2017. 99–108. https://doi.org/10.1145/3131277.3132180

Reddy N. L., Murugeswari R., Imran Md., Subhash N., Reddy N. V. K., Adarsh N. B. Virtual mouse using hand and eye gestures. 2023 International Conference on Data Science, Agents & Artificial Intelligence (ICDSAAI), 2023. 1–5. https://doi.org/10.1109/ICDSAAI59313.2023.10452550

Rozado D., Niu J., Lochner M. Fast human-computer interaction by combining gaze pointing and face gestures. ACM Transactions on Accessible Computing, 2017. 10(3), 1–18. https://doi.org/10.1145/3075301

Ryu K., Lee J.-J., Park J.-M. GG interaction: A gaze–grasp pose interaction for 3D virtual object selection. Journal on Multimodal User Interfaces, 2019. 13(4), 383–393. https://doi.org/10.1007/s12193-019-00305-y

Slambekova D., Bailey R., Geigel J. Gaze and gesture based object manipulation in virtual worlds. Proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology, 2012. 203–204. https://doi.org/10.1145/2407336.2407380

Wachs J. P., Kölsch M., Stern H., Edan Y. Vision-based hand-gesture applications. Communications of the ACM, 2011. 54(2), 60–71. https://doi.org/10.1145/1897816.1897838

Ware C., Mikaelian H. H. An evaluation of an eye tracker as a device for computer input. Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface, 1987. 183–188. https://doi.org/10.1145/29933.275627


Published

2024-12-06