PERFORMANCE COMPARISON OF OBJECT ACQUISITION USING AUTOMATIC SCANNING AND FACIAL FEATURE TRACKING

Hari Singh
Jaswinder Singh

Abstract

This paper compares object acquisition in an HCI system using two techniques: automatic scanning and facial feature tracking. In automatic scanning, the focus moves from one object to the next automatically after a predefined period called the scanning time. Automatic scanning has been implemented as a MATLAB algorithm that virtually activates the Tab key after each scanning time so that the focus moves from object to object; the user activates a selection trigger when the focus reaches the object of interest. In the facial feature tracking approach, the mouse cursor is moved in proportion to the movement of the user's face. This technique is implemented with Camera Mouse, which requires only a simple webcam: it continuously captures facial images of the user and computes the mouse cursor position from the face coordinates. The two techniques are compared on accuracy and acquisition time for the acquisition of text and graphic objects.
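
To make the two mechanisms concrete, the sketch below gives one plausible MATLAB realisation; it is an illustration under stated assumptions, not the code evaluated in the paper. Function names, the scanning period, and the gain are assumed values. The scanning half uses MATLAB's built-in timer object together with the Java java.awt.Robot class to inject Tab key presses; the trackFace function only mimics the proportional face-to-cursor mapping that Camera Mouse (a stand-alone application, not a MATLAB library) performs internally. The functions are shown in one listing for brevity, although each would normally live in its own file.

    % Illustrative sketch, not the authors' code: autoScan() advances the
    % focus by injecting Tab key presses; trackFace() mimics the kind of
    % proportional face-to-cursor mapping that Camera Mouse performs.

    function autoScan(scanTime)
    %AUTOSCAN  Press Tab every scanTime seconds to move the focus along.
        t = timer('ExecutionMode', 'fixedRate', ...   % fire repeatedly
                  'Period',        scanTime, ...      % scanning time (s)
                  'TimerFcn',      @(~, ~) pressTab());
        start(t);                % runs until stop(t)/delete(t) is called
    end

    function pressTab()
    %PRESSTAB  Inject a synthetic Tab key stroke via the Java Robot class.
        robot = java.awt.Robot;
        robot.keyPress(java.awt.event.KeyEvent.VK_TAB);    % key down
        robot.keyRelease(java.awt.event.KeyEvent.VK_TAB);  % key up
    end

    function trackFace(faceXY, restXY, gain, screenCentre)
    %TRACKFACE  Move the cursor in proportion to the face displacement.
    %   faceXY is the tracked facial-feature coordinate from the webcam,
    %   restXY its neutral position; gain is an assumed sensitivity factor.
        set(0, 'PointerLocation', screenCentre + gain * (faceXY - restXY));
    end

Calling autoScan(1.5), for example, would advance the focus every 1.5 s; the user then fires the selection trigger (a switch or key press) when the focus lands on the target object.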

Author Biography

Hari Singh, Research Scholar, IKG Punjab Technical University, Kapurthala (Punjab), India

Assistant Professor, Department of Electronics and Communication Engineering, DAV Institute of Engineering and Technology, Jalandhar (India)
