Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park



Screen Control System Based on Fingertip Tracking: AirFlick




This paper proposes a method for extracting fingertips from the 3D information obtained by a single depth camera in a smart-device environment, and for controlling screens without touching them directly. The method extracts fingertips from hand images and controls the screen from the fingertips' tracking information. First, we extract the hand region in real time using depth information, remove noise from the extracted region by preprocessing, and isolate the hand area through labeling. Next, we extract candidate fingertips from this area and obtain the final fingertips after a verification step. Fingertip movement is recorded over time as a graph, and movement faster than a fixed threshold generates a flick event. Finally, the key codes derived from these events are relayed to application content to control the screen.
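The flick-event step described above can be sketched as a simple speed test over successive tracked fingertip positions. This is a minimal illustration, not the paper's implementation; the frame rate, threshold value, and function names below are assumptions chosen for the sketch.

```python
# Illustrative sketch of flick detection from fingertip tracking data.
# FLICK_SPEED_THRESHOLD and fps are placeholder values, not from the paper.
FLICK_SPEED_THRESHOLD = 800.0  # horizontal speed threshold, px/s

def detect_flick(positions, fps=30.0, threshold=FLICK_SPEED_THRESHOLD):
    """Scan consecutive fingertip positions (x, y) sampled at `fps` and
    return 'LEFT' or 'RIGHT' when horizontal speed exceeds the threshold,
    else None."""
    dt = 1.0 / fps
    for (x0, _y0), (x1, _y1) in zip(positions, positions[1:]):
        vx = (x1 - x0) / dt  # horizontal velocity in px/s
        if abs(vx) > threshold:
            return "RIGHT" if vx > 0 else "LEFT"
    return None
```

In a full system, the returned direction would be mapped to a key code and relayed to the application content, as the abstract describes.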


Finger Tracking, Fingertip Detection, Air Touch, Screen Control System, Finger Touch



Cite this paper

Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park. (2017) Screen Control System Based on Fingertip Tracking: AirFlick. International Journal of Computers, 2, 69-73


Copyright © 2017 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0