
2020, 2(2): 153-161

Published Date: 2020-04-20  DOI: 10.1016/j.vrih.2020.02.001

On attaining user-friendly hand gesture interfaces to control existing GUIs

Abstract

Background
Hand gesture interfaces are dedicated programs that principally perform hand tracking and hand gesture prediction to provide alternative controls and interaction methods. They take advantage of one of the most natural ways of interaction and communication, offering a novel input modality and showing great potential in the field of human-computer interaction. Developing a flexible and rich hand gesture interface is known to be a time-consuming and arduous task. Previously published studies have demonstrated the significance of the finite-state-machine (FSM) approach for mapping detected gestures to GUI actions.
Methods
In our hand gesture interface, we broadened the FSM approach by utilizing gesture-specific attributes, such as the distance between hands, the distance from the camera, and the time of occurrence, to enable users to perform unique GUI actions. These attributes are obtained from hand gestures detected by the RealSense SDK employed in our hand gesture interface. By means of these gesture-specific attributes, users can activate static gestures and perform them as dynamic gestures. We also provided supplementary features to enhance the efficiency, convenience, and user-friendliness of our hand gesture interface. Moreover, we developed a complementary application for recording hand gestures by capturing hand keypoints in depth and color images to facilitate the generation of hand gesture datasets.
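The attribute-driven FSM described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the gesture labels (`open_palm`, `fist`), thresholds, and GUI actions are hypothetical, and the events stand in for detections that the RealSense SDK would supply.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureEvent:
    """A detected static gesture plus gesture-specific attributes."""
    name: str               # hypothetical gesture label, e.g. "open_palm"
    hand_distance: float    # distance between the two hands (metres)
    camera_distance: float  # distance of the hand from the camera (metres)
    hold_time: float        # how long the gesture has been held (seconds)

class GestureFSM:
    """Finite-state machine mapping gesture events to GUI actions.

    A static gesture first *activates* the interface (a state change);
    once armed, attribute changes turn static gestures into dynamic ones.
    """
    HOLD_THRESHOLD = 0.8   # seconds a gesture must be held to activate
    SPREAD_THRESHOLD = 0.4 # hand separation that triggers a dynamic action

    def __init__(self) -> None:
        self.state = "idle"

    def feed(self, ev: GestureEvent) -> Optional[str]:
        """Advance the FSM with one detection; return a GUI action or None."""
        if self.state == "idle":
            # Holding a static gesture long enough activates the interface.
            if ev.name == "open_palm" and ev.hold_time >= self.HOLD_THRESHOLD:
                self.state = "armed"
            return None
        if self.state == "armed":
            if ev.name == "fist":
                self.state = "idle"
                return "click"       # brief static gesture -> discrete action
            if ev.name == "open_palm" and ev.hand_distance > self.SPREAD_THRESHOLD:
                return "zoom_in"     # attribute makes the gesture dynamic
            if ev.name == "none":
                self.state = "idle"  # hands lost: disarm
            return None
        return None
```

In this sketch, the same static gesture ("open_palm") yields different outcomes depending on its attributes: held briefly it does nothing, held long it arms the FSM, and with the hands spread apart it becomes a continuous zoom action, which mirrors the activation mechanism described above.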
Results
We conducted a small-scale user survey with fifteen subjects to test and evaluate our hand gesture interface. Anonymous feedback obtained from the users indicates that our hand gesture interface is sufficiently simple and self-explanatory to use. In addition, we received constructive feedback about minor flaws in the responsiveness of the interface.
Conclusions
We proposed a hand gesture interface along with key concepts to attain user-friendliness and effectiveness in the control of existing GUIs.

Keywords

Human-computer interaction; Gesture recognition; Computer vision applications

Cite this article

Egemen ERTUGRUL, Ping LI, Bin SHENG. On attaining user-friendly hand gesture interfaces to control existing GUIs. Virtual Reality & Intelligent Hardware, 2020, 2(2): 153-161. DOI: 10.1016/j.vrih.2020.02.001


Related

1. Yuanyuan SHI, Yunan LI, Xiaolong FU, Kaibin MIAO, Qiguang MIAO. Review of dynamic gesture recognition. Virtual Reality & Intelligent Hardware, 2021, 3(3): 183-206

2. Chongyang SUN, Weizhi NAI, Xiaoying SUN. Tactile sensitivity in ultrasonic haptics: Do different parts of hand and different rendering methods have an impact on perceptual threshold? Virtual Reality & Intelligent Hardware, 2019, 1(3): 265-275

3. Yang LI, Jin HUANG, Feng TIAN, Hong-An WANG, Guo-Zhong DAI. Gesture interaction in virtual reality. Virtual Reality & Intelligent Hardware, 2019, 1(1): 84-112

4. Wanlu ZHENG, Wenming ZHENG, Yuan ZONG. Multi-scale discrepancy adversarial network for cross-corpus speech emotion recognition. Virtual Reality & Intelligent Hardware, 2021, 3(1): 65-75

5. Mohammad Mahmudul ALAM, S. M. Mahbubur RAHMAN. Affine transformation of virtual 3D object using 2D localization of fingertips. Virtual Reality & Intelligent Hardware, 2020, 2(6): 534-555

6. Meng SONG, Shiyi LIU, Ge YU, Lili GUO, Dangxiao WANG. An immersive space liquid bridge experiment system with gesture interaction and vibrotactile feedback. Virtual Reality & Intelligent Hardware, 2019, 1(2): 219-232

7. Jiaxin LIU, Hongxin ZHANG, Chuankang LI. COMTIS: Customizable touchless interaction system for large screen visualization. Virtual Reality & Intelligent Hardware, 2020, 2(2): 162-174

8. Mengting XIAO, Zhiquan FENG, Xiaohui YANG, Tao XU, Qingbei GUO. Multimodal interaction design and application in augmented reality for chemical experiment. Virtual Reality & Intelligent Hardware, 2020, 2(4): 291-304

9. Benjia ZHOU, Jun WAN, Yanyan LIANG, Guodong GUO. Adaptive cross-fusion learning for multi-modal gesture recognition. Virtual Reality & Intelligent Hardware, 2021, 3(3): 235-247

10. Xuezhi YAN, Qiushuang WU, Xiaoying SUN. Electrostatic tactile representation in multimedia mobile terminal. Virtual Reality & Intelligent Hardware, 2019, 1(2): 201-218