
2019, 1(1): 1-9

Published Date: 2019-2-20 DOI: 10.3724/SP.J.2096-5796.2018.0007

Design of finger gestures for locomotion in virtual reality

Abstract

Background
Within a virtual environment (VE), the control of locomotion (i.e., self-travel) is critical for creating a realistic and functional experience. Usually, while a head-mounted display (HMD) is worn, the direction of locomotion is determined by the direction the head is pointing, and forward or backward motion is controlled with a hand-held controller. However, hand-held devices can be difficult to use while the eyes are covered by an HMD. Free-hand gestures, tracked with a camera or a hand data glove, have the advantage of eliminating the need to look at the hand controller, but the design of hand or finger gestures for this purpose has not been well developed.
Methods
This study used a depth-sensing camera to track fingertip location (curling and straightening the fingers), which was converted to forward or backward self-travel in the VE. Fingertip position was converted to self-travel velocity using a mapping function with three parameters: a region of zero velocity (dead zone) around the relaxed hand position; a linear relationship between fingertip position and velocity (slope, or β) beginning at the edge of the dead zone; and an exponential, rather than linear, mapping of fingertip position to velocity (exponent). Wearing an HMD, participants moved forward along a virtual road and stopped at a target on the road by controlling self-travel velocity with finger flexion and extension. Each of the three mapping-function parameters was tested at three levels. Outcomes measured included usability ratings, fatigue, nausea, and time to complete the tasks.
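The three-parameter mapping described above can be sketched in code. This is a minimal illustration, not the authors' implementation: the function name, parameter values, and units are hypothetical, assuming fingertip displacement is measured from the relaxed hand position, velocity is zero inside the dead zone, and the slope β scales a power of the displacement beyond the dead-zone edge (exponent = 1.0 recovers the linear case).

```python
def fingertip_to_velocity(displacement, dead_zone=0.01, beta=2.0, exponent=1.0):
    """Map fingertip displacement from the relaxed hand position to
    self-travel velocity (hypothetical units: meters in, m/s out).

    dead_zone -- half-width of the zero-velocity region around rest
    beta      -- slope applied beyond the dead-zone edge
    exponent  -- power applied to displacement (1.0 = linear mapping)
    """
    magnitude = abs(displacement) - dead_zone
    if magnitude <= 0:
        return 0.0  # inside the dead zone: no self-travel
    direction = 1.0 if displacement > 0 else -1.0  # flexion vs. extension
    return direction * beta * magnitude ** exponent
```

With these illustrative defaults, a 3 cm flexion yields `fingertip_to_velocity(0.03)` = 2.0 × (0.03 − 0.01)^1.0 = 0.04, while a 0.5 cm movement stays inside the dead zone and returns 0.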
Results
Twenty subjects participated, but five did not complete the study due to nausea. The size of the dead zone had little effect on performance or usability. Subjects preferred lower β values, which were associated with better subjective ratings of control and reduced time to complete the task, especially for large targets. Exponent values of 1.0 or greater were preferred and reduced the time to complete the task, especially for small targets.
Conclusions
Small finger movements can be used to control the velocity of self-travel in a VE. The functions used to convert fingertip position to movement velocity influence usability and performance.

Keyword

Human-computer interaction; Virtual environment; Gesture design

Cite this article

Rachel HUANG, Carisa HARRIS-ADAMSON, Dan ODELL, David REMPEL. Design of finger gestures for locomotion in virtual reality. Virtual Reality & Intelligent Hardware, 2019, 1(1): 1-9 DOI:10.3724/SP.J.2096-5796.2018.0007

