
2019, 1(3): 303-315

Published Date: 2019-06-20 DOI: 10.3724/SP.J.2096-5796.2019.0013

Influence of multi-modality on moving target selection in virtual reality

Abstract

Background
Owing to recent advances in virtual reality (VR) technologies, effective user interaction with dynamic content in 3D scenes has become a research hotspot. Moving target selection is a basic interactive task, and research on user performance in such tasks is significant for user interface design in VR. Unlike existing studies on static target selection, moving target selection in VR is affected by changes in target speed, angle, and size, and research on some of these key factors is still lacking.
Methods
This study designs an experimental scenario in which users play badminton in VR. Seven kinds of modal cues (vision, audio, haptics, and their combinations), five moving speeds, and four serving angles are introduced, and the effects of these factors on user performance and subjective feelings during moving target selection in VR are studied.
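As a minimal sketch of the factorial design described above, the following Python snippet enumerates the condition space: seven modality conditions (all non-empty combinations of vision, audio, and haptics), five speeds, and four serving angles. The speed and angle labels are hypothetical placeholders; the actual values used in the experiment are not given in the abstract, and this is not the authors' experiment code.

```python
from itertools import combinations, product

# The three sensory channels named in the Methods; their 7 non-empty
# combinations form the modality conditions.
SENSES = ["vision", "audio", "haptics"]
MODALITY_CONDITIONS = [
    "+".join(c) for r in (1, 2, 3) for c in combinations(SENSES, r)
]  # 7 combinations in total

SPEED_LEVELS = range(1, 6)  # placeholder labels for the 5 moving speeds
ANGLE_LEVELS = range(1, 5)  # placeholder labels for the 4 serving angles

# Full factorial condition space: 7 x 5 x 4 = 140 factor-level combinations.
conditions = list(product(MODALITY_CONDITIONS, SPEED_LEVELS, ANGLE_LEVELS))
print(len(conditions))  # 140
```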
Results
The results show that the moving speed of the shuttlecock has a significant impact on user performance. The serving angle has a significant impact on the hitting rate but no significant impact on the hitting distance. Under combined modalities, moving target acquisition performance is mainly influenced by vision, and adding additional modalities can improve user performance. Although the hitting distance increases under the trimodal condition, the hitting rate decreases.
Conclusion
This study analyses the results on user performance and subjective perception, and then provides suggestions on combining modality cues in different scenarios.

Keywords

Multimodal; Moving target selection; Virtual reality

Cite this article

Yang LI, Dong WU, Jin HUANG, Feng TIAN, Hong'an WANG, Guozhong DAI. Influence of multi-modality on moving target selection in virtual reality. Virtual Reality & Intelligent Hardware, 2019, 1(3): 303-315 DOI:10.3724/SP.J.2096-5796.2019.0013

