
2020, 2(2): 162-174

Published Date: 2020-04-20 DOI: 10.1016/j.vrih.2020.01.003

COMTIS: Customizable touchless interaction system for large screen visualization

Abstract

Background
Large-screen visualization systems are widely used across industries to illustrate the working states of production systems. However, efficient interaction with such systems remains an open research question.
Methods
In this paper, we propose a touchless interaction system based on an RGB-D camera that uses a novel bone-length constraining method. The proposed method optimizes the joint data collected from the RGB-D camera, producing more accurate and more stable results on highly noisy data. Users can customize the system by modifying its finite-state machine and can reuse gestures across multiple scenarios, which reduces the number of gestures that must be designed and memorized (minimal sketches of both ideas follow the abstract).
Results/Conclusions
We tested the system in two cases. In the first, we describe how we iteratively improved the gesture designs and evaluated the system through a user study. In the second, we applied the system in the mining industry and conducted a user study in which participants reported that the system was easy to use.
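The paper's exact optimization is not reproduced on this page; the following is a minimal sketch of the general bone-length constraining idea, assuming per-bone lengths are first estimated from a calibration sequence and each noisy child joint is then re-projected onto a sphere of that fixed length around its parent. All names here (BONES, estimate_bone_lengths, constrain_frame) are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical skeleton fragment as (parent, child) joint-index pairs,
# e.g., shoulder -> elbow -> wrist -> hand tip.
BONES = [(0, 1), (1, 2), (2, 3)]

def estimate_bone_lengths(frames, bones):
    """Estimate each bone length as the median joint distance over a
    calibration sequence (frames: T x J x 3 positions), which is robust
    to outlier frames in noisy RGB-D data."""
    return np.array([
        np.median(np.linalg.norm(frames[:, c] - frames[:, p], axis=1))
        for p, c in bones
    ])

def constrain_frame(joints, bones, lengths):
    """Enforce fixed bone lengths on one noisy frame (joints: J x 3) by
    re-projecting each child joint onto a sphere of the estimated bone
    length around its parent, preserving the observed bone direction.
    Bones are processed parent-first so corrections propagate outward."""
    out = joints.copy()
    for (p, c), length in zip(bones, lengths):
        v = out[c] - out[p]
        n = np.linalg.norm(v)
        if n > 1e-8:  # keep the previous estimate for degenerate readings
            out[c] = out[p] + (v / n) * length
    return out
```

A per-frame projection like this is typically combined with temporal smoothing for additional stability.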
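The customizable finite-state machine is likewise only summarized in the abstract. Below is a minimal, hypothetical sketch of how a state-dependent gesture mapping lets one small gesture vocabulary serve several scenarios; the state names, gestures, and actions are illustrative, not the paper's.

```python
# Hypothetical interaction FSM: (state, gesture) -> (next_state, action).
# The "swipe" gesture is reused: it turns pages in the menu but pans the
# map view, so fewer gestures need to be designed and memorized.
TRANSITIONS = {
    ("menu", "swipe"): ("menu", "next_page"),
    ("menu", "grab"):  ("map",  "open_map"),
    ("map",  "swipe"): ("map",  "pan"),
    ("map",  "grab"):  ("menu", "close_map"),
}

def step(state, gesture):
    """Advance the FSM; an unrecognized (state, gesture) pair leaves the
    state unchanged and triggers no action."""
    return TRANSITIONS.get((state, gesture), (state, None))

state = "menu"
state, action = step(state, "grab")   # -> ("map", "open_map")
state, action = step(state, "swipe")  # -> ("map", "pan")
```

Customization then amounts to editing the transition table rather than adding new gestures.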

Keywords

Human-computer interaction; RGB-D camera; Touchless interaction; Gesture recognition

Cite this article

Jiaxin LIU, Hongxin ZHANG, Chuankang LI. COMTIS: Customizable touchless interaction system for large screen visualization. Virtual Reality & Intelligent Hardware, 2020, 2(2): 162-174. DOI: 10.1016/j.vrih.2020.01.003

