
2020, 2(4): 291-304. Published: 2020-08-20

DOI: 10.1016/j.vrih.2020.07.005

Multimodal interaction design and application in augmented reality for chemical experiment




Augmented reality (AR) classrooms have become an interesting research topic in the field of education, but current systems have some limitations. First, most researchers use cards to operate experiments, and manipulating a large number of cards is difficult and inconvenient for users. Second, most users conduct experiments only in the visual modality, and such single-modality interaction greatly reduces the users' sense of real interaction. To solve these problems, we propose a Multimodal Interaction Algorithm based on Augmented Reality (ARGEV), which combines visual and tactile feedback in AR. In addition, we design a Virtual and Real Fusion Interactive Tool Suite (VRFITS) with gesture recognition and intelligent equipment.
The ARGEV method fuses gestures, intelligent equipment, and virtual models. We use a gesture recognition model trained with a convolutional neural network to recognize gestures in AR and to trigger vibration feedback when a five-finger grasp gesture is recognized. We establish a coordinate mapping between the real hand and the virtual model to achieve the fusion of gestures and the virtual model.
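As a rough illustration of the coordinate-mapping step, the sketch below maps a tracked fingertip from camera space into the virtual model's frame using a similarity transform (scale, rotation, translation). The calibration values and function names are illustrative assumptions, not taken from the paper.

```python
def map_hand_to_virtual(p, scale, rotation, translation):
    """Map a real hand point p (camera space) into the virtual model's
    coordinate frame: v = scale * R @ p + t."""
    x = sum(rotation[0][i] * p[i] for i in range(3))
    y = sum(rotation[1][i] * p[i] for i in range(3))
    z = sum(rotation[2][i] * p[i] for i in range(3))
    return tuple(scale * c + off for c, off in zip((x, y, z), translation))

# Illustrative calibration: identity rotation, uniform scale, fixed offset.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
s = 0.5
t = (0.1, 0.0, -0.2)

fingertip = (0.2, 0.4, 0.6)  # tracked fingertip position, metres
print(map_hand_to_virtual(fingertip, s, R, t))
```

In practice the rotation and translation would come from a one-time calibration between the hand tracker and the AR scene, and the mapped point would drive the virtual model's pose each frame.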
The average accuracy rate of gesture recognition was 99.04%. We verify and apply VRFITS in the Augmented Reality Chemistry Lab (ARCL); the overall operation load in ARCL is thereby reduced by 29.42% in comparison with a traditional simulated virtual experiment.
We achieve real-time fusion of gestures, virtual models, and intelligent equipment in ARCL. Compared with the NOBOOK virtual simulation experiment, ARCL improves the users' sense of real operation and their interaction efficiency.
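The operation-load comparison above relies on a subjective workload instrument. Assuming a NASA-TLX-style weighted score (the subscale ratings and weights below are hypothetical placeholders, not the paper's data), the percentage reduction can be computed as follows.

```python
def tlx_score(ratings, weights):
    """Weighted NASA-TLX overall workload: six subscale ratings (0-100)
    weighted by pairwise-comparison counts that sum to 15."""
    assert len(ratings) == len(weights) == 6
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

def reduction_pct(baseline, new):
    """Percentage reduction of `new` relative to `baseline`."""
    return 100.0 * (baseline - new) / baseline

# Hypothetical scores: traditional virtual experiment vs. ARCL.
weights  = [3, 1, 2, 4, 3, 2]
baseline = tlx_score([70, 40, 55, 60, 65, 50], weights)
arcl     = tlx_score([45, 30, 40, 45, 48, 35], weights)
print(round(reduction_pct(baseline, arcl), 2))
```

With real participant ratings in place of the placeholders, the same computation yields the kind of overall-load reduction reported for ARCL.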
Keywords: Augmented reality; Gesture recognition; Intelligent equipment; Multimodal interaction; Augmented reality chemistry lab

Cite this article:

Mengting XIAO, Zhiquan FENG, Xiaohui YANG, Tao XU, Qingbei GUO. Multimodal interaction design and application in augmented reality for chemical experiment. Virtual Reality & Intelligent Hardware, 2020, 2(4): 291-304. DOI: 10.1016/j.vrih.2020.07.005

