
2020, 2(6): 518-533

Published Date: 2020-12-20 DOI: 10.1016/j.vrih.2020.05.006

A multichannel human-swarm robot interaction system in augmented reality

Abstract

Background
The growing deployment of large numbers of robots has created new requirements for human-robot interaction. One of the central problems in human-swarm interaction is how to achieve natural, efficient, and accurate interaction between humans and swarm robot systems. To address this, this paper proposes a new type of natural human-swarm interaction system.
Methods
Through the cooperation of a three-dimensional (3D) gesture interaction channel and a natural language instruction channel, natural and efficient interaction between a human and swarm robots is achieved.
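As a minimal sketch of how such channel cooperation could be organized (the paper gives no code; InstructionLabel, its fields, and the channel assignments below are hypothetical), an instruction can be modeled as a slot structure that the gesture channel and the language channel each fill:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InstructionLabel:
    """Slot structure jointly filled by the two input channels (hypothetical)."""
    action: Optional[str] = None                         # filled by the language channel
    target_ids: List[int] = field(default_factory=list)  # filled by the gesture channel
    parameters: dict = field(default_factory=dict)       # optional action arguments

    def is_complete(self) -> bool:
        # An instruction becomes executable once both channels have filled their slots.
        return self.action is not None and bool(self.target_ids)

# The gesture channel contributes a selection; the language channel a verb.
label = InstructionLabel()
label.target_ids = [2, 5, 7]   # robots picked with the 3D lasso
label.action = "move"          # command recognized from speech
assert label.is_complete()
```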
Results
First, a 3D lasso technique enables batch picking of swarm robots through oriented bounding boxes. Second, control instruction labels oriented to swarm robots are defined; the 3D gesture and natural language channels are integrated through instruction label filling. Finally, understanding of natural language instructions is realized through a text classifier based on the maximum entropy model. A head-mounted augmented reality display device serves as the visual feedback channel.
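To make the batch-picking step concrete, here is a minimal sketch, not the authors' implementation, of testing robot positions against an oriented bounding box by transforming them into the box's local frame (all names are illustrative):

```python
import numpy as np

def points_in_obb(points, center, rotation, half_extents):
    """Boolean mask of which points lie inside an oriented bounding box.

    points:       (N, 3) robot positions in world coordinates
    center:       (3,)   OBB center in world coordinates
    rotation:     (3, 3) matrix whose columns are the OBB's local axes
    half_extents: (3,)   half-lengths of the box along its local axes
    """
    local = (points - center) @ rotation  # world frame -> box frame
    # Inside iff every local coordinate is within the corresponding half-extent.
    return np.all(np.abs(local) <= half_extents, axis=1)

robot_positions = np.array([[0.5, 0.2, 0.0], [3.0, 0.0, 0.0]])
mask = points_in_obb(robot_positions, center=np.zeros(3),
                     rotation=np.eye(3), half_extents=np.array([1.0, 1.0, 0.5]))
selected_ids = np.nonzero(mask)[0]  # -> array([0]); only the first robot is inside
```

For the language channel, a maximum entropy classifier is equivalent to multinomial logistic regression over text features; a toy sketch with scikit-learn (the paper's actual features, label set, and training corpus are not reproduced here) might look like:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy command corpus; a real system would train on its own instruction data.
commands = ["move to the red zone", "stop now", "form a line", "halt", "go forward"]
labels   = ["move", "stop", "form", "stop", "move"]

clf = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(commands, labels)
print(clf.predict(["please stop"]))  # expected: ['stop']
```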
Conclusions
Experiments on robot selection verify the feasibility and usability of the system.

Keywords

Human-swarm interaction; Augmented reality; Multichannel integration

Cite this article

Mingxuan CHEN, Ping ZHANG, Zebo WU, Xiaodan CHEN. A multichannel human-swarm robot interaction system in augmented reality. Virtual Reality & Intelligent Hardware, 2020, 2(6): 518-533. DOI: 10.1016/j.vrih.2020.05.006
