
TABLE OF CONTENTS

2022, 4(2): 1-2

Published Date: 2022-04-20    DOI: 10.3724/SP.J.2096-5796.2022.04.02

Content

With the rise of the metaverse concept, mixed reality continues to attract keen attention around the world, and academia and industry are constantly innovating in its key software and hardware technologies. For a long time, the field of mixed reality focused mainly on display technology, but as mixed reality applications have multiplied, it has become clear that the lack of mature interaction technologies and solutions is a bottleneck restricting the development of mixed reality. On platforms represented by AR/VR headsets, users still cannot fully dispense with handheld controllers to interact with the computer, which greatly limits the applications that can be built on these platforms. How to enable users to exchange information with computers through natural interaction modalities has become an urgent problem.
Human-computer interaction, as a key technology connecting the virtual and the real, will provide an indispensable channel for information exchange between humans and computers in the metaverse. In this unstructured information space where the virtual and the real merge, human-computer interaction faces complex interactive semantics and dynamic, changeable scenes. A natural, efficient, and reliable human-computer interface requires new technical support, and breakthroughs in the key technologies of human-computer interaction will play a significant role in promoting the development of related industries.
The challenges that mixed reality poses to human-computer interaction are comprehensive. In virtual reality (VR), the world that users see is an immersive space completely controlled by the computer; how people move within it, and how to handle the relationship between virtual space and physical space, are basic problems to be solved. In augmented reality (AR), virtual information is superimposed on physical reality, and how to manipulate that virtual information becomes a problem as well. In both VR and AR, how effectively users can express their interaction intentions directly affects the user experience. At the application level, how a virtual human, a technology now in wide use, expresses itself and conveys information has also become an important factor determining the interactivity of mixed reality systems.
To give readers an understanding of cutting-edge research on the above questions, this issue invites five articles, all from human-computer interaction research teams at domestic universities. In "Navigation in virtual and real environment using brain computer interface: a progress report", Professor Liu Yue from the School of Optoelectronics of Beijing Institute of Technology reviews the past 20 years of work on navigating virtual worlds with brain-computer interfaces and summarizes the research opportunities and challenges of asynchronous brain-computer control. Yu Chun's paper "Design and evaluation of window management operations in AR headset+smartphone interface", from Tsinghua University, introduces his team's multi-window management system for AR and highlights how a smartphone can assist efficient control. In "EasyGaze: Hybrid eye tracking approach for handheld mobile devices", Professor Cheng Shiwei from Zhejiang University of Technology proposes a mobile eye tracking method that combines appearance-based and feature-based models, in which feature vectors computed from eye images are mapped to gaze point coordinates, and studies the effects of eye image resolution, lighting conditions, and different feature vectors on tracking accuracy. "Designing generation Y interactions: The case of Yphone", by Liu Wei of the design program at Beijing Normal University, emphasizes human factors and brings new-generation interaction methods that are universally applicable in everyday life into work situations, providing evidence and a basis for somatosensory human-computer interaction in mixed reality. Finally, "Motivation effect of animated pedagogical agent's personality and feedback strategy types on learning in virtual training" by Bian Yulong of Shandong University focuses on the persona effect produced by virtual pedagogical agents in mixed/virtual reality teaching and proposes teaching strategies from the social-motivation dimension.
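To make the feature-to-gaze mapping mentioned above concrete, the following minimal sketch illustrates the feature-based half of such hybrid trackers: a second-order polynomial fitted by least squares that maps 2D eye-feature vectors to on-screen gaze point coordinates. This is a generic textbook-style illustration under our own assumptions, not the EasyGaze algorithm; all function names and the calibration data are hypothetical.

# Illustrative sketch of generic feature-based gaze calibration
# (not the EasyGaze method): a second-order polynomial maps a 2D
# eye feature (e.g., a pupil-centre offset) to gaze coordinates.
import numpy as np

def design_matrix(feats: np.ndarray) -> np.ndarray:
    """Expand 2D features (N, 2) into second-order polynomial terms (N, 6)."""
    x, y = feats[:, 0], feats[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_mapping(feats: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """Least-squares fit from eye features to known calibration targets (N, 2)."""
    A = design_matrix(feats)
    coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs  # shape (6, 2): one column of coefficients per screen axis

def predict_gaze(coeffs: np.ndarray, feats: np.ndarray) -> np.ndarray:
    """Map new eye features to estimated on-screen gaze coordinates."""
    return design_matrix(feats) @ coeffs

# Toy usage: a 9-point calibration grid with synthetic, slightly noisy features.
rng = np.random.default_rng(0)
targets = np.array([[x, y] for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
feats = targets * 0.4 + 0.05 * rng.standard_normal(targets.shape)  # fake eye features
coeffs = fit_gaze_mapping(feats, targets)
print(predict_gaze(coeffs, feats[:3]))  # estimated gaze for the first three points

In a real tracker the feature vector would be extracted from eye images (which is where image resolution and lighting conditions, studied in the paper, affect accuracy), and the calibration targets would be points the user fixates during a calibration routine.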
