
Articles

    Editorial

  • Intelligent interaction in mixed reality
    Yuanchun SHI, Chun YU

    DOI:10.3724/SP.J.2096-5796.2022.04.02

    2022, 4(2) : 1-2

  • Review

  • Navigation in virtual and real environment using brain computer interface: a progress report
    Haochen HU, Yue LIU, Kang YUE, Yongtian WANG

    DOI:10.1016/j.vrih.2021.10.002

    2022, 4(2) : 89-114

    A brain-computer interface (BCI) facilitates bypassing the peripheral nervous system and directly communicating with surrounding devices. Navigation technology using BCI has developed from exploring prototype paradigms in the virtual environment (VE) to accurately carrying out the operator's locomotion intention with a powered wheelchair or mobile robot in a real environment. This paper summarizes BCI navigation applications used in both real environments and VEs over the past 20 years. Horizontal comparisons were conducted between the various paradigms applied to BCI and their unique signal-processing methods. Owing to the shift in control mode from synchronous to asynchronous, the development trend of navigation applications in the VE was also reviewed. The contrast between high-level and low-level commands is introduced as the main line to review the two major applications of BCI navigation in real environments: mobile robots and unmanned aerial vehicles (UAVs). Finally, applications of BCI navigation to scenarios outside the laboratory; research challenges, including human factors in navigation interaction design; and the feasibility of hybrid BCI for navigation are discussed in detail.
  • Article

  • Design and evaluation of window management operations in AR headset+smartphone interface
    Jie REN, Chun YU, Yueting WENG, Chengchi ZHOU, Yuanchun SHI

    DOI:10.1016/j.vrih.2021.12.002

    2022, 4(2) : 115-131

    Background
    The combination of an augmented reality (AR) headset and a smartphone can simultaneously provide a wider display and precise touch input; it can redefine the way we use applications today. However, users are currently deprived of these benefits because the two devices operate independently, with no intuitive and direct interactions between them.
    Methods
    In this study, we conduct a formative investigation to understand the window management requirements and interaction preferences of using an AR headset and a smartphone simultaneously and report the insights we gained. In addition, we introduce an example vocabulary of window management operations in the AR headset and smartphone interface.
    Results
    The proposed vocabulary allows users to manipulate windows in a virtual space and shift windows between devices efficiently and seamlessly.
  • Designing generation Y interactions: The case of YPhone
    Wei LIU

    DOI:10.1016/j.vrih.2021.12.005

    2022, 4(2) : 132-152

    Background
    With an increasing number of products becoming digital, mobile, and networked, paying attention to the quality of interactions with such products is also becoming more relevant. Although the quality of such interactions has been addressed in several scientific studies, little attention has been paid to their implementation in real life and everyday contexts.
    Methods
    This paper describes the development of a novel office phone prototype, called YPhone, which demonstrates the application of a specific set of Generation Y interaction qualities (instantaneous, playful, collaborative, expressive, responsive, and flexible) in the context of office work. The working prototype supports office workers in experiencing new types of interactions. It was set out in practice through a series of evaluations.
    Results
    We found that the playful, expressive, responsive, and flexible qualities incur greater trust than the instantaneous and collaborative qualities. Such qualities can be grouped, although the grouping may differ across evaluated products, so researchers must be cautious about generalizations.
    Conclusions
    The overall evaluation was deemed positive, with some valuable suggestions provided regarding its user interactions and features.
  • Motivation effect of animated pedagogical agent's personality and feedback strategy types on learning in virtual training environment
    Yulong BIAN, Chao ZHOU

    DOI:10.1016/j.vrih.2021.11.001

    2022, 4(2) : 153-172

    Background
    The personality and feedback of an animated pedagogical agent (APA) are vital social-emotional features that render the agent perceptually believable. Their effects on learning during virtual training need to be examined.
    Methods
    In this paper, an explanation model is proposed to clarify the underlying mechanism of how these two features affect learners. Two studies were conducted to investigate the model. In Study 1, the effect of the APA's personality type and feedback strategy on flow experience and performance was reexamined, revealing significant effects of the feedback strategy on flow and performance and a marginally significant effect of the personality type on performance. To explore the mechanism behind these effects, a theoretical model is proposed by distinguishing between intrinsic and extrinsic motivation effects. In Study 2, the model was evaluated, and the APA's personality type was found to significantly influence the factors in the path of the extrinsic motivation effect rather than those in the path of the intrinsic motivation effect.
    Results
    In contrast, the feedback strategy affected factors in the path of the intrinsic motivation effect.
    Conclusions
    These results validated the proposed model. Further distinguishing the two motivation effects is necessary to understand the respective effects of an APA's personality and feedback features on learning experiences and outcomes.
  • EyeGaze: Hybrid eye tracking approach for handheld mobile devices
    Shiwei CHENG, Qiufeng PING, Jialing WANG, Yijian CHEN

    DOI:10.1016/j.vrih.2021.10.003

    2022, 4(2) : 173-188

    Background
    Eye-tracking technology for mobile devices has made significant progress. However, owing to limited computing capacity and the complexity of the usage context, conventional image feature-based techniques cannot extract features accurately, which degrades performance.
    Methods
    This study proposes a novel approach by combining appearance- and feature-based eye-tracking methods. Face and eye region detections were conducted to obtain features that were used as inputs to the appearance model to detect the feature points. The feature points were used to generate feature vectors, such as corner center-pupil center, by which the gaze fixation coordinates were calculated.
    Results
    To obtain the feature vector with the best performance, we compared different vectors under different image resolutions and illumination conditions. The results indicated that the best average gaze fixation accuracy, 1.93° of visual angle, was achieved when the image resolution was 96 × 48 pixels and the light sources illuminated the eye from the front.
    Conclusions
    Compared with current methods, our method improved gaze fixation accuracy and offered better usability.
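
The EyeGaze entry above reports that feature vectors such as the eye-corner-center-to-pupil-center vector are used to compute gaze fixation coordinates, but the abstract does not give the mapping itself. Purely as an illustration of how such a feature vector is commonly turned into screen coordinates, the sketch below fits a second-order polynomial regression from calibration samples; the function names, the polynomial form, and the calibration procedure are assumptions made for this sketch, not the authors' implementation, and the appearance-based detector that supplies corners and pupil centers is assumed to exist elsewhere.

```python
# Illustrative sketch only: map a corner-center-to-pupil-center feature vector
# to on-screen gaze coordinates via a calibration-fitted polynomial regression.
# All names are hypothetical; the appearance-based eye/feature-point detector
# described in the paper is not reproduced here.
import numpy as np


def feature_vector(inner_corner, outer_corner, pupil_center):
    """Vector from the midpoint of the two eye corners to the pupil center."""
    corner_center = (np.asarray(inner_corner) + np.asarray(outer_corner)) / 2.0
    return np.asarray(pupil_center) - corner_center


def poly_terms(v):
    """Second-order polynomial expansion of the 2D feature vector (x, y)."""
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])


def fit_gaze_mapping(feature_vectors, screen_points):
    """Least-squares fit from calibration samples (feature vector -> screen x, y)."""
    A = np.stack([poly_terms(v) for v in feature_vectors])  # (n_samples, 6)
    B = np.asarray(screen_points, dtype=float)               # (n_samples, 2)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)           # (6, 2)
    return coeffs


def predict_gaze(coeffs, v):
    """Predict the on-screen fixation point for one feature vector."""
    return poly_terms(v) @ coeffs                             # (x, y) in pixels
```

In such a scheme, a short calibration in which the user fixates known on-screen targets supplies the samples for fit_gaze_mapping, after which predict_gaze runs once per video frame.
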
  • A novel SSA-CCA framework for muscle artifact removal from ambulatory EEG
    Yuheng FENG, Qingze LIU, Aiping LIU, Ruobing QIAN, Xun CHEN

    DOI:10.1016/j.vrih.2022.01.001

    2022, 4(1) : 1-21

    Background
    Electroencephalography (EEG) has gained popularity in various types of biomedical applications as a signal source that can be easily acquired and conveniently analyzed. However, owing to a complex scalp electrical environment, EEG is often polluted by diverse artifacts, with electromyography artifacts being the most difficult to remove. In particular, for ambulatory EEG devices with a restricted number of channels, dealing with muscle artifacts is a challenge.
    Methods
    In this study, we propose a simple yet effective scheme that combines the singular spectrum analysis (SSA) and canonical correlation analysis (CCA) algorithms for the single-channel case and then extend it to the few-channel case by adding combining and dividing operations on the channels.
    Results
    We evaluated our proposed framework on both semi-simulated and real-life data and compared it with some state-of-the-art methods. The results demonstrate this novel framework's superior performance in both single-channel and few-channel cases.
    Conclusions
    This promising approach, based on its effectiveness and low time cost, is suitable for real-world biomedical signal processing applications.
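
The abstract above names the two building blocks, SSA and CCA, but not the exact pipeline. One common recipe for single-channel muscle-artifact removal that combines them is sketched below purely for orientation: SSA decomposes the channel into pseudo-multichannel components, CCA against a one-sample-delayed copy orders the recovered sources by autocorrelation, and the low-autocorrelation sources (characteristic of EMG) are zeroed before reconstruction. The window length, component count, and correlation threshold are arbitrary placeholders, and the few-channel extension described in the paper is not shown; this should not be read as the authors' exact SSA-CCA framework.

```python
# Illustrative sketch only, not the paper's exact SSA-CCA pipeline:
# SSA turns one EEG channel into pseudo-multichannel components, then a
# CCA against a one-sample-delayed copy sorts sources by autocorrelation;
# low-autocorrelation (EMG-like) sources are dropped before reconstruction.
import numpy as np


def ssa_components(x, window, n_comp):
    """Top SSA elementary components of a 1D signal, each of length len(x)."""
    n, k = len(x), len(x) - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for i in range(min(n_comp, len(s))):
        elem = s[i] * np.outer(u[:, i], vt[i])
        # Anti-diagonal averaging back to a length-n series.
        series = np.array([np.diag(elem[:, ::-1], off).mean()
                           for off in range(k - 1, -window, -1)])
        comps.append(series)
    return np.stack(comps)                                       # (n_comp, n)


def cca_sources(X, Y):
    """Canonical sources, X-side directions, and correlations for row-matched X, Y."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)

    def isqrt(C):  # inverse matrix square root of a covariance matrix
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T

    Kx, Ky = isqrt(Xc.T @ Xc), isqrt(Yc.T @ Yc)
    U, corr, _ = np.linalg.svd(Kx @ (Xc.T @ Yc) @ Ky)
    Wx = Kx @ U                                                  # X-side directions
    return Xc @ Wx, Wx, corr


def remove_muscle_artifact(x, window=30, n_comp=10, corr_threshold=0.85):
    """Clean one channel; window, component count, and threshold are placeholders."""
    x = np.asarray(x, dtype=float)
    comps = ssa_components(x, window, n_comp)
    residual = x - comps.sum(axis=0)          # energy outside the kept components
    X, Y = comps[:, 1:].T, comps[:, :-1].T    # components vs. one-sample-delayed copy
    S, Wx, corr = cca_sources(X, Y)
    S[:, corr < corr_threshold] = 0.0         # drop low-autocorrelation sources
    X_clean = S @ np.linalg.pinv(Wx) + X.mean(0)
    cleaned = np.empty(len(x))
    cleaned[1:] = X_clean.sum(axis=1)
    cleaned[0] = comps[:, 0].sum()            # first sample is lost to the delay
    return cleaned + residual
```
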
  • Multimodal collaborative BCI system based on the improved CSP feature extraction algorithm
    Cunbo LI, Ning LI, Yuan QIU, Yueheng PENG, Yifeng WANG, Lili DENG, Teng MA, Fali LI, Dezhong YAO, Peng XU

    DOI:10.1016/j.vrih.2022.01.002

    2022, 4(1) : 22-37

    Background
    As a novel approach for people to communicate directly with external devices, the study of brain-computer interfaces (BCIs) has matured considerably. However, just as individuals are expected to work in groups in real-world scenarios, BCI systems should also be able to support such group attributes.
    Methods
    We proposed a fourth-order cumulant feature extraction method (CUM4-CSP) based on the common spatial patterns (CSP) algorithm. Simulation experiments conducted using motion visual evoked potential (mVEP) EEG data verified the robustness of the proposed algorithm. In addition, to allow paradigms to be chosen freely, we adopted the mVEP and steady-state visual evoked potential (SSVEP) paradigms and designed a multimodal collaborative BCI system based on the proposed CUM4-CSP algorithm. The feasibility of the proposed multimodal collaborative framework was demonstrated with a multiplayer game control system that supports simultaneous coordinated and competitive control of external devices by two users. To verify the robustness of the proposed scheme, we recruited 30 subjects to conduct online game control experiments, and the results were statistically analyzed.
    Results
    The simulation results show that the proposed CUM4-CSP algorithm has good noise immunity. The online experimental results indicate that the subjects could reliably perform the competitive game control task using the selected BCI paradigms.
    Conclusions
    The proposed CUM4-CSP algorithm can effectively extract features from EEG data in a noisy environment. Additionally, the proposed scheme may provide a new solution for EEG-based group BCI research.
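
The entry above extends common spatial patterns (CSP) with fourth-order cumulant statistics, but the abstract does not spell out that construction. For orientation only, the following sketch shows the standard covariance-based CSP step that such a variant builds on: per-class trial covariances, a generalized eigendecomposition, and log-variance features from the most discriminative spatial filters. Replacing the covariance estimates with fourth-order cumulant matrices, which the paper proposes for noise robustness, is not reproduced here.

```python
# Illustrative sketch only: standard covariance-based CSP, i.e., the baseline
# that a fourth-order cumulant variant such as CUM4-CSP would extend.
import numpy as np
from scipy.linalg import eigh


def csp_filters(trials_a, trials_b, n_pairs=3):
    """Spatial filters separating class A from class B by variance.

    trials_a, trials_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2 * n_pairs, n_channels) filter matrix W.
    """
    def mean_cov(trials):
        covs = [tr @ tr.T for tr in trials]
        covs = [c / np.trace(c) for c in covs]        # trace-normalize each trial
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    vals, vecs = eigh(ca, ca + cb)                    # generalized eigenproblem
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]  # most discriminative ends
    return vecs[:, picks].T


def csp_features(W, trial):
    """Log-variance features of one spatially filtered trial (n_channels, n_samples)."""
    z = W @ trial
    var = z.var(axis=1)
    return np.log(var / var.sum())
```
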

Aims & Scope

Virtual Reality & Intelligent Hardware (VRIH) is an open access journal that aims to showcase and promote distinguished research in the field of virtual reality and intelligent hardware. It provides a global publishing and academic exchange platform for researchers, professionals and industry practitioners. The journal offers high-quality single-blind peer review and is published bimonthly in English.

Special Issues

  • Intelligent interaction in mixed reality

    Intro: With the rise of the concept of the metaverse, mixed reality continues to receive keen attention from...

    2022 Vol. 4 No. 2

  • Locomotion perception and redirection

    Intro: Locomotion is a fundamental interaction technique that allows free navigation in virtual scenes. A la...

    2021 Vol. 3 No. 6

  • Virtual reality and augmented reality in medical simulation

    Intro: Virtual reality/augmented reality (VR/AR) technologies have been widely used in medical fields, such ...

    2021 Vol. 3 No. 4

  • Hand and gesture

    Intro: Hands play an important role in our daily life. We use our hands for manipulation in working, emphasi...

    2021 Vol. 3 No. 3

  • Simulation and interaction of fluid and solid dynamics

    Intro: Fluid and solid simulation aims to generate realistic simulations of fluids and solids, in particular ...

    2021 Vol. 3 No. 2

  • Emotion recognition for human-computer interaction

    Intro: Emotion recognition aims to quantify, describe, and recognize different emotional states through the behav...

    2021 Vol. 3 No. 1