
2020,  2 (5):   381 - 393   Published Date:2020-10-20

DOI: 10.1016/j.vrih.2020.07.009

Abstract

With a mindset of constant improvement in workplace efficiency, safety, and training in Singapore, there is a need to explore various technologies and their capabilities. The ability of Virtual Reality (VR) and Augmented Reality (AR) to create an immersive experience tying together the virtual and physical environments, coupled with their information-filtering capabilities, makes it possible to introduce these technologies into training processes and the workplace. This paper surveys current research trends, findings, and limitations of VR and AR and their effects on human performance, specifically in Singapore, and our experience at the National University of Singapore (NUS).


1 Introduction
Virtual reality (VR) and augmented reality (AR) have become popular technologies in recent years. These technologies enable humans to connect to the digital world. The ability to connect the digital world with the physical world allows one to introduce or alter information, thus widening the possibilities for creation and interaction. Artists can integrate their digital work with the physical space, gamers can bring the gaming world out of their computer screens into the physical environment, and additional information can be provided in physical workplaces with ease. Singapore is actively participating in this global innovation and research process to discover uses for these technologies.
VR was first invented by Morton Heilig, a cinematographer, in the 1950s to bring the movie experience closer to the audience. To date, there are numerous definitions of VR[1,2], yet all of them point towards a common notion of an electronically simulated environment, with the aid of instruments that allow interaction within this environment. There are three main types of display devices for VR: spatial projection, head-mounted displays (HMDs), and hand-held devices. The most common VR display is the HMD, due to its ability to keep users immersed in the environment. Immersion is a measure of how “real” the electronically simulated environment feels to users.
AR, a variation of VR, allows users to see the real world with virtual objects superimposed on it. In contrast, VR does not allow users to see the real world; the virtual environment is projected to them instead. This means that AR supplements the real environment rather than completely replacing it. Hence, AR can be defined as a middle ground between the completely virtual environment and the completely real environment[3,4]. The augmented objects should appear to coexist in the real environment, and users should be able to interact with these objects through calibrated and programmed instruments, similar to the virtual objects within a VR environment. These calibrated and programmed instruments fall under the category of human-computer interaction (HCI), consisting of the different means by which humans can manipulate virtual/augmented objects, ranging from a modern keyboard with manual buttons to graphical interfaces, speech recognition, etc.
This paper surveys the innovations and research on utilizing VR/AR to improve human performance. Human performance is a measure of the quality with which a task is completed, with three main aspects: speed, accuracy, and cognitive load[5]. An improvement in human performance would mean performing a task faster, more accurately, and with a reduced attentional demand. Nonetheless, not all three aspects will always improve at the same time.
2 VR/AR in Singapore
Current research on VR/AR and human performance around the world encompasses different areas of interest. These include enabling users to conduct a specific task, such as operating a virtual keyboard[6] with a change in the typical keyboard controls for an increase in text-entry capability, or simulating different scenarios to train spatial awareness and critical thinking in firefighters[7]. In healthcare and medicine, human performance research includes assisting and improving the psychomotor effectiveness of patients undergoing rehabilitation[8], and allowing novice surgeons and medical students to improve the hand-eye coordination needed to perform minimally invasive surgeries and interventional radiology procedures[9]. This list of applications of VR/AR for increasing human performance is far from exhaustive.
Singapore acknowledges the potential benefits of VR/AR for its users. The Singapore aviation industry is one of the most established users of VR: VR is applied in flight simulation schools to teach young aspiring pilots the operation of an aircraft[10-12]. The medical industry is another interested user[13], as will be explained in later sections. This paper surveys the current state of research in Singapore, particularly at NUS, on the effects of VR/AR on human performance, examines the current situation, and discusses future trends.
In the following, the databases used to search for publications of research performed in Singapore are Scopus, Engineering Village, and ScholarBank@NUS. From 1997‒2020, a total of 313 patents were awarded, and 280 journal articles and 207 conference papers were published regarding the use of VR/AR and its impact on human performance. Research on the effects of VR/AR on human performance in Singapore focuses on medical training, interventional therapy, and manufacturing. The key reason for this emphasis is that manufacturing processes and medical/healthcare procedures are heavily reliant on human operators. The volume of available information can sometimes overwhelm operators performing a critical task. In addition, a virtually created environment can provide an immersive experience for a safe and optimal transfer of knowledge in training procedures.
Sections 3‒5 review the state of research and application of VR medical simulation, AR in surgery, and VR/AR in manufacturing, respectively. The future trends of VR/AR research applications in Singapore are discussed in Section 6. Section 7 summarizes and concludes this paper.
3 VR medical simulation
Medical simulation is often used to educate and train medical students and junior healthcare professionals. It includes surgical simulation and procedural task simulation. These are especially important for training young surgeons, as trainees rarely get exposure to surgeries or procedures on a live patient. Minimally invasive surgery (MIS) has risen in popularity due to its numerous benefits, such as reduced blood loss, reduced tissue trauma, and improved recovery times. Compared to traditional open surgery, where surgeons identify anatomical structures and operate directly, MIS is an image-reliant procedure in which surgeons navigate through the patient with the assistance of a camera probe while performing the surgery. Surgeons need to establish the spatial relationship between the medical image information and the corresponding physical surgical site. This is not an easy task for most surgeons, and the room for error is very small in any surgical procedure due to the delicate nature of human anatomy.
Hand-eye coordination is one of the most important skills for MIS surgeons, in addition to critical thinking and acting and reacting to any change of situation in the operating room. Training for medical students or junior surgeons is traditionally performed on a computer with a normal keyboard and mouse, which is inadequate for transferring skills to trainee surgeons[14]. Therefore, VR has been introduced to provide a realistic and safe virtual environment for trainee surgeons[15]. MIS procedures such as polymethylmethacrylate (PMMA) injection[16], laparoscopy, and percutaneous coronary interventions[17-19] can be virtually simulated for training purposes. As the environment and the objects are virtual, they can be programmed and tailored to meet the needs of the trainee surgeons. This allows a wider range of scenarios to be simulated to prepare trainee surgeons for any situation. VR surgical training systems have been developed to further improve the quality of training through the use of haptic devices to control the virtual camera probe and the virtual surgical instruments. Through depth and coordinate calculations, the system handles the interactions among the virtual objects and provides tactile feedback to the trainees[20-22], making the whole experience more immersive and “real”. A typical VR medical simulation has two major parts: virtual objects (i.e., virtual organs) modelled from the medical images obtained during the pre-operative procedure, and trainees navigating through the virtual environment via mechanical controllers connected to the virtual instruments.
However, modelling of movement during VR simulation training should not be a one-directional approach. Various approaches have been explored to model virtual objects and their behavior when users interact with them. Consider, for example, a virtual gallbladder modelled for laparoscopy training. The gallbladder cannot be treated as a single entity when predicting its behavior during interaction; instead, both active and passive tensions should be considered, due to the stimuli and the muscle involved. Methods such as genetic algorithms and a multi-layered mass-spring model have been explored to model the biomechanical behavior of the gallbladder[23,24]. A particle-based solution has been proposed to simulate the interactions between blood flow and the vessel wall for virtual surgery[25]. The simulation of blood flow and vessel deformation is achieved by coupling smoothed particle hydrodynamics with a mass-spring model. Besides virtual organs, we need to consider the tools and the tool-tissue interactions. It is not easy to model a catheter, a tube-like instrument used in interventional radiology: the shape of the virtual counterpart of the catheter has to change depending on the control at the proximal end of the catheter. The finite element method has been used to model the navigation of the catheter within the human vascular system[17,19]. In addition, foreign fluids (e.g., PMMA, a liquid cement for bones) may be used during a medical procedure. The interaction of the virtual PMMA with the virtual patient and instrument requires complicated modelling, as the flow of this liquid cement depends on the trainee’s control of the virtual instrument[16]. Proper modelling of all virtual entities in the virtual environment increases the immersive experience for the trainees, or in other words the “real-ness” of the virtual simulation.
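To make the mass-spring idea concrete, the following is a minimal illustrative sketch (not the models of [23-25]) of a one-dimensional chain of point masses joined by springs, with a constant "tool" force pulling on the free end to mimic a virtual instrument stretching tissue. All parameter values are assumptions chosen for numerical stability.

```python
import numpy as np

def simulate_mass_spring(n=10, k=50.0, mass=0.01, damping=0.5,
                         dt=1e-3, steps=2000, tool_force=0.05):
    """Minimal 1D mass-spring chain integrated with semi-implicit Euler.

    The first node is fixed (anchored tissue); a constant tool force pulls
    on the last node, and the chain settles into a stretched equilibrium.
    Illustrative parameters only.
    """
    rest_len = 1.0 / (n - 1)
    pos = np.linspace(0.0, 1.0, n)       # node positions along the chain
    vel = np.zeros(n)
    for _ in range(steps):
        force = np.zeros(n)
        # Hooke's law between neighbouring nodes
        for i in range(n - 1):
            stretch = (pos[i + 1] - pos[i]) - rest_len
            force[i] += k * stretch
            force[i + 1] -= k * stretch
        force -= damping * vel           # simple velocity damping
        force[-1] += tool_force         # virtual instrument pulling the tip
        vel += dt * force / mass
        vel[0] = 0.0                     # first node anchored
        pos += dt * vel
        pos[0] = 0.0
    return pos

pos = simulate_mass_spring()             # tip settles slightly beyond 1.0
```

Real surgical simulators extend this pattern to 3D meshes with thousands of nodes, multiple spring layers for active/passive tension, and GPU acceleration to sustain haptic update rates.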
Software peripherals can also be included in VR/AR devices during the training process, such as an intelligent virtual trainer[26]. These virtual trainers provide information to guide the trainee or offer corrections when the trainee makes a mistake. Ho et al.[27] utilize this intelligent trainer concept, albeit in the assembly of hybrid medical devices, reporting an increase in the transfer of knowledge when VR is used as a training medium. This is due to the immersive experience and the filtering of relevant information to the trainee, coupled with VR/AR being a fresh technology in the classroom setting. These software peripherals are still at a preliminary stage of research, and work is still needed to fully integrate them into VR simulations.
Figure 1 shows an overview of the VR medical simulation process. There are many factors and components in VR medical simulation, all of which need to be well established for the experience to be successful. Current research shows improvement in the general dexterity of trainee surgeons, particularly in the hand-eye coordination required for complicated control of apparatus. Trainee surgeons are better equipped and are able to learn surgical procedures in a faster and safer way. However, virtual simulations still lack peripherals such as tactile feedback and the prediction of events after each action made by users. Other senses, such as smell and sound, are also yet to be included, which may reduce the “real-ness” of the virtual simulation and hinder the critical thinking training process.
4 AR for surgery
Taking these advantages and disadvantages into consideration, VR is often used for simulation and training, whereas AR is more appropriate for aiding medical processes, specifically surgical procedures. Spatial projection is currently the most popular form of display compared to other AR display devices such as the HMD, which may hinder the surgeon’s field of view. The load of wearing additional glasses while handling the surgical tools may also affect the performance of the surgeons.
The current state of MIS requires the surgeon to constantly look at a 2D computer screen while navigating the instruments within the patient. To mitigate this restriction, Kockro et al. proposed projecting the information gathered from the camera probe onto the patient with a spatial projection display, mimicking an instant “X-ray” vision of the patient[28]. This helps the surgeon achieve better 3D visualization and reduces the cognitive load needed to relate depth and direction of movement when navigating the instruments via the 2D computer screen. Though this system proved successful in mapping the patient with the camera probe, the real-time overlay might not be reliable due to the computational processing time and the limited capability of the camera in capturing tight and dark spaces within the patient.
Nowadays, medical images are typically acquired before the actual surgery. The acquired images can be visualized in 3D, calibrated, and adjusted during the pre-operative process. Computer Aided Surgery (CAS) is now a broad term referring to all the clinical and engineering methods in which computers are directly applied to surgery, including interventional radiology. These methods include navigation, medical robotics, and virtual and augmented reality. Yang et al. from NUS utilized these methods in radiofrequency (RF) ablation of tumors[29]. The medical images obtained pre-operatively can be overlaid onto the patient to help the surgeon position the RF needles accurately. The overlaid images greatly improve the accuracy of the surgical process, reducing complications and accidents[30,31]. In addition, hand gestures have been introduced[32] as an HCI to control the apparatus when surgeons are unable to touch and handle additional mechanical instruments in the operating room. Figure 2 illustrates an example of the set-up of the AR spatial display in the operating room, along with an example overlay of the needle position. The use of AR in surgical procedures has garnered success in improving the accuracy and speed of MIS and is a step towards introducing novel tools into the operating room.
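The core geometric step behind such overlays is mapping a point from the pre-operative image frame, through a patient registration transform and the display device's calibration, into display pixels. The sketch below illustrates this chain with homogeneous coordinates and a pinhole intrinsic matrix; the function name and the specific matrix values are illustrative assumptions, not the calibration pipeline of [29].

```python
import numpy as np

def project_point(p_image_space, T_patient_from_image, T_cam_from_patient, K):
    """Map a 3D point from pre-operative image space into display pixels.

    p_image_space: 3-vector in the medical image's coordinate frame.
    T_patient_from_image: 4x4 registration transform (image -> patient).
    T_cam_from_patient: 4x4 extrinsic transform (patient -> camera/projector).
    K: 3x3 pinhole intrinsic matrix of the display device.
    """
    p = np.append(p_image_space, 1.0)                 # homogeneous coordinates
    p_cam = T_cam_from_patient @ T_patient_from_image @ p
    uvw = K @ p_cam[:3]
    return uvw[:2] / uvw[2]                           # perspective divide -> pixels

# Example with identity transforms: a point on the optical axis at depth 1
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
pixel = project_point(np.array([0.0, 0.0, 1.0]), np.eye(4), np.eye(4), K)
# pixel -> (320, 240), the principal point
```

The quality of the overlay hinges on how accurately `T_patient_from_image` (registration) and `T_cam_from_patient` (calibration) are estimated, which is why strict calibration is repeatedly emphasized for spatial projection displays.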
However, most research methods requiring the use of a spatial projection display have disadvantages, such as occlusion of the projection, sensitivity to lighting conditions, and the need for strict calibration of the positions of the camera and projector. A tablet-based AR display is the next best alternative and is able to provide additional information to the surgeon at critical moments[33]. With some customization, the tablet can be mounted at the surgeon’s convenience without any disruption to the procedure. The cued information and reduction in cognitive load allow the surgeon to concentrate on more important tasks. In general, both forms of display can be set up within a reasonable time without delaying the actual surgical procedure, and the setup does not encroach on the space in the operating room.
5 VR/AR in manufacturing
The manufacturing sector has been a key contributor to Singapore’s economic growth over the last several decades. With growing competition across business sectors, the industry constantly seeks ways to improve efficiency. In recent years, there has been increasing emphasis on safety in Singapore, and it is now an important factor in the workplace. Since the industry in Singapore is still heavily operator-centered, the introduction of VR/AR into the workplace aims to increase efficiency and improve operator safety.
Computer Aided Design (CAD) is key to the manufacturing process for designing parts and components. Pang et al. from another NUS team[34] introduced the idea of utilizing AR in this process to help designers view CAD parts outside their computer screens, in the physical world. This provides designers with a better 3D perspective and increases their geometric visualization of the CAD components. The improved 3D visualization reduces the cognitive load on users and increases the efficiency of the design process. With the progression of object tracking for VR/AR, designers will be able to use bare-hand gestures to manipulate the virtual objects[35]. This form of HCI allows designers to move and resize the components, improving interaction and experience during the design process.
At the operational level, paper- and computer-based tasks can be replaced with AR tasks, which can enhance the interaction and experience of the operator performing the task. The information needed by the operator can be augmented and filtered onto the physical workspace using a graphical user interface (GUI), and the operator can manipulate this information with basic bare-hand gestures[36]. The ability to filter the information needed by the operator in real time reduces the cognitive load required while working on the given task. With the evolution of smart machines, the GUI can also be used to control these machines, making the whole process more intuitive and effective. AR can also be utilized similarly to the design process, whereby operators simulate an assembly process by manipulating virtual objects within the physical environment[37]. The virtual simulation allows the operator to better understand the given task and creates room for trial and error. This results in increased speed and accuracy during the actual assembly process and reduces the occurrence of mishaps.
The experience of controlling robots remotely, ranging from life-size robots to robot arms, can be enhanced with VR. By attaching a camera to the head of the robot and equipping the operator with an HMD, the operator can control the robot while viewing the robot’s environment[38]. This increases the operator’s situational awareness. With the addition of bare-hand gesture control of the remote robot, the interaction becomes more intuitive to the operator and reduces the need for complicated hardware, such as joysticks specifically designed for that situation.
Introducing VR/AR into manufacturing processes gives users better 3D geometric visualization, intuitive control of machine operations, and opportunities for simulation and practice. These increase users’ human performance in terms of execution speed, accuracy, and cognitive load. However, the experience of these implementations is limited by the lack of haptic feedback when interacting with virtual objects. Although there are techniques to calculate force feedback and display the numerical information to users, physical tactile feedback is still not well developed or feasible. For example, tactile gloves are costly and complicated, with many restrictions in certain situations.
6 Discussion
As shown in the previous sections, VR/AR shows promise in enabling more efficient training procedures and operational processes. This motivates research in these areas to establish the effectiveness of incorporating VR/AR into such situations.
Application of VR/AR in the manufacturing workplace is relatively new, despite strong research interest in academia. This is largely due to the high cost of operating VR/AR technologies: the projection devices and other hardware can be expensive, and software licenses are also not cheap to acquire and maintain. In addition, current VR/AR hardware and software often turn out to be not user friendly. All of this discourages businesses and corporations from embracing VR/AR for commercial use. This impact is prominent in Singapore, where a shortage of local workers has led the manufacturing workforce to be dominated by foreign workers with lower labor costs. As the emphasis on workplace safety grows, a trend of using VR/AR for safety training is rising. Through the immersive experience of VR/AR, workers can experience the effects of dangerous actions and their consequences by simulating careless or mistaken operations. A sense of awareness and responsibility for adhering to safety practices can be developed in trainees in workplaces such as construction sites[39].
Virtual simulation of medical procedures helps create various virtual situations to boost the learning process, especially for trainee surgeons when live surgery is not easily accessible. However, the learning outcomes of simulated training largely benefit only trainees and junior surgeons. The current state of virtual simulation focuses on hand-eye coordination through mechanical controllers. Such virtual simulation may still lack interactivity with the virtual objects. This is due to the difficulty of modelling and predicting the interactions between virtual instruments and virtual organs, as the human body is a complicated system that cannot be treated as a single entity when modelled. Many biological factors must be taken into consideration when modelling the biomechanics. Nonetheless, there is growing investigation of algorithmic techniques, such as machine learning, to aid the modelling and interaction process. Once the virtual objects can be modelled systematically, medical simulation will offer a more immersive experience to trainees, enabling better learning of critical thinking through the different situations simulated in the VR surgical room.
Introducing AR into the operating room will benefit surgeons by superimposing value-added information during surgery. However, research and application in this field are still at an early stage, and many factors affect its growth. For instance, projection latency, miscalibration, or display errors may occur that can potentially distort the surgeon’s view, leading to mistakes in critical decision making. The human eye is sensitive to visual cues, and if the additional AR display is not properly superimposed, it will affect surgeons during operations[40].
The integration of different disciplines in Industry 4.0 will bring areas such as Artificial Intelligence and Cyber-Physical Systems (CPS) to VR and AR. This will be especially beneficial for introducing VR and AR into medical simulation and operating rooms, respectively. One example is the use of a cognitive engine. A cognitive engine consists of a knowledge base, a reasoner, and a machine learning component. It is meant to augment the human operator in planning, analysis, and decision making. The knowledge base may be collated from medical books, medical records, research papers, and the expert knowledge of medical practitioners. During pre-operation, the cognitive engine can suggest a sequence of surgical procedures based on the tools available in the operating theater[41]. The cognitive engine is also capable of streamlining possible surgical actions during the procedure based on the instantaneous states of the patient, surgeons, and surgical tools. Multiple knowledge bases may be accessed to find references for managing rare surgical cases. With VR/AR technologies, the possible outcomes can be simulated, and the suggested actions or potential risks can be projected to the surgeon[42].
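A toy sketch of the "suggest procedures based on available tools" idea can clarify the knowledge-base component. The procedure names, steps, and tool names below are entirely hypothetical illustrations, not content from the cited systems[41,42]; a real cognitive engine would also involve a reasoner and learned models.

```python
# Hypothetical knowledge base mapping a procedure to ordered steps and the
# tools each step requires. All names here are illustrative assumptions.
KNOWLEDGE_BASE = {
    "rf_ablation": [
        {"step": "plan needle trajectory", "tools": {"navigation_system"}},
        {"step": "insert RF needle", "tools": {"rf_needle"}},
        {"step": "ablate tumor", "tools": {"rf_needle", "rf_generator"}},
        {"step": "verify ablation zone", "tools": {"ultrasound_probe"}},
    ],
}

def suggest_procedure(procedure, available_tools):
    """Return only the steps whose required tools are all available,
    mimicking how a cognitive engine might streamline a surgical plan."""
    plan = KNOWLEDGE_BASE.get(procedure, [])
    return [s["step"] for s in plan if s["tools"] <= set(available_tools)]

steps = suggest_procedure(
    "rf_ablation",
    {"navigation_system", "rf_needle", "rf_generator"},
)
# "verify ablation zone" is filtered out: no ultrasound probe available
```

In a CPS setting, the `available_tools` set would be populated from live sensor data in the operating theater, and the filtered plan would be projected to the surgeon through the AR display.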
Robots have been introduced into the operating room. One example is the RF ablation process, which can be enhanced by using a robot to execute the insertion of an RF needle, eliminating the natural shaky hand movements of the surgeon[30,31]. An intelligence component could also be integrated into robots to provide help during surgical procedures. A shared control system may be formed in which robots aid surgeons by indicating the possible consequences of actions or findings[43]. Nonetheless, intelligent robots will maintain their status as aids, not overriding surgeons in executing actions.
Both the manufacturing process and the operating room today increasingly involve highly complex devices. Together with recent advancements in embedded software and network connectivity, it is inevitable that both will in future leverage CPS technologies to facilitate dataflow among physical and cyber components[44]. This leads to higher levels of communication and coordination within the operating room[45,46] and the manufacturing process[47]. This is especially useful for surgical procedures, which benefit from the heavy flow of information. Such information-based architectures promote a more systematic approach to surgical procedures[48] and have been studied for their effectiveness in improving surgical outcomes[49,50]. In this respect, VR and AR technologies are uniquely poised to interface surgeons with CPS, presenting data or augmenting their capabilities during surgical procedures[33] and the manufacturing process. Figure 3 shows the information flow within a CPS and where VR and AR can be integrated to facilitate delivery of information to users in production workcells. We are experimenting with different VR and AR methods in CPS-based workcells.
In light of the Covid-19 pandemic, an increasing number of people are required to work from home. This has resulted in a shift of operations online. Awareness of VR/AR technology has risen as companies seek better solutions for their operations and training programs. Businesses are moving online, where shop spaces and inventory can be viewed in a virtual replica[51-53]. Customers can visit the virtual shops anytime and from anywhere to shop for products of interest.
The increased awareness of VR/AR among businesses and corporations in Singapore will, in return, lead to further development of VR/AR research. Hopefully, the human performance improvements made possible by VR/AR technology can help the industry in terms of both interest and profit.
7 Conclusion
In summary, VR/AR has been shown to improve users’ human performance, including enhanced hand-eye coordination, increased 3D visualization skills, better knowledge transfer, and lower memory requirements. These benefits allow users to perform tasks faster and more accurately with reduced cognitive load, resulting in better human performance thanks to the immersive experience that VR/AR provides. The virtual environment allows different scenarios and options to be recreated to suit users’ needs. It can also keep users in a safe environment. A virtual environment is also space saving, which is essential for small countries such as Singapore. Research has shown many positive findings using AR/VR, with most trials and experiments performed in a laboratory setting. While the recent trend of more industrial applications being developed is encouraging, it will take some time for AR/VR to be widely adopted in business. Though qualifying and quantifying the increase in human performance remains controversial, research such as measuring brain activity using electroencephalography is promising and growing[54].
This paper has surveyed the state of the art, current research trends, findings, and limitations of VR/AR and its effect on human performance in Singapore, from the perspective of NUS. Some applications of VR/AR in Singapore have been presented. The research and applications listed here are not exhaustive; other examples include the use of audio within the VR/AR experience to increase immersion[55-57] and the use of VR/AR for psychological and physical therapy[58-61]. Educators at the Alice Lee Centre for Nursing Studies of NUS use virtual patients to train nursing students and hone their communication skills before their clinical postings[62]. Researchers at the Keio-NUS CUTE Center have conducted many VR and AR research works, including an interesting interactive, multisensory VR game[63] with four sensory streams fused simultaneously to achieve realism. As Singapore becomes the first country in South East Asia to introduce 5G, which is often recognized as an accelerator for VR/AR, the technology may be widely adopted for many industrial applications in the years to come.

Reference

1.

Krueger M W. Artificial reality. Reading, Massachusetts: Addison-Wesley Publishing, 1991

2.

Steuer J. Defining virtual reality: dimensions determining telepresence. Journal of Communication, 1992, 42(4): 73–93 DOI:10.1111/j.1460-2466.1992.tb00812.x

3.

Milgram P, Takemura H, Utsumi A, Kishino F. Augmented reality: a class of displays on the reality-virtuality continuum. In: Proc SPIE 2351, Telemanipulator and Telepresence Technologies, 1995, 2351, 282–292 DOI:10.1117/12.197321

4.

Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, 1994, 77(12): 1321–1329

5.

Wickens C D, Hollands J G, Banbury S, Parasuraman R. Engineering psychology and human performance. New York, Psychology Press, 2015 DOI:10.4324/9781315665177

6.

Zhai S M, Hunter M, Smith B A. The metropolis keyboard―an exploration of quantitative techniques for virtual keyboard design. In: Proceedings of the 13th annual ACM symposium on User interface software and technology-UIST '00. SanDiego, California, USA, NewYork, Press ACM, 2000, 119–128 DOI:10.1145/354401.354424

7.

Bliss J P, Tidwell P D, Guest M A. The effectiveness of virtual reality for administering spatial navigation training to firefighters. Presence: Teleoperators & Virtual Environments, 1997, 6(1): 73–86 DOI:10.1162/pres.1997.6.1.73

8.

Lee J H, Ku J, Cho W, Hahn W Y, Kim I Y, Lee S M, Kang Y, Kim D Y, Yu T, Wiederhold B K, Wiederhold M D, Kim S I. A virtual reality system for the assessment and rehabilitation of the activities of daily living. CyberPsychology & Behavior, 2003, 6(4): 383–388 DOI:10.1089/109493103322278763

9.

Arora S, Sevdalis N, Aggarwal R, Sirimanna P, Darzi A, Kneebone R. Stress impairs psychomotor performance in novice laparoscopic surgeons. Surgical Endoscopy, 2010, 24(10): 2588–2593 DOI:10.1007/s00464-010-1013-2

10.

SYFC–Singapore Youth Flying Club. Available from: https://www.syfc.sg/

11.

Seletar Flying Club. Available from: http://www.seletar-flying-club.org/

12.

Learn to Fly in Singapore | Flight School Singapore. Available from: http://flightschool.sg/

13.

Technology's Role in Training Safer Doctors. Available from: http://nusmedicine.nus.edu.sg/newsletter/issue25/in-vivo/technology-s-role-in-training-safer-doctors

14.

Cai Y, Chui C, Ye X Z, Wang Y P, Anderson J H. VR simulated training for less invasive vascular intervention. Computers & Graphics, 2003, 27(2): 215–221 DOI:10.1016/s0097-8493(02)00278-9

15.

Anderson J H, Chui C, Cai Y, Wang Y, Li Z, Ma X, Nowinski W L, Solaiyappan M, Murphy K J, Gailloud P, Venbrux A C. Virtual reality training in interventional radiology: the Johns Hopkins and Kent Ridge Digital Laboratory experience. Seminars in Interventional Radiology, 2002, 19(2): 179–185 DOI:10.1055/s-2002-32796

16.

Lian Z, Chui C K, Teoh S H. A biomechanical model for real-time simulation of PMMA injection with haptics. Computers in Biology and Medicine, 2008, 38(3): 304–312 DOI:10.1016/j.compbiomed.2007.10.009

17.

Wang Y P, Chui C, Lim H, Cai Y Y, Mak K. Real-time interactive simulator for percutaneous coronary revascularization procedures. Computer Aided Surgery, 1998, 3(5): 211–227 DOI:10.3109/10929089809149843

18.

Chiang P, Zheng J M, Yu Y, Mak K H, Chui C K, Cai Y. A VR simulator for intracardiac intervention. IEEE Computer Graphics and Applications, 2013, 33(1): 44–57 DOI:10.1109/mcg.2012.47

19.

Chui C K, Nguyen H T, Wang Y P, Mullick R, Ragahavan R. Potential field of vascular anatomy for real-time computation of catheter navigation. Proceedings of the Visible Human Project Conference, 1996, 113–114

20.

Rasool S, Sourin A. Image-driven virtual simulation of arthroscopy. The Visual Computer, 2013, 29(5): 333–344 DOI:10.1007/s00371-012-0736-6

21.

Rasool S, Sourin A, Xia P J, Weng B, Kagda F. Towards hand-eye coordination training in virtual knee arthroscopy. In: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology-VRST '13. Singapore. New York, ACM Press, 2013, 17–26 DOI:10.1145/2503713.2503715

22.

Chui C K, Ong J S K, Lian Z Y, Wang Z L, Teo J, Zhang J, Yan C H, Ong S H, Wang S C, Wong H K, Teo C L, Teoh S H. Haptics in computer-mediated simulation: Training in vertebroplasty surgery. Simulation & Gaming, 2006, 37(4): 438–451 DOI:10.1177/1046878106291667

23.

Zhang J, Zhou J, Huang W, Qin J, Yang T, Liu J, Su Y, Chui C K, Chang S. GPU-friendly gallbladder modeling in laparoscopic cholecystectomy surgical training system. Computers & Electrical Engineering, 2013, 39(1): 122–129 DOI:10.1016/j.compeleceng.2012.05.012

24.

Xiong L F, Chui C K, Teo C L. Reality based modeling and simulation of gallbladder shape deformation using variational methods. International Journal of Computer Assisted Radiology and Surgery, 2013, 8(5): 857–865 DOI:10.1007/s11548-013-0821-y

25.

Qin J, Pang W M, Nguyen B P, Ni D, Chui C K. Particle-based simulation of blood flow and vessel wall interactions in virtual surgery. In: Proceedings of the 2010 Symposium on Information and Communication Technology-SoICT '10. Hanoi, Vietnam. New York, ACM Press, 2010, 128–133 DOI:10.1145/1852611.1852636

26.

Muller-Wittig W, Bockholt U, Arcos J L L, Vossl G. Enhanced training environment for minimally invasive surgery. In: Proceedings Tenth IEEE International Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises. WET ICE 2001. Cambridge, MA, USA, IEEE, 2001, 269–272 DOI:10.1109/enabl.2001.953426

27.

Ho N, Wong P M, Chua M, Chui C K. Virtual reality training for assembly of hybrid medical devices. Multimedia Tools and Applications, 2018, 77(23): 30651–30682 DOI:10.1007/s11042-018-6216-x

28.

Kockro R A, Tsai Y T, Ng I, Hwang P, Zhu C G, Agusanto K, Hong L X, Serra L. Dex-Ray: augmented reality neurosurgical navigation with a handheld video probe. Neurosurgery, 2009, 65(4): 795–808 DOI:10.1227/01.neu.0000349918.36700.1c

29.

Yang L, Chui C, Chang S. Design and development of an augmented reality robotic system for large tumor ablation. International Journal of Virtual Reality, 2009, 8(1): 27–35 DOI:10.20870/ijvr.2009.8.1.2710

30.

Wen R, Chui C K, Ong S H, Lim K B, Chang S K Y. Projection-based visual guidance for robot-aided RF needle insertion. International Journal of Computer Assisted Radiology and Surgery, 2013, 8(6): 1015–1025 DOI:10.1007/s11548-013-0897-4

31.

Yang T, Yang L, Liu J, Chui C K, Huang W, Zhang J, Zhou J, Lee B H, Tan N M, Wong W K D, Yin F, Chang K Y, Su Y. Robotic device for use in image-guided robot assisted surgical training. Patent, US8764448B2, 2014-07-01

32.

Wen R, Tay W L, Nguyen B P, Chng C B, Chui C K. Hand gesture guided robot-assisted surgery based on a direct augmented reality interface. Computer Methods and Programs in Biomedicine, 2014, 116(2): 68–80 DOI:10.1016/j.cmpb.2013.12.018

33.

Wen R, Chng C, Chui C. Augmented reality guidance with multimodality imaging data and depth-perceived interaction for robot-assisted surgery. Robotics, 2017, 6(2): 13 DOI:10.3390/robotics6020013

34.

Pang Y, Nee A Y C, Khim Ong S, Yuan M L, Youcef-Toumi K. Assembly feature design in an augmented reality environment. Assembly Automation, 2006, 26(1): 34–43 DOI:10.1108/01445150610645648

35.

Wang Z, Ng L X, Ong S K, Nee A Y C. Assembly planning and evaluation in an augmented reality environment. International Journal of Production Research, 2013, 51: 7388–7404 DOI:10.1080/00207543.2013.837986

36.

Yew A W W, Ong S K, Nee A Y C. Towards a griddable distributed manufacturing system with augmented reality interfaces. Robotics and Computer-Integrated Manufacturing, 2016, 39: 43–55 DOI:10.1016/j.rcim.2015.12.002

37.

Wang X, Ong S K, Nee A Y C. Real-virtual components interaction for assembly simulation and planning. Robotics and Computer-Integrated Manufacturing, 2016, 41: 102–114 DOI:10.1016/j.rcim.2016.03.005

38.

Yew A W W, Ong S K, Nee A Y C. Immersive augmented reality environment for the teleoperation of maintenance robots. Procedia CIRP, 2017, 61: 305–310 DOI:10.1016/j.procir.2016.11.183

39.

JTC Safety Induction Course. SCAL Academy Pte Ltd. Available from: https://scal-academy.com.sg/courses/jtc-safety-induction-course

40.

Tang S L, Kwoh C K, Teo M Y, Sing N W, Ling K V. Augmented reality systems for medical applications. IEEE Engineering in Medicine and Biology Magazine, 1998, 17(3): 49–58 DOI:10.1109/51.677169

41.

Tan X, Chng C B, Duan B, Ho Y, Wen R, Chen X, Lim K B, Chui C K. Cognitive engine for robot-assisted radio-frequency ablation system. Acta Polytechnica Hungarica, 2017, 14(1): 129–145 DOI:10.12700/APH.14.1.2017.1.9

42.

Tan X. Cognitive engine and deep reinforcement learning for robot-assisted surgery. Dissertation for the Doctoral Degree, Singapore, National University of Singapore, ScholarBank@NUS Repository, 2019

43.

Chng C B. Spherical mechanism design and application for robot-assisted surgery. Dissertation for the Doctoral Degree, Singapore, National University of Singapore, ScholarBank@NUS Repository, 2019

44.

Liu X F, Shahriar M R, Al Sunny S M N, Leu M C, Hu L W. Cyber-physical manufacturing cloud: Architecture, virtualization, communication, and testbed. Journal of Manufacturing Systems, 2017, 43: 352–364 DOI:10.1016/j.jmsy.2017.04.004

45.

Joerger G, Rambourg J, Gaspard-Boulinc H, Conversy S, Bass B L, Dunkin B J, Garbey M. A cyber-physical system to improve the management of a large suite of operating rooms. ACM Transactions on Cyber-Physical Systems, 2018, 2(4): 34 DOI:10.1145/3140234

46.

Li Y T, Jacob M, Akingba G, Wachs J P. A cyber-physical management system for delivering and monitoring surgical instruments in the OR. Surgical Innovation, 2013, 20(4): 377–384 DOI:10.1177/1553350612459109

47.

Wang L H, Törngren M, Onori M. Current status and advancement of cyber-physical systems in manufacturing. Journal of Manufacturing Systems, 2015, 37: 517–527 DOI:10.1016/j.jmsy.2015.04.008

48.

Okamoto J, Masamune K, Iseki H, Muragaki Y. Development concepts of a Smart Cyber Operating Theater (SCOT) using ORiN technology. Biomedizinische Technik/Biomedical Engineering, 2018, 63(1): 31–37 DOI:10.1515/bmt-2017-0006

49.

Chng C B, Wong P M, Ho N, Tan X Y, Chui C K. Towards a cyber-physical systems based operating room of the future. In: OR 2.0 Context-Aware Operating Theaters and Machine Learning in Clinical Neuroimaging. Cham: Springer International Publishing, 2019: 47–55 DOI:10.1007/978-3-030-32695-1_6

50.

Chng C B, Chia D W T, Cao Y, Yo K, Fujie M G, Chui C K. A cyber-physical system approach to immobilization of patient on radiation treatment. In: 2019 IEEE 23rd International Conference on Intelligent Engineering Systems (INES). Gödöllő, Hungary, IEEE, 2019, 153–158 DOI:10.1109/ines46365.2019.9109457

51.

Group-Based Upgrading Projects-SME Centre@SCCCI. Available from: https://smecentre-sccci.sg/group-based-upgrading-projects

52.

Silversea Media Group | Immersive Media Company. Available from: https://www.silversea-media.com/

53.

Lai L. Immersive tech opens new world of opportunities for businesses. SGSME.SG, 2020. Available from: https://www.sgsme.sg/news/immersive-tech-opens-new-world-opportunities-businesses

54.

Siong L C. Training and assessment of hand-eye coordination with electroencephalography. Dissertation for the Doctoral Degree, Singapore, National University of Singapore, ScholarBank@NUS Repository, 2015

55.

Zhou Z Y, Cheok A D, Qiu Y, Yang X. The role of 3-D sound in human reaction and performance in augmented reality environments. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 2007, 37(2): 262–272 DOI:10.1109/tsmca.2006.886376

56.

Ranjan R, Gan W S. Natural listening over headphones in augmented reality using adaptive filtering techniques. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2015, 23(11): 1988–2002 DOI:10.1109/taslp.2015.2460459

57.

Hong J Y, He J, Lam B, Gupta R, Gan W. Spatial audio for soundscape design: recording and reproduction. Applied Sciences, 2017, 7(6): 627 DOI:10.3390/app7060627

58.

Zhao M, Ong S K, Nee A Y C. An augmented reality-assisted therapeutic healthcare exercise system based on bare-hand interaction. International Journal of Human-Computer Interaction, 2016, 32(9): 708–721 DOI:10.1080/10447318.2016.1191263

59.

Shen Y, Gu P W, Ong S K, Nee A Y C. A novel approach in rehabilitation of hand-eye coordination and finger dexterity. Virtual Reality, 2012, 16(2): 161–171 DOI:10.1007/s10055-011-0194-x

60.

Shah L B I, Torres S, Kannusamy P, Chng C M L, He H G, Klainin-Yobas P. Efficacy of the virtual reality-based stress management program on stress-related variables in people with mood disorders: the feasibility study. Archives of Psychiatric Nursing, 2015, 29(1): 6–13 DOI:10.1016/j.apnu.2014.09.003

61.

Chua S H, Zhang H M, Hammad M, Zhao S, Goyal S, Singh K. ColorBless: augmenting visual information for colorblind people with binocular luster effect. ACM Transactions on Computer-Human Interaction, 2015, 21(6): 32 DOI:10.1145/2687923

62.

NUS team creates interactive, multisensory VR game. Available from: https://news.nus.edu.sg/research/nus-team-creates-interactive-multisensory-vr-game

63.

Shorey S, Ang E, Yap J, Ng E D, Lau S T, Chui C K. A virtual counseling application using artificial intelligence for communication skills training in nursing education: development study. Journal of Medical Internet Research, 2019, 21(10): e14658 DOI:10.2196/14658