
2020,  2 (5):   394 - 408   Published Date:2020-10-20

DOI: 10.1016/j.vrih.2020.06.002

Abstract

In 1981, the Nanyang Technological Institute was established in Singapore to train engineers and accountants to keep pace with the country’s fast-growing economy. In 1991, the institute was upgraded to Nanyang Technological University (NTU). NTU has been ranked the world’s top young university for six consecutive years in the Quacquarelli Symonds (QS) World University Rankings. Virtual reality (VR) research began at NTU in the late 1990s. NTU’s colleges, schools, institutes, and centers have contributed toward the excellence of its VR research. This article briefly describes the VR research directions and activities at NTU.


1 Introduction
Located in west Singapore, Nanyang Technological University (NTU) occupies a 500-acre campus and is listed among the top 15 most beautiful university campuses in the world. In 2019, NTU[1] had 25088 students (including 6719 international students), 935 research-only staff, and 1582 academic staff (including 853 international staff); furthermore, 6443 undergraduate and 649 doctoral degrees were awarded, and 5868 new undergraduate and 550 new doctoral students enrolled. As an international university, NTU hosts a community of 100 nationalities. With more than 300 academic partners worldwide, international exchange students are ubiquitous on the green campus. NTU comprises four colleges, i.e., the College of Business, the College of Engineering, the College of Science, and the College of Humanities, Arts, and Social Sciences. In addition, NTU houses the Lee Kong Chian School of Medicine (jointly established with Imperial College London), the Graduate College, and five autonomous institutes, including the National Institute of Education and the S. Rajaratnam School of International Studies. NTU hosts a number of research centers of excellence, such as the Institute for Media Innovation, the Energy Research Institute @ NTU, the NTU Institute of Science and Technology for Humanity, the Nanyang Institute of Technology in Health and Medicine, the Singapore Centre for Environmental Life Sciences Engineering, the Nanyang Environment & Water Research Institute, and the Earth Observatory of Singapore. Furthermore, the university collaborates with major corporations such as Alibaba, Rolls-Royce, and Dyson to establish joint laboratories on campus that pursue research objectives relevant to society.
As a research-intensive university, NTU has contributed significantly to VR research, with a focus on fundamental topics including, but not limited to, geometric modeling, image-based three-dimensional (3D) reconstruction, and digital geometry processing. Furthermore, NTU has partnered with various corporations to develop VR applications in biomedical sciences, engineering, and education. In this paper, Section 2 presents NTU’s VR research and application development in biomedical sciences, Section 3 in engineering, and Section 4 in education. Section 5 focuses on NTU’s VR research and development in the humanities, social sciences, and heritage. Finally, Section 6 concludes this article.
2 VR research and application development in biomedical sciences
Over the years, NTU has developed a number of VR projects, in which fundamental research was conducted first and then applied to various applications. In this section, we discuss the VR research conducted in the Strategic Research Program of VR and Soft Computing (SRP VR & SC) and the School of Mechanical & Aerospace Engineering in collaboration with local and international corporations.
2.1 Research for VR-enhanced simulator for intracardiac intervention
The Strategic Research Program of VR and Soft Computing was established at NTU in the late 1990s to promote education, research, and industrial collaboration in VR and artificial intelligence (AI). Several VR projects pertaining to biomedical science have been developed through the program. In collaboration with Gleneagles Hospital in Singapore[2], a project was conducted to design and develop a VR-enhanced system for cardiac intervention using patient-specific tagged magnetic resonance imaging (MRI) data.
2.1.1   Fundamental research
The SRP VR & SC program focuses significantly on several areas, including geometric modeling, digital geometry processing, and image processing. Several algorithms have been developed for 3D progressive reconstruction[3], cardiac motion vector extraction from tagged MRI data[4], and geometry-based interaction modeling[5].
2.1.2   VR simulator for intracardiac intervention
Figure 1 shows the VR-enhanced simulation system for intracardiac intervention using tagged MRI data[6]. The simulator was designed to train medical students and junior surgeons in minimally invasive cardiac intervention. For example, the red-white and blue-white striped wires (Figure 1b) simulate catheters, which are inserted into the heart to locate the site for stem-cell injection. However, the injection site cannot be determined easily because the heart beats and its wall moves continuously; the aim is to find a non-slip contact point for the injection.
2.2 VR-enabled cellular image analysis
The School of Mechanical & Aerospace Engineering at NTU (Singapore) and the School of Medicine at the University of Toronto (Canada), particularly Opas’s Lab of Medicine and Pathology[7], have a long-term partnership in developing an innovative solution for the interactive visualization and quantification of cellular images.
2.2.1   Fundamental research
Apart from joint PhD student training, the two above-mentioned parties have collaborated to investigate both fundamental and application problems associated with cellular structures. Through their joint effort, several algorithms have been designed and developed for a better understanding of volumetric cellular structures, with focus on 3D border extraction and cell clustering from laser fluorescent microscopic confocal image stacks[8-10] and other cellular image processing techniques[11,12].
2.2.2   A VR-enhanced collaborative system
Built on top of the algorithms and techniques mentioned above, a VR-enabled system, CellStudio, was developed for the visualization and analysis of volumetric cellular image data from 3D confocal microscopy[13]. The system was established in NTU’s earlier VR efforts; its customized design was based on a CRT display, an interactive stylus, a pair of active shutter glasses, and an emitter, as shown in Figure 2.
2.3 Serious game for children with autism
NTU and the Institute of Mental Health[14] in Singapore collaborated to create a game-based learning application for children with autism spectrum disorder (ASD) to train their executive function skills, such as planning and sorting.
2.3.1   Autism cognitive rehabilitation program for executive functioning skills (A.C.R.E.S.)
As computer-assisted learning has proven effective for children with ASD[15], game-assisted learning could likewise be effective for training their executive functioning skills. An iPad application comprising three different games was developed for children aged nine to twelve years (primary 3 to primary 6) to train their planning and sorting skills. Each game presents a different location (home, supermarket, or school) and has ten levels that increase in difficulty as the player progresses. The home and supermarket games train planning skills, whereas the school game trains sorting skills. The supermarket game, in particular, is designed to improve route planning, as it requires players to collect the items on a shopping list over the shortest possible distance (Figure 3a)[16]. Furthermore, each game includes three parent-component levels so that parents can participate in their children’s development by carrying the tasks over into real life, for example by having children search for items on a parent-customized shopping list at a physical supermarket.
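The route-planning objective of the supermarket game can be sketched as finding the shortest pick-up order on a grid. This toy sketch is illustrative only: the item names, shelf positions, and Manhattan-distance metric are assumptions, not the game's actual layout.

```python
from itertools import permutations

def manhattan(a, b):
    # grid distance between two (row, col) positions
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def shortest_route(entrance, items):
    """Brute-force the shortest pick-up order for a small shopping list.

    `items` maps item names to (row, col) shelf positions; the route
    starts at the entrance and visits every item exactly once.
    """
    best_order, best_cost = None, float("inf")
    for order in permutations(items):
        cost, pos = 0, entrance
        for name in order:
            cost += manhattan(pos, items[name])
            pos = items[name]
        if cost < best_cost:
            best_order, best_cost = list(order), cost
    return best_order, best_cost

route, steps = shortest_route((0, 0), {"milk": (0, 3), "bread": (2, 3), "eggs": (2, 0)})
```

Brute force is fine at this scale because a shopping list of a handful of items has only a few dozen orderings; a real planner over many items would need a heuristic.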
2.3.2   In-app data processing
As shown in Figure 3a[16], while the players are playing the games, data are collected, pushed, and stored in real time using the Firebase Real-time Database, a cloud-hosted database. The data include the players’ performance (number of stars received), amount of time required for completion, and errors generated in each game level. This is to measure the players’ progress without explicitly imposing a test setting on them.
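The per-level telemetry described above can be illustrated with a minimal session record. The field names here are assumptions for illustration; the actual cloud push (one HTTPS POST to the Firebase Realtime Database REST endpoint) is left as a comment.

```python
import json

def make_session_record(level, stars, seconds, errors):
    """Assemble one game-level result in the shape the text describes:
    performance (stars), completion time, and errors for the level."""
    return {
        "level": level,
        "stars": stars,
        "completion_seconds": seconds,
        "errors": errors,
    }

record = make_session_record(level=3, stars=2, seconds=87.5, errors=["wrong_aisle"])
payload = json.dumps(record)

# In the app, each record is pushed to the cloud-hosted database, e.g.:
# requests.post("https://<project>.firebaseio.com/sessions.json", data=payload)
```

Keeping the record flat and JSON-serializable makes it easy to aggregate later into per-child progress reports without imposing an explicit test setting.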
3 VR research and application development in engineering
3.1 Heavy crane lifting
NTU and PEC Limited[17] in Singapore have collaborated to develop an innovative solution for heavy crane lifting under two research collaboration agreements.
3.1.1   Fundamental research
In this industrial collaboration project, heavy lifting in highly complex industrial environments was investigated. Safety and productivity are two main concerns in lifting tasks, particularly when large and heavy loads are involved. Traditionally, lifting is performed manually, which is error prone and time consuming. The research purpose was to develop an automatic and intelligent path planning system for highly complex environments to generate, in real-time or near real-time, safe and optimized lifting paths using AI and VR technology. Innovative solutions for path planning that use the advantages of GPU programming[18] have been developed[19-22].
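The flavor of such search-based lifting-path planning can be sketched with a toy genetic algorithm. This single-threaded, pure-Python sketch is not the published parallel GPU method: the 1D obstacle profile, penalty weight, and GA parameters are all illustrative assumptions.

```python
import random

random.seed(42)  # deterministic for reproducibility

OBSTACLES = [3.0, 5.0, 2.0, 4.0]  # required clearance height at each station
N = len(OBSTACLES)

def fitness(heights):
    """Lower is better: penalize total vertical travel plus any
    waypoint that dips below the obstacle underneath it."""
    travel = (abs(heights[0])
              + sum(abs(b - a) for a, b in zip(heights, heights[1:]))
              + abs(heights[-1]))
    violation = sum(max(0.0, OBSTACLES[i] - h) for i, h in enumerate(heights))
    return travel + 100.0 * violation

def evolve(pop_size=40, generations=200):
    pop = [[random.uniform(0, 8) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]           # one-point crossover
            i = random.randrange(N)
            child[i] += random.gauss(0, 0.5)    # Gaussian mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

In the published work the costly fitness evaluations (collision checks against the full 3D environment) are what the GPU parallelizes; the GA skeleton itself stays essentially this simple.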
3.1.2   VR crane simulator
A simulator (Figure 4a) was designed[23] for the vocational training of heavy crane operations. Tower cranes, mobile cranes, and crawler cranes were digitally modeled and rigged. Building information models (BIMs)[24], plant design management systems, or discretized environments captured by a laser scanner in the form of point clouds[25] were used as input to the lifting simulator[26]. Trainees can use interactive devices, such as a joystick and a steering wheel, to mimic the control of cranes. In addition to gaining interactive experience, they can immerse themselves in a virtual industrial environment through stereoscopic visualization. Safety warnings are triggered if the lifting load is about to collide with the environment. Moreover, the operation steps performed by trainees during a session can be recorded by the simulator for debriefing purposes. An optimal lifting path (Figure 4b) is used as a reference to evaluate the lifting operations during training with respect to safety and productivity.
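The collision warning described above can be sketched as a nearest-point clearance test against the scanned environment point cloud. Treating the load as a single point and the 1.5 m threshold are simplifying assumptions; a real simulator tests the load's full geometry against a spatially indexed cloud.

```python
import math

def clearance(load_pos, cloud):
    """Distance from the lifted load (a point, for simplicity) to the
    nearest scanned point of the environment (brute-force search)."""
    return min(math.dist(load_pos, p) for p in cloud)

def should_warn(load_pos, cloud, threshold=1.5):
    """Trigger the safety warning when the load comes closer to the
    environment than the allowed clearance."""
    return clearance(load_pos, cloud) < threshold

# three scanned points of a virtual structure (x, y, z in meters)
cloud = [(0.0, 0.0, 10.0), (5.0, 0.0, 10.0), (5.0, 5.0, 10.0)]
```

Checked every simulation frame against the load's predicted position, this is what lets the warning fire *before* contact rather than at it.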
3.2 Intelligent building imaging
The SJ-NTU Corporate Laboratory is a joint venture between NTU and Surbana Jurong Consultants Private Limited[27] in Singapore, with funding support from Singapore’s National Research Foundation. This project aims to develop intelligent solutions for the reconstruction of BIM representations from multimodal images (Figure 5).
3.2.1   Fundamental research
In this new industry project under the corporate laboratory, fundamental research includes innovation in artificial intelligence algorithms, point-cloud-based segmentation and classification, digital geometry processing, and 3D reconstruction. The research purpose is to develop a pipeline for automatic and intelligent recognition and reconstruction of BIMs from multimodal images.
3.2.2   Intelligent scanning and automatic BIM reconstruction
Multimodal images will be generated to capture environments using smart scanning techniques, including LiDAR and photogrammetry. Image fusion algorithms will be investigated; building components such as walls and windows, as well as mechanical and electrical pipes, will then be recognized automatically or semi-automatically to form their corresponding BIM representations by leveraging the latest AI techniques. VR technology is seamlessly integrated into the workflow for the interactive visualization and modification of the digital geometry and 3D BIM models.
3.3 Virtual Singapore
3.3.1   Fundamental research
In a project sponsored by the National Research Foundation, the School of Computer Science and Engineering[28] collaborated with GovTech[29] to investigate LiDAR-based geometry processing and machine learning for the virtualization and semantic enrichment of 3D city models. CityGML, used for all models in Virtual Singapore, is an open data model in an XML-based format for the storage and exchange of virtual 3D city models. It is an application schema for the Geography Markup Language version 3.1.1 (GML3), an extensible international standard for spatial data exchange[30] issued by the Open Geospatial Consortium and ISO TC211. CityGML was developed to establish a common definition of the basic entities, attributes, and relations of a 3D city model.
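To give a feel for the format, the sketch below parses a deliberately simplified, CityGML-style fragment. The tag names are abbreviated and the namespaces omitted for illustration; real CityGML documents declare the full OGC namespaces and carry GML geometry, not just attributes.

```python
import xml.etree.ElementTree as ET

# A schematic, simplified CityGML-style fragment for a single building.
# Real files use namespaced tags such as <bldg:Building> and
# <bldg:measuredHeight> plus GML solids for the geometry.
CITYGML_SNIPPET = """
<CityModel>
  <cityObjectMember>
    <Building id="B001">
      <measuredHeight uom="m">25.0</measuredHeight>
      <function>residential</function>
    </Building>
  </cityObjectMember>
</CityModel>
"""

root = ET.fromstring(CITYGML_SNIPPET)
building = root.find("./cityObjectMember/Building")
height = float(building.find("measuredHeight").text)
```

Because the model is plain XML, semantic enrichment amounts to attaching further attributes (usage, age, materials) to city objects that downstream applications can query uniformly.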
3.3.2   Applications of virtual Singapore
Virtual Singapore[31] is a large-scale city model of Singapore that enables public and private sectors to develop solutions for business and research applications. It is cost effective for the sustainable maintenance of 3D city models, allowing the reuse of the same data in different application fields. Owing to its rich and dynamic data environment, Virtual Singapore provides a collaborative platform for virtual experimentation and test bedding, research and development, simulation, planning, and decision-making with applications in construction, building and maintenance, infrastructure and resource management, urban planning, etc.
3.4 Augmented reality for industry 4.0
3.4.1   Four research focuses
Fraunhofer Singapore[32] was jointly founded by the Fraunhofer-Gesellschaft and Fraunhofer IGD in Europe together with NTU, with funding support from the National Research Foundation under Singapore’s RIE2020 initiative. Fraunhofer Singapore is hosted by NTU’s School of Computer Science and Engineering in partnership with Technische Universitaet Darmstadt (Germany) and Graz University of Technology (Austria). It focuses on four research areas: VR and augmented reality (AR), intuitive human-machine interaction, digital content generation, and 3D reconstruction.
3.4.2   Industry 4.0
AR/VR and mixed-reality technologies are investigated in particular to develop solutions for smart industrial engineering, maintenance, and industrial applications in Industry 4.0 and beyond. Fraunhofer Singapore and the School of Civil and Environmental Engineering[33] have collaborated on the psychophysiological evaluation of seafarers to improve training in a maritime virtual simulator[34].
4 VR research and application development in education
4.1 CAVE-based learning of virtual product design
4.1.1   CAVE installations in Singapore
The first CAVE was invented by Cruz-Neira et al. in the Electronic Visualization Laboratory at the University of Illinois, Chicago[35]. Two CAVEs were installed in Singapore’s Science Center[36] and the Institute of High Performance Computing (IHPC)[37] in the late 1990s and early 2000s.
4.1.2   Visualization and VR for product design
In the early 2000s, a course on visualization and VR for product design was offered to Year 4 students by the School of Mechanical and Aerospace Engineering (course code: M498), with the objective of teaching product design using visualization and VR technology[38]. The IHPC enabled M498 students to perform CAVE walkthroughs (Figure 6a) of a virtual hostel that they designed in the course (Figure 6b).
4.2 Virtual pink dolphins for special needs education
Established by former NTU President, Professor Bertil Andersson in 2008, the Institute for Media Innovation (IMI)[39] is dedicated to creating an environment where technology and creativity can coexist and develop.
4.2.1   Immersive room
The institute’s immersive room is a 320° circular-projection VR environment with five high-end active projectors, each mirror-projected onto a panel of the circular screen. The overlap area of two neighboring projectors must be edge-blended to form a smooth transition[40].
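The edge blending can be illustrated with complementary ramps over the overlap band: each projector's intensity falls off as its neighbor's rises, so the summed brightness stays constant. The linear ramp here is a simplification; real systems typically use gamma-corrected blend curves.

```python
def blend_weight(x, overlap_start, overlap_end, side="left"):
    """Alpha for a projector pixel at horizontal position x inside the
    overlap band shared with the neighboring projector.

    The left projector ramps from 1 down to 0 across the band while its
    right-hand neighbor ramps from 0 up to 1, so the two contributions
    always sum to full intensity.
    """
    if x <= overlap_start:
        t = 0.0
    elif x >= overlap_end:
        t = 1.0
    else:
        t = (x - overlap_start) / (overlap_end - overlap_start)
    return 1.0 - t if side == "left" else t
```

Applied per pixel column (in practice as a fragment-shader mask), this removes the visible bright seam where two projections overlap.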
4.2.2   Virtual pink dolphin project
A virtual dolphinarium was designed for children with autism to enable their interaction with virtual pink dolphins to improve their communication skills[40]. Figure 7 shows a child with autism playing with the virtual pink dolphins in the immersive room. This project[41,42] is a collaboration between IMI, Underwater World Singapore[43], AWWA Special School[44] in Singapore, Suzhou Industrial Park Renai Special School[45] in China, as well as Utrecht University[46] and Windesheim University of Applied Sciences[47] in the Netherlands.
4.3 VR for public education
VR is an excellent tool for public education. In the following, three interactive VR games are presented to showcase its strengths for public education about viruses. Developing these games required fundamental research[48,49] as well as protein surface and structure modeling[50].
4.3.1   VR SARS protein roller-coaster
This is an interactive VR game designed at NTU for SARS virus education (Figure 8a). It was exhibited in Gallery #10 of the Singapore Art Museum[51] from September 2003 to October 2004[52]. The exhibition aimed to promote the convergence of art, science, and technology, with a protein roller-coaster as an interactive medium enabling the public, particularly young students, to better understand the primary, secondary, and tertiary structures of SARS virus proteins.
4.3.2   VR virus theme park
This interactive game designed at NTU was exhibited at Zero Meter Hall in the Shanghai Oriental Pearl Tower[53] in May 2005 (Figure 8b). The virus theme park involved 12 viruses, including HIV and SARS[54]. The week-long public exhibition attracted over 70000 visitors.
4.3.3   VR Bio X game
This is an interactive VR serious game designed at NTU for learning protein secondary structures through X gaming[55]. Several virus proteins were modeled in the form of the Great Wall of China to represent the protein backbone structure (Figure 8c). Players ride a motorbike along the Great Wall and perform various X game actions, such as 360° spins, to learn protein amino acid sequences as well as protein secondary and tertiary structures.
4.4 VR technology-enhanced learning
The Hive (Figure 9a) is a new student learning hub at the NTU campus that supports technology-enhanced learning. Among the 56 classrooms in the Hive, only one is a traditional lecture theater, while the remainder are designed for team-based and inquiry-based learning.
4.4.1   Virtual & augmented reality flipped classroom
Figure 9b shows a flipped classroom in the Hive. The classroom is designed to host approximately 30 students for team-based learning. Each team with a maximum of six students is supported by a VR-ready system. All six VR systems in the classroom can operate either independently or in a synchronized manner.
4.4.2   Immersive and interactive learning environments
The VR flipped classroom is an immersive and interactive learning environment[56] with integrated hardware, software, pedagogy, and content. The hardware for each table includes a high-end VR-ready computer, head-mounted display, stereographic TV display, and other interactive devices. The software comprises system integration and optimization accelerated by a graphics processing unit. Content is created to support curriculum-based learning in engineering, whereas in-depth learning pedagogy is developed for inquiry-based and team-based learning.
Currently, the School of Mechanical and Aerospace Engineering[57], the School of Chemical and Biological Engineering[58], Centre for IT Services[59], and National Institute of Education[60] are collaborating to develop VR technology-enhanced learning for NTU students.
5 VR research and application development in humanity, social sciences, and heritage
In addition to biomedical sciences, engineering, and education, VR is useful in many other fields. Researchers at NTU have investigated VR applications in humanity, social sciences, and heritage.
5.1 VR heritage
5.1.1   VR Haw Par Villa
Haw Par Villa is a Singaporean attraction developed by Mr. Aw Boon Haw and Mr. Aw Boon Par approximately a century ago. To popularize the villa among the youth, this project used VR technology to digitize selected cultural and heritage items from the villa for the teaching of Chinese values. With support from the Singapore Tourism Promotion Board, Madam White Snake, Jiang Taigong, the Smiling Buddha, the Eight Immortals, the Ten Courts of Hell, etc. were digitized using laser scanning, 3D reconstruction, and 3D printing before an AR application for heritage education was generated.
Laser scanning technology (Figure 10a) was applied to digitize the identified items in the villa. Subsequently, 3D mapping was conducted to reconstruct digital 3D models (Figure 10b) from the captured point clouds through point-cloud data fusion, segmentation, etc.[61]. From the digitized 3D models, 3D prints were created and then used as triggers for AR applications, which enabled users to learn the culture and heritage of Haw Par Villa (Figure 10c). Part of the work has been published in a special issue of the journal Presence[62].
5.1.2   Heritage visualization
At the School of Art, Design, and Media[63], heritage visualization and speculative reconstructions in digital space were investigated, with a case study based on the medieval Church of St. Anne in Famagusta, Cyprus. The findings were published in DISEGNARECON in a special issue on advanced technologies for historical city visualization[64].
5.2 NTU campus walkthrough
5.2.1   Dynamic-streaming-enabled walkthrough
The Centre for Augmented and Virtual Reality[65] was jointly established by the College of Science and College of Engineering in partnership with EON Reality[66] for education, training, and research. The center is equipped with the latest AR/VR technology and leading infrastructure for course development and training.
A walkthrough application of NTU’s Yunnan Garden was developed at the center based on an efficient resource scheduling scheme for the out-of-core dynamic streaming of a 3D scene. The entire scene was stored in the cloud and relevant scene data were streamed to a client’s Android mobile device in real time.
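A minimal sketch of such streaming is shown below: scene tiles are prioritized nearest-first relative to the viewer and fetched up to a per-frame bandwidth budget. The tile names, sizes, and budget are illustrative assumptions, not the published scheduling scheme, which also accounts for visibility and client memory.

```python
import heapq
import math

def schedule_tiles(viewer, tiles, budget):
    """Pick which out-of-core scene tiles to stream next: nearest tiles
    first, stopping when the per-frame bandwidth budget is exhausted.

    `tiles` maps tile names to ((x, y) tile centre, size in MB).
    """
    heap = [(math.dist(viewer, pos), name, size) for name, (pos, size) in tiles.items()]
    heapq.heapify(heap)  # min-heap keyed on distance to the viewer
    chosen, used = [], 0
    while heap:
        _, name, size = heapq.heappop(heap)
        if used + size > budget:
            break
        chosen.append(name)
        used += size
    return chosen

tiles = {
    "garden":   ((0.0, 0.0), 4),     # (tile centre, size in MB)
    "pavilion": ((50.0, 0.0), 3),
    "lake":     ((120.0, 80.0), 5),
}
```

Re-running the scheduler as the viewer moves is what keeps the visible neighborhood resident on the mobile client while the full scene stays in the cloud.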
5.2.2   NTU virtual campus
In the early 2000s, a virtual reality project was conducted with the School of Computer Science and Engineering to develop a virtual NTU campus[67]. The VR model of the campus was built using MultiGen-Paradigm’s Vega, as well as 3D web visualization technology for VR walkthrough applications using the Virtual Reality Modeling Language (VRML).
5.3 Virtual human, social robots, and telepresence
The BeingThere Centre at the Institute for Media Innovation[39] is a three-party international collaboration involving NTU, ETH Zurich, and the University of North Carolina at Chapel Hill. Its three active research areas are virtual humans, social robots, and telepresence. After the completion of its first phase, the BeingThere Centre was renamed the BeingTogether Centre[68].
5.3.1   Virtual humans and social robots
Substantial efforts have been expended in modeling multi-party interactions among virtual characters, social robots, and humans[69,70]. More specifically, the social and humanoid robot Nadine developed at the IMI has received significant interest worldwide[71].
5.3.2   Telepresence
Telepresence is a major research topic at both the BeingThere and BeingTogether Centres. Immersive 3D telepresence[72], autofocus AR eyeglasses for both real-world and virtual imagery[73], and extended depth-of-field volumetric near-eye AR displays[74] have been developed. Owing to COVID-19, virtual humans and social robots are expected to play a more active role in virtual meetings and telepresence.
5.4 Virtual perception and social interaction
5.4.1   Virtual perception
At the School of Social Sciences[75] within the College of Humanities, Arts, and Social Sciences, researchers are investigating visual perception and neuroscience through a multi-disciplinary approach combining psychophysics, electrophysiology, eye tracking, and VR[76]. Additionally, computational models were developed for hierarchical information processing.
5.4.2   Social interaction
At the WKW School of Communication and Information[77] within the College of Humanities, Arts, and Social Sciences, game-based social interactions are studied with a focus on cyber well-being and cyber wellness, as well as the effects of digital games on adolescents’ social and psychological development[78]. This research has potential applications in the post-pandemic era, when working from home becomes the new norm.
5.5 Virtual fashion
5.5.1   Fundamental research
VR can be used in virtual try-on and fashion simulations. A system[79] developed by the School of Computer Science and Engineering and the IMI comprises data extraction, animated body adaptation, and garment prepositioning and simulation. A Kinect sensor was used to capture a customer’s data using RGB, depth, and motion sensors. To generate a customized 3D body from a template model, a statistical analysis method was proposed to estimate anthropometric measurements from the partial information extracted from the Kinect sensor. A constrained Laplacian-based deformation algorithm was then applied to deform the template model to match the obtained anthropometric measurements before shape refinement was carried out based on contours.
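The constrained Laplacian idea can be illustrated on a toy 2D polyline: interior points keep their detail (Laplacian) coordinates while the constrained endpoints are moved, and the rest of the shape follows. This Gauss-Seidel sketch with a uniform Laplacian is far simpler than the full body-template deformation, which uses cotangent weights and many anthropometric constraints.

```python
def laplacian_deform(points, new_first, new_last, iters=500):
    """Deform a 2D polyline so its endpoints move to new positions while
    each interior point keeps its Laplacian (detail) coordinate, i.e.
    its offset from the midpoint of its two neighbors."""
    n = len(points)
    # detail vectors of the original shape
    delta = [
        tuple(points[i][k] - 0.5 * (points[i - 1][k] + points[i + 1][k]) for k in range(2))
        for i in range(1, n - 1)
    ]
    v = [list(p) for p in points]
    v[0], v[-1] = list(new_first), list(new_last)   # positional constraints
    for _ in range(iters):                          # Gauss-Seidel sweeps
        for i in range(1, n - 1):
            for k in range(2):
                v[i][k] = 0.5 * (v[i - 1][k] + v[i + 1][k]) + delta[i - 1][k]
    return [tuple(p) for p in v]

# straight polyline; lift both endpoints by one unit
pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
deformed = laplacian_deform(pts, (0.0, 1.0), (3.0, 1.0))
```

Because the detail vectors are preserved, local shape features survive the deformation; on the body template the same principle keeps anatomical detail intact while measurements such as girths are matched.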
5.5.2   Virtual try-on and fashion customization
The fast-growing trend of online shopping calls for VR applications[80]. Fundamental research on modeling, visualization, and animation can be used to customize fashion designs, and virtual try-on simulations employing the Kinect sensor can support online fashion shopping. Ideally, this research should enable the fashion industry to provide appropriate customization services to end users.
6 Conclusion
This article provides an overview of the VR research and development performed at NTU. Various VR projects developed in different schools, research centers/labs, and institutes were presented, with emphasis on fundamental research and application development across the NTU campus in four areas: biomedical sciences; engineering; education; and the humanities, social sciences, and heritage. As VR technology is evolving rapidly, it was not possible to include all related research activities at NTU herein.

References

1.

Nanyang Technological University. Available from: https://www.ntu.edu.sg

2.

Gleneagles Hospital. Available from: https://www.gleneagles.com.sg/

3.

Chiang P, Zheng J M, Mak K H, Thalmann N M, Cai Y Y. Progressive surface reconstruction for heart mapping procedure. Computer-Aided Design, 2012, 44(4): 289–299 DOI:10.1016/j.cad.2011.11.004

4.

Chiang P, Cai Y Y, Mak K H, Zheng J M. A B-spline approach to phase unwrapping in tagged cardiac MRI for motion tracking. Magnetic Resonance in Medicine, 2013, 69(5): 1297–1309 DOI:10.1002/mrm.24359

5.

Chiang P, Cai Y Y, Mak K H, Soe E M, Chui C K, Zheng J M. A geometric approach to the modeling of the catheter-heart interaction for VR simulation of intra-cardiac intervention. Computers & Graphics, 2011, 35(5): 1013–1022 DOI:10.1016/j.cag.2011.07.007

6.

Chiang P, Zheng J M, Yu Y, Mak K H, Chui C K, Cai Y Y. A VR simulator for intracardiac intervention. IEEE Computer Graphics and Applications, 2013, 33(1): 44–57 DOI:10.1109/mcg.2012.47

7.

The Opas Lab of Medicine and Pathology. Available from: http://sites.utoronto.ca/mocell

8.

Guan Y Q, Cai Y Y, Lee Y T, Opas M. An automatic method for identifying appropriate gradient magnitude for 3D boundary detection of confocal image stacks. Journal of Microscopy, 2006, 223(1): 66–72 DOI:10.1111/j.1365-2818.2006.01600.x

9.

Indhumathi C, Cai Y Y, Guan Y Q, Opas M. 3D boundary extraction of confocal cellular images using higher order statistics. Journal of Microscopy, 2009, 235(2): 209–220

10.

Indhumathi C, Cai Y Y, Guan Y Q, Opas M. An automatic segmentation algorithm for 3D cell cluster splitting using volumetric confocal images. Journal of Microscopy, 2011, 243(1): 60–76 DOI:10.1111/j.1365-2818.2010.03482.x

11.

Indhumathi C, Cai Y Y, Guan Y Q, Opas M, Zheng J. Adaptive-weighted cubic B-spline using lookup tables for fast and efficient axial resampling of 3D confocal microscopy images. Microscopy Research and Technique, 2012, 75(1): 20–27 DOI:10.1002/jemt.21017

12.

Guan Y Q, Cai Y Y, Zhang X, Lee Y T, Opas M. Adaptive correction technique for 3D reconstruction of fluorescence microscopy images. Microscopy Research and Technique, 2008, 71(2): 146–157 DOI:10.1002/jemt.20536

13.

Guan Y, Cai Y Y, Opas M, Xiong Z W, Lee Y T. A VR enhanced collaborative system for 3d confocal microscopic image processing and visualization. International Journal of Image and Graphics, 2006, 6(2): 231–250 DOI:10.1142/s0219467806002215

14.

Institute of Mental Health. Available from: https://www.imh.com.sg/

15.

Goldsmith T R, LeBlanc L A. Use of technology in interventions for children with autism. Journal of Early and Intensive Behavior Intervention, 2004, 1(2): 166–178 DOI:10.1037/h0100287

16.

Mohd Taib S F B, Zhang Y Z, Cai Y Y, Goh T J. Supermarket route-planning game: A serious game for the rehabilitation of planning executive function of children with ASD. In: VR, Simulations and Serious Games for Education. Singapore: Springer Singapore, 2018, 111–119 DOI:10.1007/978-981-13-2844-2_10

17.

PEC Limited. Available from: http://www.peceng.com

18.

Sanders J, Kandrot E. CUDA by Example: An Introduction to General-Purpose GPU Programming. Addison-Wesley Professional, 2010

19.

Cai P P, Cai Y Y, Chandrasekaran I, Zheng J M. Parallel genetic algorithm based automatic path planning for crane lifting in complex environments. Automation in Construction, 2016, 62: 133–147 DOI:10.1016/j.autcon.2015.09.007

20.

Cai P P, Chandrasekaran I, Zheng J M, Cai Y Y. Automatic path planning for dual-crane lifting in complex environments using a prioritized multiobjective PGA. IEEE Transactions on Industrial Informatics, 2018, 14(3): 829–845 DOI:10.1109/tii.2017.2715835

21.

Chen Y, Cai Y Y, Zheng J M, Thalmann D. Accurate and efficient approximation of clothoids using Bézier curves for path planning. IEEE Transactions on Robotics, 2017, 33(5): 1242–1247 DOI:10.1109/tro.2017.2699670

22.

Dutta S, Cai Y Y, Huang L H, Zheng J M. Automatic re-planning of lifting paths for robotized tower cranes in dynamic BIM environments. Automation in Construction, 2020, 110: 102998 DOI:10.1016/j.autcon.2019.102998

23.

Cai P P, Chandrasekaran I, Cai Y Y, Chen Y, Wu X Q. Simulation-enabled vocational training for heavy crane operations. In: Simulation and Serious Games for Education. Singapore: Springer Singapore, 2016, 47–59 DOI:10.1007/978-981-10-0861-0_4

24.

Volk R, Stengel J, Schultmann F. Building Information Modeling (BIM) for existing buildings: Literature review and future needs. Automation in Construction, 2014, 38: 109–127 DOI:10.1016/j.autcon.2013.10.023

25.

Linsen L. Point Cloud Representation. Technical Report, Faculty of Computer Science, University of Karlsruhe, Germany, 2001

26.

Huang L H, Zhang Y Z, Zheng J M, Cai P P, Dutta S, Yue Y F, Thalmann N, Cai Y Y. Point cloud based path planning for tower crane lifting. In: Proceedings of Computer Graphics International 2018 (CGI 2018). Bintan Island, Indonesia. New York: ACM Press, 2018, 211–215 DOI:10.1145/3208159.3208186

27.

Surbana Jurong. Available from: https://surbanajurong.com/

28.

School of Computer Science & Engineering. Available from: https://www.ntu.edu.sg/home/asjmzheng/

29.

GovTech. Available from: https://www.tech.gov.sg/media/technews/virtual-singapore

30.

Open Geospatial Consortium. Available from: https://www.ogc.org/standards/citygml

31.

Virtual Singapore. Available from: https://www.nrf.gov.sg/programmes/virtual-singapore

32.

Fraunhofer Singapore. Available from: https://www.fraunhofer.sg/en/research/industrial-immersive-technologies.html

33.

School of Civil and Environmental Engineering. Available from: https://www.ntu.edu.sg/cce

34.

Liu Y S, Lan Z R, Cui J, Krishnan G, Sourina O, Konovessis D, Ang H E, Mueller-Wittig W. Psychophysiological evaluation of seafarers to improve training in maritime virtual simulator. Advanced Engineering Informatics, 2020, 44: 101048 DOI:10.1016/j.aei.2020.101048

35.

Cruz-Neira C, Sandin D J, DeFanti T A, Kenyon R V, Hart J C. The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 1992, 35(6): 64–72 DOI:10.1145/129888.129892

36.

Science Centre Singapore. Available from: https://www.science.edu.sg/

37.

Institute of High-performance Computing. Available from: https://www.a-star.edu.sg/ihpc

38.

Lee Y T, Au C K. Virtual design education in NTU. In: Proceedings of 40th ISAGA Annual Conference, Singapore, 2009

39.

Institute for Media Innovation. Available from: https://www.ntu.edu.sg/imi

40.

Cai Y Y, Chia N K H, Thalmann D, Kee N K N, Zheng J M, Thalmann N M. Design and development of a virtual dolphinarium for children with autism. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2013, 21(2): 208–217 DOI:10.1109/tnsre.2013.2240700

41.

Cai Y Y, van Joolingen W, Walker Z. VR, Simulations and Serious Games for Education. Singapore: Springer Singapore, 2019 DOI:10.1007/978-981-13-2844-2

42.

Cai Y Y, Goei S L, Trooster W. Simulation and Serious Games for Education. Singapore: Springer Singapore, 2017 DOI:10.1007/978-981-10-0861-0

43.

Underwater World Singapore. Available from: https://en.wikipedia.org/wiki/Underwater_World,_Singapore

44.

AWWA Special School. Available from: https://edu.awwa.org.sg/

45.

Suzhou Industrial Park Renai Special School. Available from: http://nanjixiong.com/archiver/tid-65527.html

46.

Utrecht University. Available from: https://www.uu.nl/staff/wrvanjoolingen

47.

Windesheim University of Applied Sciences. Available from: https://www.windesheim.com/research/education/educational-needs-in-inclusive-learning-environments/contact

48.

Chen W, Yu R D, Zheng J M, Cai Y Y, Au C. Triangular Bézier sub-surfaces on a triangular Bézier surface. Journal of Computational and Applied Mathematics, 2011, 235(17): 5001–5016 DOI:10.1016/j.cam.2011.04.030

49.

Zheng J, Cai Y. Interpolation over arbitrary topology meshes using a two-phase subdivision scheme. IEEE Transactions on Visualization and Computer Graphics, 2006, 12(3): 301–310 DOI:10.1109/tvcg.2006.49

50.

Chen W, Zheng J M, Cai Y Y. Kernel modeling for molecular surfaces using a uniform solution. Computer-Aided Design, 2010, 42(4): 267–278 DOI:10.1016/j.cad.2009.10.003

51.

Singapore Art Museum. Available from: https://www.singaporeartmuseum.sg

52.

Cai Y Y, Lu B F, Fan Z W, Chan C W, Lim K T, Qi L, Li L. Proteins, immersive games and music. Leonardo, 2006, 39(2): 135–137 DOI:10.1162/leon.2006.39.2.135

53.

Shanghai Oriental Pearl Tower. Available from: https://en.wikipedia.org/wiki/Oriental_Pearl_Tower

54.

Cai Y Y, Lu B F, Zheng J M, Li L. Immersive protein gaming for bio edutainment. Simulation & Gaming, 2006, 37(4): 466–475 DOI:10.1177/1046878106293677

55.

Cai Y Y, Lu B F, Fan Z W, Indhumathi C, Lim K T, Chan C W, Jiang Y, Li L. Bio-edutainment: Learning life science through X gaming. Computers & Graphics, 2006, 30(1): 3–9 DOI:10.1016/j.cag.2005.10.003

56.

Cai Y. 3D Immersive and Interactive Learning. Singapore: Springer, 2013

57.

School of Mechanical & Aerospace Engineering. Available from: http://research.ntu.edu.sg/expertise/academicprofile/Pages/StaffProfile.aspx?ST_EMAILID=MYYCAI

58.

School of Chemical & Biological Engineering. Available from: https://research.ntu.edu.sg/expertise/academicprofile/Pages/StaffProfile.aspx?ST_EMAILID=pgunawan&CategoryDescription=Energy

59.

Centre for IT Services. Available from: http://enewsletter.ntu.edu.sg/itconnect/2018-03/Pages/VARTEL.aspx?AspxAutoDetectCookieSupport=1

60.

National Institute of Education. Available from: https://www.nie.edu.sg/students/project/afd-06-16-cy

61.

Cai Y Y, Zheng J M, Zhang Y Z, Wu X Q, Chen Y, Tan B Q, Yang B Y, Liu T R, Thalmann N. Madam Snake White: a case study on virtual reality continuum applications for Singaporean culture and heritage at Haw Par Villa. Presence: Teleoperators and Virtual Environments, 2018, 26(4): 378–388 DOI:10.1162/pres_a_00303

62.

Ch'ng E, Cai Y Y, Thwaites H. Special issue on VR for culture and heritage: the experience of cultural heritage with virtual reality (part II): guest editors' introduction. Presence: Teleoperators and Virtual Environments, 2018, 26(4): 3–4 DOI:10.1162/pres_e_00311

63.

School of Art, Design and Media. Available from: http://research.ntu.edu.sg/expertise/academicprofile/Pages/StaffProfile.aspx?ST_EMAILID=MWALSH

64.

Walsh M, Bernardello R. Heritage Visualization and Potential Speculative Reconstructions in Digital Space: The Medieval Church of St. Anne in Famagusta, Cyprus. Disegnarecon, 2018

65.

Centre for Augmented and Virtual Reality. Available from: https://cos.ntu.edu.sg/CAVR

66.

EON Reality. Available from: https://eonreality.com/

67.

Sourin A. Nanyang Technological University virtual campus virtual reality project. IEEE Computer Graphics and Applications, 2004, 24(6): 6–8 DOI:10.1109/mcg.2004.57

68.

BeingTogether Centre. Available from: https://imi.ntu.edu.sg/IMIResearch/ResearchAreas/Pages/BeingTogetherCentre.aspx

69.

Yumak Z, Ren J F, Thalmann N M, Yuan J S. Modelling multi-party interactions among virtual characters, robots, and humans. Presence: Teleoperators and Virtual Environments, 2014, 23(2): 172–190 DOI:10.1162/pres_a_00179

70.

Tahir Y, Dauwels J, Thalmann D, Magnenat-Thalmann N. A user study of a humanoid robot as a social mediator for two-person conversations. International Journal of Social Robotics, 2018, 1–14 DOI:10.1007/s12369-018-0478-3

71.

Thalmann N M, Tian L, Yao F P. Nadine: A social robot that can localize objects and grasp them in a human way. In: Lecture Notes in Electrical Engineering. Singapore: Springer Singapore, 2017, 1–23 DOI:10.1007/978-981-10-4235-5_1

72.

Fuchs H, State A, Bazin J C. Immersive 3D telepresence. Computer, 2014, 47(7): 46–52 DOI:10.1109/mc.2014.185

73.

Chakravarthula P, Dunn D, Akşit K, Fuchs H. FocusAR: auto-focus augmented reality eyeglasses for both real world and virtual imagery. IEEE Transactions on Visualization and Computer Graphics, 2018, 24(11): 2906–2916 DOI:10.1109/tvcg.2018.2868532

74.

Rathinavel K, Wang H P, Blate A, Fuchs H. An extended depth-at-field volumetric near-eye augmented reality display. IEEE Transactions on Visualization and Computer Graphics, 2018, 24(11): 2857–2866 DOI:10.1109/tvcg.2018.2868570

75.

School of Social Sciences. Available from: http://research.ntu.edu.sg/expertise/academicprofile/pages/StaffProfile.aspx?ST_EMAILID=XUHONG

76.

Ying H J, Burns E, Lin X Y, Xu H. Ensemble statistics shape face adaptation and the cheerleader effect. Journal of Experimental Psychology: General, 2019, 148(3): 421–436 DOI:10.1037/xge0000564

77.

Wee Kim Wee School of Communication and Information. Available from: http://research.ntu.edu.sg/expertise/academicprofile/pages/StaffProfile.aspx?ST_EMAILID=CHENHH

78.

Chen V H H, Wilhelm C, Joeckel S. Relating video game exposure, sensation seeking, aggression and socioeconomic factors to school performance. Behaviour & Information Technology, 2019, 1–13 DOI:10.1080/0144929x.2019.1634762

79.

Zhang Y Z, Zheng J M, Magnenat-Thalmann N. Cloth simulation and virtual try-on with Kinect based on human body adaptation. In: Gaming Media and Social Effects. Singapore: Springer Singapore, 2013, 31–50 DOI:10.1007/978-981-4560-32-0_3

80.

Magnenat-Thalmann N. Modeling and simulating bodies and garments. London: Springer London, 2010 DOI:10.1007/978-1-84996-263-6