
2019, 1(2): 136–162. Published: 2019-04-20

DOI: 10.3724/SP.J.2096-5796.2019.0008

Abstract

Immersion, interaction, and imagination are three features of virtual reality (VR). Existing VR systems provide fairly realistic visual and auditory feedback, yet remain poor in haptic feedback, the channel through which humans perceive the rich haptic properties of the physical world. A haptic display is an interface that enables bilateral signal communication between human and computer, and can thus greatly enhance the immersion and interaction of VR systems. This paper surveys the paradigm shift of haptic display over the past 30 years, classified into three stages: desktop haptics, surface haptics, and wearable haptics. The driving forces, key technologies, and typical applications of each stage are critically reviewed. Toward future high-fidelity VR interaction, research challenges are highlighted concerning handheld haptic devices, multimodal haptic devices, and high-fidelity haptic rendering. Finally, the importance of understanding human haptic perception for designing effective haptic devices is addressed.

Content

1 Introduction
In 1965, Ivan Sutherland proposed the concept of "the ultimate display", which marks the birth of virtual reality (VR)[1]. In his seminal work, he introduced three features of VR: immersion, interaction, and imagination. Over the past 50 years, thanks to research in computer graphics and sound synthesis, existing VR systems provide fairly realistic visual and auditory feedback. However, haptic feedback falls far short of users' perceptual expectations: the haptic sensations in most VR systems are rather poor compared to the abundant haptic properties that humans can perceive in the physical world.
Haptic feedback is indispensable for enhancing the immersion, interaction, and imagination of VR systems. Interaction is enhanced because users can directly manipulate virtual objects and obtain immediate haptic feedback. Immersion is enhanced by providing more realistic sensations that mimic the physical interaction process. Imagination is inspired when haptics provides additional cues that help users mentally construct an imagined virtual world beyond spatial and/or temporal limitations.
The haptic sensations obtained through virtual interaction remain far poorer than those obtained through physical interaction. In physical life, the haptic channel is used pervasively, whether perceiving the stiffness, roughness, and temperature of objects in the external world or performing manipulation and motion/force control tasks such as grasping, touching, and walking. In the virtual world, by contrast, haptic experiences are poor in both quantity and quality: most commercial VR games and movies provide only visual and auditory feedback, and only a few provide simple haptic feedback such as vibration. With the boom of VR in areas such as medical simulation and product design, there is an urgent need to improve the realism of haptic feedback in VR systems, so that sensations comparable to interaction in the physical world can be achieved.
2 Paradigms of haptic display driven by computing platform
2.1 Definition of haptic HCI paradigms
As shown in Figure 1, the paradigm of human-computer interaction (HCI) can be defined with three components: human user, interface device, and virtual environment synthesized by computer. In order to define the paradigm of haptic HCI, each component is elaborated as follows.
Considering the human user, we need to study both the perception and the manipulation characteristics of the haptic channel during the bilateral communication between human and computer. On the perception side, the human perceptual system includes diverse kinesthetic and cutaneous receptors located in the skin, muscles, and tendons. On the manipulation/action side, we need to consider the motor control parameters, including the degrees of freedom (DoF) of motion or force tasks, and the magnitude and resolution of motion and force signals for diverse manipulation tasks.
Considering the interface device, its functions include sensing and actuation. For sensing, the device tracks human manipulation signals such as motion and force, and transmits this information to control a virtual avatar of the device in the virtual environment. For actuation, the device receives simulated force signals from the virtual environment and reproduces these forces through actuators, such as electric motors, that exert them on the user's body.
Considering the virtual environment, diverse tasks can be simulated, including surgical simulation, mechanical assembly, and computer games. Whenever there is direct contact between the user-driven avatar and the manipulated objects, haptic feedback devices can display the resultant contact forces/torques to users.
2.2 History of paradigm shift in the past 30 years
In 1948, the first mechanical force-feedback master-slave manipulator was developed at Argonne National Laboratory by Raymond Goertz[2]. Its master arm is the forerunner of today's haptic displays. As computing platforms advanced, haptic displays evolved accordingly. The relationship between haptic displays and computing platforms is twofold. On the one hand, each computing platform requires an effective means of human-computer interaction to maintain users' work efficiency. On the other hand, a haptic display requires a computing platform powerful enough to support its function and performance specifications.
In the past 30 years, the evolution of computing platform can be summarized in three eras: personal computer, mobile internet, and virtual reality based on wearable computers. Accordingly, the paradigm of haptic HCI can be classified into three stages: desktop haptics, surface haptics, and wearable haptics. Figure 2 illustrates the typical features of each paradigm.
In a human-machine interaction system, a human user manipulates a haptic device, and the mechanical interface is mapped to an avatar performing a typical task in the virtual environment (VE). With the evolution of the haptic HCI paradigm, the following components evolve as well: the body part of the human user, the metaphor of the interface device, the controlled avatar, and the motion/force dimensions supported by the paradigm.
In desktop haptics, the user's hand holds the stylus of the device to control a virtual tool such as a surgical scalpel or a mechanical screwdriver. The simulated motion/force dimensions are six: three translations and three rotations of the virtual tool.
In surface haptics, the user's fingertip slides along the touchscreen of a mobile phone with typical gestures such as panning, zooming, and rotating, controlling a finger avatar that feels the texture and/or shape of virtual objects. The simulated motion/force dimensions are two, lying within the plane of the touchscreen.
In wearable haptics, the user's hand wears a haptic glove to control a virtual hand-shaped avatar with diverse gestures such as grasping, pinching, and lifting. The motion dimensions are 22, matching the DoF of the human hand, and the force dimensions change dynamically with the number and topology of contact points between the virtual hand and the manipulated objects.
3 Desktop haptics
3.1 Motivation
The mainstream interaction paradigm in the era of the personal computer is the Windows-Icons-Menus-Pointer (WIMP) interface[3], in which the computer mouse serves as a user-friendly interface enabling highly efficient human-computer interaction. While the mouse captures the two-dimensional movement of the user's hand on a desktop surface, it cannot exert force feedback on the hand when the virtual avatar collides with a virtual obstacle.
To overcome this limitation of the classic HCI paradigm, desktop force-feedback devices, in effect an enhanced mouse in the form of a multi-link robotic arm, were created to provide both motion control and force feedback between the user and the virtual environment. Figure 3 compares a computer mouse with a desktop haptic device.
In the era of the personal computer, the mainstream form of haptic interaction is the multi-joint force-feedback device fixed on a desk or on the ground. The interaction metaphor of the desktop force-feedback interface is that the user interacts with the virtual environment through a handle, and the virtual avatar is a rigid tool with 6-DoF motion, such as a surgical scalpel or a mechanical screwdriver. When the virtual avatar contacts or collides with objects in the virtual environment, the contact force is transmitted from the desktop haptic device to the user's hand, so the user experiences the forces of manipulating the virtual objects.
3.2 Desktop haptic feedback device
3.2.1   Definition
A desktop haptic device is a multi-joint robotic arm with a stylus held in the user's hand. The device tracks the movement of the stylus and provides force feedback on it. The device is normally fixed on a tabletop or on the ground.
To fulfil these functions, three major components are needed. First, position or force sensors in each joint or at the end-effector of the robotic arm measure the movement of the stylus driven by the user (or the force the user exerts on the stylus). Second, actuators provide the feedback forces or torques. Third, transmission mechanisms transmit torque from the joints to the end-effector.
3.2.2   Function and specifications
Compared with traditional robotic arms, the major challenge for a haptic device is to simulate the sensations of interacting with both free space and constrained space. In free space, the device should follow the user's motion and exert as little resistance as possible on the user's hand. In constrained space, the device should provide a sufficient range of impedance to simulate contact constraints from virtual objects with diverse physical properties.
Salisbury et al. summarized the three criteria/requirements for a good haptic device: (1) Free space must feel free; (2) Solid virtual objects must feel stiff; (3) Virtual constraints must not be easily saturated[4].
These criteria translate into the following design specifications: low back-drive friction, low inertia, a wide adjustable impedance range, large force/torque output, high position-sensing resolution, and a workspace sufficient for the simulated task.
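As a concrete illustration, the classic one-dimensional "virtual wall" condenses these criteria into a single impedance law. The following minimal Python sketch uses arbitrary stiffness and force-cap values; a real device driver would evaluate such a law inside its kHz-rate servo loop.

```python
def virtual_wall_force(x, k_wall=2000.0, f_max=7.0):
    """1-D impedance law for a wall at x = 0 (x < 0 means penetration).

    Free space must feel free: zero commanded force outside the wall.
    Solid objects must feel stiff: force grows with penetration depth.
    Constraints must not saturate: clamp at the device's peak force f_max.
    """
    if x >= 0.0:
        return 0.0                # free space: no resistance
    f = -k_wall * x               # stiff spring pushing the stylus out
    return min(f, f_max)          # respect the actuator's force limit

# 1 mm of penetration commands 2 N; 10 mm would saturate at the 7 N cap.
print(virtual_wall_force(-0.001), virtual_wall_force(-0.010))
```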
3.2.3   Classification
Figure 4 illustrates the taxonomy of desktop haptic devices based on diverse classification criteria. According to the kinematic structure, haptic devices can be classified into serial, parallel, and hybrid structures. According to the control principle, haptic devices can be classified into impedance and admittance control. Different actuation principles can be used, including electric motor, pneumatic and hydraulic actuations, and novel actuators.
A serial mechanism has the following characteristics: a simple topological structure, an easily achieved large workspace, a large rotation range for the end link, flexible operation, and relatively easy forward kinematics and inverse force/torque solutions. However, it also has disadvantages: its mechanical stiffness is relatively low; the end-effector error is large because the errors of the individual joints accumulate; and most of the driving motors and transmission links are mounted on the moving arms, which increases the inertia of the system and degrades its dynamic performance.
The parallel mechanism offers potential advantages over the serial mechanism: a multi-branch structure, high overall stiffness, and strong payload capacity. Whereas joint errors accumulate at the end-effector of a serial mechanism, a parallel mechanism exhibits no error accumulation or amplification, so the end-effector error is relatively small. Moreover, the driving motors of a parallel mechanism are usually mounted on the frame, which leads to low inertia, a small dynamic load, and fast dynamic response. However, for the same structural size, the workspace of a parallel mechanism is small and the rotation range of its end-effector (the movable platform) is limited, and its forward kinematics and inverse force/torque solutions are relatively complex and difficult.
A hybrid mechanism combines the advantages of the serial and parallel mechanisms, increasing system stiffness and reducing end-effector error while keeping the forward kinematics and inverse force/torque solutions tractable. However, torque feedback about the rotation axes of the end-effector is not easy to realize in a 6-DoF hybrid mechanism.
Based on the control principle, haptic devices can be classified into impedance displays and admittance displays. An impedance display produces force feedback by measuring the motion of the operator and applying a feedback force to the operator. An admittance display produces force feedback by measuring the force the operator applies to the device and controlling the motion of the device accordingly.
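The distinction can be sketched as two control loops with inverted causality. In the schematic below, the sensor/actuator callables are hypothetical stand-ins rather than a real device API, and the stiffness, mass, and time-step values are illustrative.

```python
def impedance_step(read_position, send_force, k_wall=1500.0):
    """Impedance display: measure motion, command force (Phantom-style)."""
    x = read_position()                    # light mechanism follows the hand
    f = -k_wall * x if x < 0.0 else 0.0    # environment maps position to force
    send_force(f)

def admittance_step(read_force, send_position, state, mass=5.0, dt=0.001):
    """Admittance display: measure force, command motion (stiff, geared devices)."""
    f_user = read_force()                  # force sensor reads the user's push
    state["v"] += (f_user / mass) * dt     # virtual dynamics integrate the force
    state["x"] += state["v"] * dt
    send_position(state["x"])              # position-controlled robot tracks x

# Example wiring with stand-in sensor/actuator functions:
impedance_step(lambda: -0.002, lambda f: print(f"command {f:.1f} N"))
admittance_step(lambda: 1.0, lambda x: print(f"move to {x:.2e} m"),
                state={"v": 0.0, "x": 0.0})
```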
3.2.4   Typical commercial devices and research prototypes
In recent years, a large number of desktop force-feedback interfaces have emerged with the rapid development of sensors, robotics, and related technologies, such as the Phantom series from SensAble Inc. and the Omega and Delta series from Force Dimension Inc. Figure 5 and Figure 6 show the major commercial desktop devices and some representative research prototypes, respectively.
3.3 Haptic rendering
Haptic rendering refers to the process of computing and generating forces in response to user interactions with virtual objects[4]. Figure 7 shows the pipeline of haptic rendering.
The development of haptic rendering algorithms has been driven both by advances in force-feedback devices and by the requirements of force-feedback applications. In the 1950s, force-feedback interfaces were used for master-slave control in nuclear environments, but haptic rendering algorithms received little research attention. It was not until 1993, when the Phantom force-feedback device invented by MIT professor K. Salisbury and his student T. Massie was introduced, that research on haptic rendering algorithms for desktop force-feedback devices boomed. The mainstream of early research was 3-DoF haptic rendering. Its principle is to represent the virtual avatar of the handheld device as a point with three-dimensional motion, which interacts with objects in the virtual environment and produces three-dimensional forces. These forces are fed back to the operator through the stylus of the force-feedback device, so the operator feels the contact forces between the virtual avatar and the virtual objects.
Driven by demands from different application fields, various 3-DoF haptic rendering algorithms have been studied, covering rigid-body interaction, elastic-body interaction, fluid simulation, and interactions involving topological changes. Classical algorithms include the god-object, virtual-proxy, and ray-casting methods[5-7]. However, 3-DoF haptic rendering cannot simulate multi-region contacts between objects: when the tool avatar touches objects at multiple regions, the avatar penetrates the obstacles, which greatly reduces simulation fidelity. In addition, 3-DoF haptic rendering cannot simulate the interaction torques between objects, which results in unrealistic haptic sensations.
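The essence of the god-object/virtual-proxy family can be illustrated for a single flat surface: the haptic interface point (HIP) may sink into the object, while a proxy stays on the surface and a spring between the two yields the feedback force. The Python sketch below is a simplified single-plane version with an arbitrary stiffness value; the full algorithms constrain the proxy against arbitrary meshes.

```python
import numpy as np

def proxy_update(hip, plane_n, plane_d, k=800.0):
    """Simplified god-object/proxy step for one plane n.x = d."""
    n = plane_n / np.linalg.norm(plane_n)
    dist = n @ hip - plane_d          # signed distance of the HIP to the plane
    if dist >= 0.0:
        return hip, np.zeros(3)       # free space: proxy follows, zero force
    proxy = hip - dist * n            # project the HIP back onto the surface
    force = k * (proxy - hip)         # spring pulls the HIP out of the object
    return proxy, force

# HIP 2 mm below a horizontal floor -> 1.6 N upward restoring force
proxy, f = proxy_update(np.array([0.0, -0.002, 0.0]),
                        np.array([0.0, 1.0, 0.0]), 0.0)
```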
With the advent of 6-DoF desktop force-feedback devices and the demand for realistic force interaction in complex scenes, McNeely, Puterbaugh, and Troy from Boeing first proposed the concept of 6-DoF haptic rendering in 1999[8]. The aim is to simulate multi-point, multi-region interaction with non-penetrating contacts between objects of complex shape, computing six-dimensional force and torque for rigid-body interaction. Various 6-DoF haptic rendering algorithms have since been studied, driven by the demands of different application fields. In 2008, Lin and Otaduy published a book that systematically summarized the work of several well-known scholars in the haptic rendering field[9]. To this day, many problems remain to be solved to achieve realistic 6-DoF haptic rendering.
Based on the collision response algorithm, 6-DoF haptic rendering methods can be classified into penalty-based and constraint-based approaches. In the penalty-based approach, contact constraints are modeled as springs whose elastic energy increases as the haptic tool penetrates an object in the virtual environment. Penalty forces are computed as the negative gradient of the elastic energy, which pushes the graphic tool toward a non-penetrating configuration[10]. As pointed out by Ortega et al., these methods allow the virtual objects to interpenetrate, which may lead to missed collisions and the well-known pop-through effect[11]. In contrast, the constraint-based approach is an analytical and global method for collision response: it constrains the pose of the graphic tool to be free of penetration, while the haptic tool may penetrate objects. Constraint-based methods[12] achieve accurate simulation by modeling the normal and friction contact constraints as a linear complementarity problem, which is computationally expensive and cannot handle large numbers of multi-region contacts between complex-shaped objects or large-scale deformation at the kHz haptic rate. To address this, Wang et al. proposed an efficient constraint-based rendering algorithm using configuration-based optimization and hierarchical sphere-tree modeling. As shown in Figure 8a, complex-shaped objects such as the bunny and dragon are modeled with hierarchical sphere-trees. Efficient collision detection can be performed within 1 ms, and the non-penetrating configuration of the graphic tool is solved as a quadratic programming problem, which is handled efficiently by classic active-set methods. As shown in Figure 8, this configuration-based optimization approach can simulate fine haptic manipulation with multi-region contacts between rigid objects[13], between a rigid tool and deformable objects[14], with sharp features[15], and with pathological decay models[16].
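To make the penalty-based branch concrete, the sketch below accumulates spring forces and lever-arm torques over a set of contacts. The contact-tuple layout and stiffness value are illustrative assumptions; the contacts themselves would come from a collision-detection pass such as the sphere-tree query described above.

```python
import numpy as np

def penalty_wrench(contacts, tool_com, k=1200.0):
    """Sum penalty forces and torques over multi-region contacts.

    Each contact is (point, outward_normal, penetration_depth); the spring
    force pushes the tool out along the normal, and torques follow from
    the lever arm of each contact about the tool's center of mass.
    """
    force, torque = np.zeros(3), np.zeros(3)
    for point, normal, depth in contacts:
        f_i = k * depth * normal                   # spring force at this contact
        force += f_i
        torque += np.cross(point - tool_com, f_i)  # contribution to 6-DoF wrench
    return force, torque
```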
3.4 Typical applications
Typical applications of desktop haptic systems include virtual surgery, mechanical assembly, and tool-based entertainment such as virtual sculpting.
Figure 9 shows the hardware of a dental-surgery simulation system and graphical scenarios of three typical dental operations: dental inspection, periodontal depth probing, and implant surgery. The goal of the system is to simulate the physical contacts between dental instruments and diverse tissues, including teeth, gingiva, and tongue, as well as pathological changes such as calculus and decay. Differences in stiffness and friction coefficient between these tissues are simulated and displayed as force feedback on the stylus of the haptic device, so the system can be used to train trainees' fine motor skills along with hand-eye coordination. The system includes two force-feedback devices, a half-silvered mirror providing visual-haptic collocation, a supporting rest for the fingertip, and a foot pedal. It can simulate typical dental procedures including periodontal surgery, tooth-preparation surgery, and implant surgery.
Figure 10 shows a haptic interaction system for simulating the mechanical assembly of an aircraft engine. The user controls the haptic device to insert a splined shaft into a splined hole. Force and torque feedback are produced to simulate the tight clearance during the assembly process. Because the hole is deep and the clearance small, the quality of the assembly relies entirely on haptic sensation.
4 Surface haptics
4.1 Motivation
Since 2005, surface haptics has gradually become a research hotspot with the rise of mobile devices such as mobile phones, tablet PCs, and large-screen multi-user collaborative touch devices.
Unlike desktop haptic devices, which simulate indirect contact between hand and object (i.e., tool-mediated interaction), surface haptics aims to simulate direct contact between bare fingers and objects; for example, users can touch and feel the contour or roughness of an image displayed on the screen of a mobile phone (Figure 11).
Figure 11 Touch and feel the tactile features of an image through a touch screen.
4.2 Tactile device for touch screen
The main challenge of surface haptic feedback is to embed all the sensing and actuating components within the compact space of a mobile device.
According to the actuation principle, as shown in Figure 12, tactile feedback devices can be classified into three categories: devices based on mechanical vibration, devices that change the surface shape, and devices with variable surface friction coefficients. In this section, we survey these three approaches with respect to their principles, key technologies, and representative systems, along with their pros and cons.
4.2.1   Mechanical vibration
Vibrotactile feedback, or mechanical vibration, is the most widely used form of tactile feedback for touchscreens. The most familiar example is the vibration used for incoming calls or timer reminders on a mobile phone. The vibrotactile feedback of a touchscreen can be delivered through direct contact with the fingers[17,18] or indirect contact through a stylus[19].
Vibrotactile rendering devices stimulate the operator's fingers by controlling tactile actuators such as eccentric rotating-mass (ERM) actuators, linear resonant actuators (LRAs), voice coils, solenoids, and piezoelectric actuators. In 2001, Fukumoto et al. used a voice-coil vibrator in mobile phones and PDAs to render tactile feedback through single-frequency vibrations[17]. Yatani et al. developed SemFeel, a tactile feedback device with vibration motors, in 2009[18]. As shown in Figure 13a, the five motors embedded in the back of the phone realize positional, single, and cyclic vibration modes to help users distinguish different button clicks. In 2010, Yang et al. adopted a tactile display board composed of 12 vibrating panels for mobile phones, as shown in Figure 13b, which improved feedback fidelity but was bulky and separated the tactile feedback from the visual display[20]. Sawada et al. used shape-memory alloy with screw structures as a micro-vibrator to develop a tactile reproduction pen that can be controlled to generate virtual textures[19]. At present, almost all mobile phones and smartwatches have vibration feedback functions: Immersion has provided Samsung with OEM TouchSense technology, TI has developed a series of driver chips for piezoelectric vibrators, and Apple used a linear motor to render tactile feedback on the iPhone 6.
Although mechanical vibration is widely used, its ability to render diversified tactile features such as friction and texture is poor. Poupyrev, Maruyama, and Rekimoto used the vibration of a piezoelectric actuator to improve tactile feedback for handheld mobile devices[21]. They designed a thin, beam-shaped piezoelectric sheet (about 0.5 mm thick), called TouchEngine, which offers low voltage (20 V), low energy consumption (3 mW), and fast response (less than 5 ms), as shown in Figure 14. By mounting the piezoelectric sheet on the base of a handheld device, the user can feel tactile feedback; in a menu-selection experiment, task completion speed improved by 22% with the help of tactile feedback. The piezoelectric actuator responds quickly and can generate crisp tactile feedback, such as the click of a keystroke; such crisp feedback can also enhance the tactile perception of virtual keystrokes, scrollbars, and menus.
4.2.2   Micro roughness display
A tactile array is an intuitive way to realize tactile feedback: a two-dimensional array of pins, electrodes, air-pressure chambers, or voice coils stimulates the operator's skin to form a spatial force distribution that reflects the tactile properties of an object's surface. As shown in Figure 15, Kim et al. developed a dynamically reshaping input terminal in 2007, which consists of a 48×32 pin array and can reproduce information such as pictures and characters[22]. As shown in Figure 16, Kajimoto et al. developed a haptic device based on an electrode array in 2010, which produces tactile sensations by stimulating nerves directly with electrical signals[23]. They solved the problem of transparent electrodes and applied the device to a mobile terminal. However, the device suffers from large power consumption, high cost, and a complicated structure, and its excitation voltage of up to 300 V easily causes a tingling sensation and prevents fine tactile effects.
Distributed actuators can generate the physical image of a virtual component by changing the unevenness of the touch surface, thereby improving the realism of the operation. Harrison and Hudson developed a tactile feedback device actuated by pneumatic chambers, which inflates custom-designed airbag buttons[24]. As shown in Figure 17, positive pressure makes a button convex, negative pressure makes it concave, and the button remains flat when not inflated. Combined with a rear-projection display, the design can also function as a touchscreen. Preliminary experiments showed that, compared with smooth touchscreen buttons, airbag buttons increase the speed at which users discriminate between buttons, and their tactile effect is very close to that of physical buttons. Kim et al. designed the Dotview-2 tactile feedback device, which uses an array of 1536 (32×48) pins to display haptic images[25].
4.2.3   Friction modulation
Friction modulation devices reproduce tactile information by changing the lateral force within the horizontal plane, exploiting either the squeeze-film effect or the electrostatic effect. Compared with actuator-array devices, the force output of a friction modulation device is continuous, so it can present fine tactile effects.
Tactile feedback devices based on the squeeze-film effect generate an air film by applying high-frequency vibration. The contact force between finger and surface is altered by changing the friction coefficient, which is modulated through the frequency and intensity of the vibration, to simulate textures. In 1995, Watanabe et al. developed an ultrasonic vibrating plate that produces the squeeze-film effect, allowing the operator to perceive roughness[26]. In 2013, Carter et al. introduced UltraHaptics, which employs focused ultrasound to project discrete points of haptic feedback through the display directly onto users' bare hands[27]. As shown in Figure 18, Winfield et al. developed the TPaD in 2007, a device based on piezoelectric vibration that generates the squeeze-film effect and can reduce the friction coefficient of the screen surface[28]. In 2010, Marchuk et al. developed the LATPaD (Figure 19), a large-area prototype based on the TPaD[29]. As shown in Figure 20, Mullenbach et al. integrated these results into the portable terminal TPad Fire in 2013, which can present customizable textures on a glass surface with feedback forces exceeding 100 mN[30]. Yang et al. developed a device that generates resonance through piezoelectric actuators mounted on the back of a resonator[31]: standing waves form on the surface of the resonator, and the squeeze-film effect is produced when a finger contacts the surface. As shown in Figure 21, the vibration mode is a continuous standing wave with 21 antinodes at a natural frequency of 53563 Hz.
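Rendering on such a device amounts to commanding, at each finger position, the vibration amplitude that yields the desired friction level. The sketch below assumes a simple linear amplitude-to-friction calibration, an illustrative stand-in for a measured curve, with all constants arbitrary.

```python
def squeeze_film_amplitude(mu_desired, mu_surface=0.9, mu_min=0.2, a_max=2.0e-6):
    """Invert an assumed linear map from vibration amplitude (0..a_max metres)
    to effective friction: higher amplitude -> thicker air film -> lower friction."""
    mu = min(max(mu_desired, mu_min), mu_surface)    # clamp to reachable range
    frac = (mu_surface - mu) / (mu_surface - mu_min)
    return frac * a_max                              # commanded vibration amplitude

def amplitude_at(x, period=0.01):
    """Render sticky/slippery stripes along the finger position x (metres)."""
    mu = 0.9 if (x % period) < period / 2 else 0.25
    return squeeze_film_amplitude(mu)
```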
Tactile feedback devices based on the electrostatic effect create a capacitance between the finger and the electrode of the device. The user's finger perceives an electrostatic friction force under the excitation signal applied to the electrode, and tactile sensations are rendered by varying this friction force[32,33]. In 1953, Mallinckrodt et al. discovered the electrostatic tactile effect[34]. Linjama et al. developed the E-Sense system in 2009[35]. The TeslaTouch system developed by Bau et al. produces electrostatic forces of up to 250 mN[36]; as shown in Figure 22, the device can reproduce diverse image textures on the screen surface. In 2012, Nokia Labs collaborated with the University of Cambridge to develop ET[37], a surface tactile feedback system that produces electrostatic force using graphene; as shown in Figure 23, its biggest advantage is that it can be applied to deformable touchscreens. As shown in Figure 24, Senseg developed the FeelScreen system in 2014, producing electrostatic force on a 7-inch tablet. Such devices can render shape, texture, and even softness, with low power consumption and light weight.
The limitation of the electrostatic effect is that the tactile sensation cannot be perceived by a static finger, and the friction force is on the order of a few hundred millinewtons, which restricts the achievable realism.
4.3 Tactile rendering
Tactile rendering algorithms obtain the positions of the user's fingers and produce force output through the device. They normally consist of three components: object modelling, collision detection, and force generation.
At present, tactile rendering algorithms for surface haptic devices are still in an early phase, and most systems use image-based rendering. In 2008, Vasudevan et al. proposed a texture rendering model based on the product of images and force-rendering masks[38]. In 2012, Saga et al. proposed a tactile rendering algorithm that uses the gradient information of the image to set the amplitude of the excitation signal[39]. Kim et al. proposed a tactile rendering algorithm[40] that maps the excitation amplitude to the product of the image gradient vector and the finger velocity vector, thus reproducing the geometric features of the image. Currently, tactile rendering is limited to two-dimensional images and mainly exploits the amplitude of the excitation signal; rendering algorithms for three-dimensional or even multi-dimensional features such as texture, shape, and roughness need further study, and could fully exploit the parameters of the excitation signal, including amplitude, frequency, and polarity.
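A minimal version of this gradient-velocity mapping is sketched below; the grayscale-image input, gain, and clipping are illustrative assumptions.

```python
import numpy as np

def excitation_amplitude(image, pos, vel, gain=0.5):
    """Map (image gradient . finger velocity) to the drive-signal amplitude.

    image: 2-D grayscale array in [0, 1]; pos: (row, col) finger position;
    vel: (d_row, d_col) finger velocity in pixels per frame.
    """
    grad_r, grad_c = np.gradient(image.astype(float))
    r, c = int(pos[0]), int(pos[1])
    g = np.array([grad_r[r, c], grad_c[r, c]])
    amp = gain * abs(g @ np.asarray(vel))  # strongest when stroking across an edge
    return float(np.clip(amp, 0.0, 1.0))   # normalized amplitude command
```

Because the output is proportional to the velocity component across the gradient, a stationary finger feels nothing, which matches the physics of electrovibration and squeeze-film devices.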
4.4 Typical applications
Tactile devices have been used to assist visually impaired people in reading text or images on webpages[41,42], to enhance the efficiency of menu or button manipulation on mobile touchscreens[43,44,45,46], and to develop mobile computer games with diversified tactile sensations. For example, Liu et al. developed a fruit-sorting game on an electrostatic feedback device, as shown in Figure 25[47]. During the game, five types of fruit (watermelons, apples, pears, tomatoes, and cherries) appear at random positions in the top area of the screen and then move at constant speed in a random direction (down, left, or right). Users are required to drag each randomly appearing fruit to the target fruit image of the same shape. Experimental results indicate that the added tactile feedback enhances both the users' performance and their interest in the game.
In the future, touchscreens with tactile feedback may find applications in mobile phones for improving the efficiency and accuracy of fine manipulations such as panning[47] and pinching and spreading gestures[48]. In the automobile industry, tactile feedback will benefit eyes-free interaction with touchscreens, enhancing safety during driving. For effective communication and teamwork in classrooms or meeting rooms using large wall-mounted screens, tactile feedback will support intuitive manipulation of graphical charts and widgets in brainstorming discussions.
5 Wearable haptics
5.1 Motivation
In recent years, low-cost HMDs such as the Oculus Rift and HTC Vive have heralded the booming era of VR. When a user encounters virtual objects in a high-fidelity, 3D graphically rendered virtual environment, it is human instinct to touch, grasp, and manipulate them.
The development of VR requires a novel type of haptic feedback that allows users to move in a large workspace, provides force feedback to the whole hand, and supports diverse gestures using the full DoF of the fingers for fine manipulation.
5.2 Wearable haptic device
The haptic glove is a typical form of wearable haptic device. Its main functions include multi-DoF whole-hand motion tracking and distributed force and tactile feedback to the fingertips and palm. Compared with desktop force-feedback devices such as the Phantom Desktop, haptic gloves allow users to touch and manipulate remote or virtual objects in an intuitive and direct way, exploiting the dexterous manipulation and sensitive perception capabilities of our hands. A well-designed glove should provide force and tactile feedback that realistically simulates touching and manipulating objects at a high update rate, while being lightweight and low cost.
For simplicity, according to the mechanoreceptors they target in the user's body (Figure 1), current wearable haptic devices fall into three types: kinesthetic/force feedback devices, tactile feedback devices, and integrated feedback devices. Each type can be further classified by different criteria. For example, based on where the device is grounded, wearable haptic devices can be dorsal-based, palm-based, or digit-based; based on actuation type, they can use electric motors, pneumatic or hydraulic actuators, or novel actuators made of functional materials. Readers may refer to existing surveys on force-feedback gloves[49] and wearable haptic devices[50].
Driven by strong application needs from virtual reality, many startup companies have developed haptic gloves. For readers aiming to quickly construct a virtual reality system with wearable haptic feedback, Table 1 summarizes existing commercial haptic gloves and other wearable haptic devices. Figure 26 shows photos of several representative commercial force-feedback gloves, including CyberGrasp, H-Glove, Dexmo, HaptX, Plexus, and VRgluv.
Table 1 Existing commercial haptic feedback gloves

Name of the device | Company | Motion tracking | Force feedback | Tactile feedback | Actuation principle | Sensing principle | Typical features
CyberGrasp[51] | Immersion | Via 22-sensor CyberGlove | One actuator per finger | - | Electric motor | 22-sensor CyberGlove | Feel the size and shape of virtual objects
H-Glove[52] | Haption | 3 DoF per finger | Force feedback on 3 fingers | - | Electric motor | - | Can be attached to a Virtuose 6D
Dexmo[53] | Dexta Robotics | 11 DoF hand tracking | Force feedback on 5 fingers | Three linear resonant actuators (LRAs) | Electric motor | Rotary sensors | Feel the shape, size and stiffness of virtual objects
HaptX Glove[54] | HaptX | 6 DoF per digit | Lightweight exoskeleton applies up to 4 lbs per finger | 130 stimulus points | Microfluidic array | Magnetic tracking | Feel the shape, texture and motion of virtual objects; sub-millimeter precision
Plexus[55] | Plexus | 21 DoF per hand | - | One tactile actuator per finger | - | Tracking adapters for the Vive, Oculus and Windows MR devices | 0.01-degree tracking precision
Sense Glove[56] | Sense Glove | 20 DoF finger tracking | Force feedback on each fingertip in the flexion/grasping direction | One haptic motor per finger | Electric motor | Sensors | Feel the shape and density of virtual objects
Avatar VR[57] | NeuroDigital Technologies | Full finger tracking | - | Ten vibrotactile actuators | Vibrotactile array | 6× 9-axis IMUs | Tracks the movements of chest, arms and hands
Maestro[58] | Contact CI | 13 DoF hand tracking | 5 exotendon restriction mechanisms | Vibration feedback at each fingertip | - | Motion capture combined with flex sensors | Precise gesture capture, nuanced vibration feedback and smart finger restriction
Senso Glove[59] | Senso Devices | Full 3D tracking of each finger | - | 5 vibration motors | - | 7 IMU sensors | Precisely tracks finger and hand positions in space and provides haptic feedback
Dextres Glove[60] | EPFL and ETH Zurich | - | Holding force on each finger | - | Electrostatic attraction | - | Weighs less than 8 grams per finger
VRgluv[61] | VRgluv | 12 DoF for the fingers of each hand | Up to 5 lbs of varying force per finger | - | DC motors | 5 sensors per finger | Simulates stiffness, shape and mechanical features
To provide more realistic haptic sensations, many research prototypes of force-feedback gloves have been developed[49,50]. Figure 27a shows a lightweight glove using pneumatically driven rigid actuators and an underactuated mechanism with a single transmission link, which effectively reduces the weight of the glove while maintaining a sufficient workspace[62]. To increase wearing comfort[63], fiber-reinforced soft bending actuators were used to develop a rigid-soft coupled glove (Figure 27b). Inspired by thin adhesive bandages, layer-jamming thin-sheet actuators were used to develop a very light glove, weighing only 25 grams, that produces torque feedback on the joints of all five fingers (Figure 27c).
Figure 28 shows several representative tactile feedback devices. In Figure 28a, the device provides normal indentation to the fingertip through a moving platform[64]. Figure 28b shows a lateral skin-stretch device using parallel mechanisms[65]. Figure 28c shows a 3-DoF cutaneous force-feedback device for exerting tactile stimuli on the fingertip[66]. While these devices can provide tactile stimuli, the user's finger may penetrate virtual objects because no kinesthetic feedback is provided. One open research question is how to develop lightweight wearable devices that provide integrated kinesthetic and cutaneous feedback on the fingertips and palm.
5.3 Haptic rendering
In addition to high-performance force-feedback gloves, hand-based haptic rendering algorithms and software are another important engine for the prosperity of wearable haptics. In hand-based haptic rendering, the user wears a haptic glove to control a hand avatar that touches and/or grasps virtual objects, receiving force feedback on the fingertips or even the whole hand surface.
Wearable haptics is an inherent demand of virtual reality: more intuitive gesture control of a hand avatar and the sensation of multi-point contact between hand and objects can greatly enhance the immersion of virtual reality manipulation. Hand-based haptic rendering will be useful in several application fields, including surgery, industrial manufacturing, e-business, and entertainment. Figure 29 shows three examples of hand-based haptic rendering: dexterous manipulation of rotating a pencil[67], virtual maintenance of mechanical structures[68], and grasping a deformable object[69]. In addition, Pacchierotti et al. presented a novel wearable 2-DoF cutaneous device for the finger, which conveys informative cutaneous stimuli and effectively improves perceived effectiveness[70]. Maisto et al. presented an experimental evaluation of two wearable haptic systems in three augmented reality applications, showing that haptic feedback through the considered wearable devices significantly improves performance on all the considered tasks[71].
The problem of hand-based haptic rendering can be conceptualized as the following pipeline. First, the joint angles of the hand avatar are measured in real time from a glove. Given the predefined configuration and the geometric/physical properties of the virtual objects, the task of haptic rendering is to solve for three types of output: the contact forces (especially the feedback forces on the virtual fingertips), the configuration of the hand avatar that satisfies the contact constraints, and the dynamic response of the manipulated objects (motion and/or deformation). Furthermore, the haptic and visual rendering must be kept consistent both temporally and spatially (i.e., the haptic and visual representations of the hand avatar are registered in real time).
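This pipeline can be summarized in code. In the sketch below, the glove/scene objects and both helper functions are placeholders for the algorithm families discussed in the following paragraphs, not a concrete library API.

```python
def detect_collisions(q_tracked, scene):
    """Stage 2 placeholder: find hand/object contacts for the tracked posture."""
    return []

def resolve_contacts(q_tracked, contacts):
    """Stage 3 placeholder: return a penetration-free hand pose and contact forces."""
    return q_tracked, {}

def hand_rendering_frame(glove, scene, dt=0.001):
    q_tracked = glove.read_joint_angles()                  # 1. sense hand posture
    contacts = detect_collisions(q_tracked, scene)         # 2. collision detection
    q_sim, forces = resolve_contacts(q_tracked, contacts)  # 3. contact response
    scene.step_dynamics(forces, dt)                        # 4. object motion/deformation
    glove.apply_fingertip_forces(forces)                   # 5. actuate fingertips
    return q_sim                        # 6. constrained pose for visual rendering
```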
Note that the haptic rendering pipeline in this paper targets force-feedback gloves based on the impedance-display control principle. In the literature, most force-feedback gloves use impedance display; only very few use admittance display, which requires extra force sensors to measure the contact force between the user's fingers and the glove.
Research on hand-based force rendering algorithms has evolved through three stages. In the early stage, many heuristic force-synthesis algorithms were proposed[72,73,74]. The principle of these methods is to define typical manipulation gestures and construct several gesture models offline. Rough collision detection is performed using simplified geometric models of the virtual hand and bounding boxes, and the collision response is solved by binding to the predefined gesture closest to the actual gesture, with grasps maintained by constraining the relative position and orientation of the object with respect to the palm. For example, Zachmann et al. used the distribution of contacts among finger phalanges, thumb phalanges, and palm to discriminate between push, precision grasp, and power grasp[72]. The advantage of these methods is that they allow stable grasping of virtual objects; however, they may produce unrealistic contacts between the virtual hand and objects, sometimes ignore object-object collisions, and must approximate the user's hand motions, which degrades simulation fidelity.
In the second stage, force rendering is based on accurate collision detection and collision response[75,76]. The main principle is to model the virtual hand as articulated rigid links, with the palm and phalanges represented by interconnected rigid bodies. Accurate collision detection is performed on the leaf elements of a hierarchical representation such as an octree, and multi-point contact response is solved with penalty- or constraint-based approaches. For example, Borst and Indugula[75] introduced the use of such models, placing linear and torsional spring-dampers between the simulated palm and phalanges and their tracked configurations. This method proved efficient enough to allow dexterous interaction with haptic feedback on the palm and fingers[76]. These rigid-body models are computationally efficient, but they do not account for skin deformation under contact, which is needed not only for more realistic-looking interaction and motion but also for simulating friction at contact surfaces more accurately.
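The coupling at the heart of this stage can be illustrated for a single phalanx: a spring-damper drags the simulated body toward its (possibly penetrating) tracked pose, while contacts act only on the simulated body. Below is a translational-only sketch with arbitrary gains; the torsional coupling is analogous.

```python
import numpy as np

def coupling_force(x_sim, v_sim, x_tracked, v_tracked, k=300.0, b=5.0):
    """Spring-damper pulling a simulated phalanx toward its tracked pose.

    Contacts constrain the simulated body; the coupling lets it lag behind
    the tracked pose at a contact instead of tunnelling through the object.
    """
    return k * (x_tracked - x_sim) + b * (v_tracked - v_sim)

# Tracked fingertip 1 cm "inside" an object -> 3 N pull on the simulated body
f = coupling_force(np.zeros(3), np.zeros(3),
                   np.array([0.0, 0.01, 0.0]), np.zeros(3))
```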
The third stage comprises fine force rendering algorithms that take into account the physiological and mechanical properties of the finger[67,77,78]. The main principle is to consider the physiological structure of the human hand, including bones, muscles, and skin; the nonlinear stress-strain characteristics of muscle; and detailed phenomena such as the variation of the contact force modulated by the local deformation between the fingertip and the manipulated object. For example, Talvas et al. proposed an approach based on novel aggregate constraints for simulating dexterous grasping with soft fingers[67]. The soft-finger model simulates the normal pressure at the contact point, along with friction and contact torque.
6 Future research topics
To develop high-fidelity devices that can simulate the abundant haptic properties humans perceive in the physical world, and thus improve immersion, interaction, and imagination, we identify the following gaps between the state of the art and this goal as future research challenges for the field.
6.1 Handheld haptic devices for VR
Given the limitations of wearable haptic devices, such as being difficult to put on and take off and difficult to adapt to different hand sizes, handheld haptic devices have in recent years become the dominant interaction interface for commercial VR systems (e.g., the Oculus Rift and HTC Vive).
Compared to desktop haptic devices (such as the Phantom Desktop) or wearable haptic devices (such as CyberGrasp), handheld devices have several features that suit them to VR. First, they support large-scale body movement. Furthermore, they are easy to pick up and put down: one can simply grab the controller and begin using it without strapping anything to the fingers. However, existing commercial VR controllers such as the HTC Vive's can only provide global vibrotactile feedback, which greatly limits the user's haptic experience. One open challenge is how to achieve richer haptic feedback within the compact volume of a handheld device, such as localized and diverse spatio-temporal vibrotactile patterns, texture, thermal feedback, skin stretch, softness, and contact. These patterns should be designed so that users can easily differentiate and recognize them without long learning periods. Solving this challenge will be crucial for designing future virtual reality systems for entertainment and education.
6.2 Multimodal haptic devices
To support highly immersive virtual reality experiences, developing multimodal haptic devices is an important topic. Such a device should reproduce multiple properties of virtual or remote objects, support the many gestures human hands use for fine manipulation, and stimulate multiple receptor types (both cutaneous and kinesthetic) of the human haptic channel. Haptically perceivable object properties fall into two categories: material and geometric. Principal material properties cover five aspects[79]: hardness (hard/soft), warmth (warm/cold), macro roughness (uneven/flat), fine roughness (rough/smooth), and friction (moist/dry, sticky/slippery). Geometric properties generally comprise shape (both global shape and local geometric features) and size (such as area, volume, perimeter, bounding-box volume, and so on)[80].
As defined by the metaphor of "digital clay"[81,82], the user can touch the multimodal haptic device with a bare hand, i.e., without wearing any sensors or actuators. The device tracks the movement of the user's hand and produces diverse haptic stimuli in a spatially collocated and temporally registered manner.
One promising topic is soft haptics, which exploits the latest technologies in soft robotics based on novel flexible and extensible materials. Promising solutions have been explored, for example, devices that change their shape and softness using particle jamming[82], change their surface roughness using the electrostatic effect[83], change their softness using tension control of a thin film[84], and render virtual tissue stiffness[85] and virtual volumetric shapes[86] using magnetic repulsion. How to integrate these haptic modalities within one soft structure remains an open challenge. Another potential topic is to develop flexible actuators for multimodal haptic devices using novel functional materials such as pneumatic networks[87], shape-memory polymers[88], and liquid crystal elastomers[89].
6.3 High-fidelity haptic rendering
Data-driven approaches based on large-scale datasets will be useful for the modeling, rendering, and evaluation of haptic devices. Several haptic texture databases based on tool-mediated interaction have been established for texture rendering and for the recognition and classification of texture features. Culbertson et al. established a haptic texture database covering 100 materials, with interaction signals measured via a hand-held pen-type tool sliding across the material surfaces[90]. Strese et al. introduced a haptic texture database of 43 textures for tool-mediated texture recognition and classification[91]; later, they used a handheld measurement tool to record friction, hardness, macroscopic roughness, microscopic roughness, and warmth while sliding across 108 surface materials[92]. Zheng et al. developed a haptic texture database containing the acceleration signals captured from 69 kinds of material surfaces via a hand-held rigid tool[93].
More effort should be devoted to developing large-scale, open-source, multimodal haptic datasets. Such datasets should support typical gestures and include the necessary force/motion signals: normal and friction forces, sliding velocity, acceleration, vibration, contact area, etc. These signals are necessary for estimating physical properties such as softness, friction, and texture. With large-scale samples collected and measured, machine learning methods such as convolutional neural networks (CNNs) could be used to estimate the values of the physical parameters.
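As one illustration of such a learning pipeline, the sketch below (assuming PyTorch; the window length, input channels, and five property outputs are illustrative choices, not a published architecture) maps windows of recorded interaction signals to estimated material properties.

```python
import torch
import torch.nn as nn

class HapticPropertyNet(nn.Module):
    """1-D CNN regressing material properties from interaction-signal windows.

    Input: (batch, 4, 1024) windows of, e.g., normal force, friction force,
    sliding velocity and acceleration; output: 5 values, e.g., hardness,
    warmth, macro roughness, fine roughness and friction.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(4, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1),           # pool over time to a fixed size
        )
        self.head = nn.Linear(64, 5)           # regress the property values

    def forward(self, x):
        return self.head(self.features(x).squeeze(-1))

model = HapticPropertyNet()
pred = model(torch.randn(8, 4, 1024))          # -> (8, 5) property estimates
```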
The challenge of haptic rendering for wearable haptics is to meet the conflicting requirements of realism and efficiency. For realism, haptic rendering needs to simulate a human hand manipulating diverse objects with diverse physical properties: the hand has over 20 DoFs of movement and should be modeled with three layers of nonhomogeneous physical properties, while the manipulated objects can be solid, elastic, plastic, or even fluid. Computational difficulties must be overcome for efficient and accurate collision detection, penetration-free collision response, and force computation that handles contact forces distributed over multiple contact regions between the hand and the objects. For efficiency, a 1 kHz update rate is normally needed to simulate hard contacts between the hand and stiff objects; parallel computing using multicore CPUs or GPUs could be explored.
6.4 Understanding on human haptics
Understanding the perceptual characteristics of multimodal haptic cues is a prerequisite for developing effective haptic devices. In particular, understanding the interaction between two types of haptic stimuli may provide guidelines for designing multimodal haptic devices. For example, tactile sensory acuity and the perceived intensity of tactile stimuli can be influenced by the temperature of the device contacting the skin and by the temperature of the skin itself.
Haptic illusions have been widely used to meet the compact-size requirement of wearable haptic devices. A number of illusions related to weight perception have been demonstrated, for example the thermal/weight[94], size/weight[95], and material/weight[96] illusions, as well as the "golf-ball" illusion[97]. These variations in weight perception may provide guidelines for developing new haptic devices. Although several illusions are well known (such as skin stretch for simulating weight and the thermal grill effect[98,99]), how to effectively exploit these illusions in multimodal haptic devices, and how to discover novel and effective ones, remain open questions.
7 Conclusions
In 1965, the concept of the "ultimate display" was proposed, marking the birth of virtual reality. To this day, the concept remains a dream to be fulfilled. Haptic feedback is an indispensable component of immersive and interactive VR systems. With the development of computing platforms over the past 30 years, the paradigm of haptic interaction has shifted through three stages: desktop haptics, surface haptics, and wearable haptics. To approach the goal of the "ultimate display", more powerful haptic devices need to be developed, and cross-disciplinary endeavors will be needed for innovative, high-fidelity haptic devices, in which novel sensors and actuators based on advanced, smart functional materials will find a place. The goal is to provide multimodal haptic stimuli that match the remarkable perceptual capabilities of humans.

References

1. Sutherland I E. The ultimate display. Multimedia: From Wagner to Virtual Reality, 1965: 506–508
2. Goertz R C. Master-slave manipulator. Office of Scientific and Technical Information (OSTI), 1949 DOI:10.2172/1054625
3. van Dam A. Post-WIMP user interfaces. Communications of the ACM, 1997, 40(2): 63–67 DOI:10.1145/253671.253708
4. Salisbury K, Brock D, Massie T, Swarup N, Zilles C. Haptic rendering: programming touch interaction with virtual objects. In: Proceedings of the 1995 Symposium on Interactive 3D Graphics. Monterey, California, USA: ACM, 1995: 123–130 DOI:10.1145/199404.199426
5. Srinivasan M A, Basdogan C. Haptics in virtual environments: Taxonomy, research status, and challenges. Computers & Graphics, 1997, 21(4): 393–404 DOI:10.1016/s0097-8493(97)00030-7
6. Ruspini D C, Kolarov K, Khatib O. The haptic display of complex graphical environments. In: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH), 1997: 345–352
7. Massie T, Salisbury K. The PHANToM haptic interface: A device for probing virtual objects. In: ASME International Mechanical Engineering Congress and Exhibition, 1994: 295–302
8. McNeely W A, Puterbaugh K D, Troy J J. Six degree-of-freedom haptic rendering using voxel sampling. In: ACM SIGGRAPH, 1999: 401–408
9. Lin M C, Otaduy M. Haptic Rendering: Foundations, Algorithms and Applications. CRC Press, 2008
10. Otaduy M A, Garre C, Lin M C. Representations and algorithms for force-feedback display. Proceedings of the IEEE, 2013, 101(9): 2068–2080 DOI:10.1109/jproc.2013.2246131
11. Ortega M, Redon S, Coquillart S. A six degree-of-freedom god-object method for haptic display of rigid bodies with surface properties. IEEE Transactions on Visualization and Computer Graphics, 2007, 13(3): 458–469 DOI:10.1109/tvcg.2007.1028
12. MacLean K E. Haptic interaction design for everyday interfaces. Reviews of Human Factors and Ergonomics, 2008, 4(1): 149–194 DOI:10.1518/155723408x342826
13. Wang D X, Shi Y J, Liu S, Zhang Y R, Xiao J. The influence of handle-avatar mapping uncertainty on torque fidelity of 6-DOF haptic rendering. In: World Haptics Conference (WHC). Daejeon, South Korea, 2013: 325–330 DOI:10.1109/WHC.2013.6548429
14. Wang D X, Shi Y J, Liu S, Zhang Y R, Xiao J. Haptic simulation of organ deformation and hybrid contacts in dental operations. IEEE Transactions on Haptics, 2014, 7(1): 48–60 DOI:10.1109/toh.2014.2304734
15. Yu G, Wang D X, Zhang Y R, Xiao J. Simulating sharp geometric features in six degrees-of-freedom haptic rendering. IEEE Transactions on Haptics, 2015, 8(1): 67–78 DOI:10.1109/toh.2014.2377745
16. Wang D X, Zhang Y R, Hou J X, Wang Y, Lv P, Chen Y G, Zhao H. iDental: A haptic-based dental simulator and its preliminary user evaluation. IEEE Transactions on Haptics, 2012, 5(4): 332–343 DOI:10.1109/toh.2011.59
17. Fukumoto M, Sugimura T. Active click: tactile feedback for touch panels. In: CHI Extended Abstracts on Human Factors in Computing Systems. Seattle, Washington, USA: ACM, 2001: 121–122 DOI:10.1145/634067.634141
18. Yatani K, Truong K N. SemFeel: a user interface with semantic tactile feedback for mobile touch-screen devices. In: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology. Victoria, BC, Canada: ACM, 2009: 111–120 DOI:10.1145/1622176.1622198
19. Sawada H, Takeda Y. Tactile pen for presenting texture sensation from touch screen. In: 8th International Conference on Human System Interaction (HSI). Warsaw, Poland, 2015: 334–339 DOI:10.1109/HSI.2015.7170689
20. Yang G H, Jin M S, Jin Y, Kang S. T-mobile: Vibrotactile display pad with spatial and directional information for hand-held device. In: IEEE/RSJ International Conference on Intelligent Robots and Systems. Taipei, Taiwan, China, 2010: 5245–5250 DOI:10.1109/IROS.2010.5651759
21. Poupyrev I, Rekimoto J, Maruyama S. TouchEngine: a tactile display for handheld devices. In: CHI Extended Abstracts on Human Factors in Computing Systems, 2002: 644–645
22. Kim S H, Sekiyama K, Fukuda T, Tanaka K, Itoigawa K. Development of dynamically re-formable input device in tactile and visual interaction. In: International Symposium on Micro-NanoMechatronics and Human Science. Nagoya, Japan, 2007: 544–549 DOI:10.1109/MHS.2007.4420914
23. Kajimoto H. Enlarged electro-tactile display with repeated structure. In: IEEE World Haptics Conference. Istanbul, Turkey: IEEE, 2011: 575–579 DOI:10.1109/WHC.2011.5945549
24. Harrison C, Hudson S E. Providing dynamically changeable physical buttons on a visual display. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Boston, MA, USA, 2009: 299–308
25. Kim S, Sekiyama K, Fukuda T. User-adaptive interface with reconfigurable keypad for in-vehicle information systems. In: International Symposium on Micro-NanoMechatronics and Human Science. Nagoya, Japan, 2008: 501–506 DOI:10.1109/MHS.2008.4752504
26. Watanabe T, Fukui S. A method for controlling tactile sensation of surface roughness using ultrasonic vibration. In: IEEE International Conference on Robotics and Automation. Nagoya, Japan: IEEE, 1995: 1134–1139 DOI:10.1109/ROBOT.1995.525433
27. Carter T, Seah S A, Long B, Drinkwater B, Subramanian S. UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. St Andrews, Scotland, UK: ACM, 2013: 505–514 DOI:10.1145/2501988.2502018
28. Winfield L, Glassmire J, Colgate J E, Peshkin M. T-PaD: tactile pattern display through variable friction reduction. In: Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Tsukuba, Japan, 2007: 421–426 DOI:10.1109/WHC.2007.105
29. Marchuk N D, Colgate J E, Peshkin M A. Friction measurements on a large area TPaD. In: IEEE Haptics Symposium. Waltham, MA, USA: IEEE, 2010: 317–320 DOI:10.1109/HAPTIC.2010.5444636
30. Mullenbach J, Shultz C, Piper A M, Peshkin M, Colgate J E. Surface haptic interactions with a TPad tablet. In: Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. St Andrews, Scotland, UK: ACM, 2013: 7–8 DOI:10.1145/2508468.2514929

31.

Yang Y, Zhang Y R, Lemaire-Semail B, Dai X W. Enhancing the Simulation of Boundaries by Coupling Tactile and Kinesthetic Feedback. Haptics: Neuroscience, Devices, Modeling, and Applications. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014: 19–26 DOI:10.1007/978-3-662-44196-1_3

32.

Vezzoli E, Messaoud W B, Amberg M, Giraud F, Lemaire-Semail B, Bueno M A. Physical and perceptual independence of ultrasonic vibration and electrovibration for friction modulation. IEEE Transactions on Haptics, 2015, 8(2): 235–239 DOI:10.1109/toh.2015.2430353

33.

Kim H, Kang J, Kim K D, Lim K M, Ryu J. Method for providing electrovibration with uniform intensity. IEEE Transactions on Haptics, 2015, 8(4): 492–496 DOI:10.1109/toh.2015.2476810

34.

Mallinckrodt E, Hughes A L, Sleator W. Perception by the skin of electrically induced vibrations. Science, 1953, 118(3062): 277–278 DOI:10.1126/science.118.3062.277

35.

Linjama J, Mäkinen V. E-sense screen: Novel haptic display with capacitive electrosensory interface. In: 4th Workshop for Haptic and Audio Interaction Design, 2009

36.

Bau O, Poupyrev I, Israr A, Harrison C. TeslaTouch: electrovibration for touch surfaces. In: Proceedings of the 23nd annual ACM symposium on User interface software and technology. New York, New York, USA: ACM, 2010: 283–292 DOI:10.1145/1866029.1866074

37.

Radivojevic Z, Beecher P, Bower C, Haque S, Andrew P, Hasan T, Bonaccorso F, Ferrari A C, Henson B. Electrotactile touch surface by using transparent graphene. In: Proceedings of the 2012 Virtual Reality International Conference. Laval, France: ACM, 2012: 1–3 DOI:10.1145/2331714.2331733

38.

Vasudevan H, Manivannan M. Tangible images: runtime generation of haptic textures from images. 2008 Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Reno, NE, USA, 2008: 357–360 DOI:10.1109/HAPTICS.2008.4479971

39.

Saga S, Deguchi K. Lateral-force-based 2. 5-dimensional tactile display for touch screen. IEEE Haptics Symposium (HAPTICS), Vancouver, BC, Canada: IEEE, 2012: 15–22 DOI:10.1109/HAPTIC.2012.6183764

40.

KimS-C, Israr A, Poupyrev I. Tactile rendering of 3D features on touch surfaces. In: Proceedings of the 26th annual ACM symposium on User interface software and technology. St. Andrews, Scotland, United Kingdom: ACM, 2013: 531−538 DOI:10.1145/2501988.2502020

41.

Xu C, Israr A, Poupyrev I, Bau O, Harrison C. Tactile display for the visually impaired using TeslaTouch. In: Extended Abstracts on Human Factors in Computing Systems. Vancouver, BC, Canada: ACM, 2011, 317–322 DOI:10.1145/1979742.1979705

42.

Israr A, Bau O, KimS-C, Poupyrev I. Tactile feedback on flat surfaces for the visually impaired. In: Human Factors in Computing Systems. Austin, Texas, USA: ACM, 2012: 1571–1576 DOI:10.1145/2212776.2223674

43.

Hoggan E, Brewster S A, Johnston J. Investigating the effectiveness of tactile feedback for mobile touchscreens. In: the SIGCHI Conference on Human Factors in Computing Systems. Florence, Italy, 2008, 1573–1582

44.

Levesque V, Oram L, MacLean K, Cockburn A, Marchuk N D, Johnson D, Colgate J E, Peshkin M A. Enhancing physicality in touch interaction with programmable friction. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Vancouver, BC, Canada: ACM; 2011: 2481–2490 DOI:10.1145/1978942.1979306

45.

Lévesque V, Oram L, MacLean K. Exploring the design space of programmable friction for scrolling interactions. Haptics Symposium (HAPTICS). Vancouver, BC, Canada: IEEE, 2012, 23–30 DOI:10.1109/HAPTIC.2012.6183765

46.

Shin H, Lim J M, Lee J U, Lee G, Kyung K U. Effect of tactile feedback for button GUI on mobile touch devices. ETRI Journal,2014,36(6):979–987 DOI:10.4218/etrij.14.0114.0028

47.

Liu G H, Sun X Y, Wang D X, Liu Y, Zhang Y R. Effect of electrostatic tactile feedback on accuracy and efficiency of Pan gestures on touch screens. IEEE Transactions on Haptics, 2018, 11(1): 51–60 DOI:10.1109/toh.2017.2742514

48.

Tran J J, Trewin S, Swart C, John B E, Thomas J C. Exploring pinch and spread gestures on mobile devices. In: Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services. Munich, Germany: ACM, 2013: 151–160 DOI:10.1145/2493190.2493221

49.

Wang D X, Song M, Naqash A, Zheng Y K, Xu W L, Zhang Y R. Toward whole-hand kinesthetic feedback: A survey of force feedback gloves. IEEE Transactions on Haptics, 2018: 1 DOI:10.1109/toh.2018.2879812

50.

Pacchierotti C, Sinclair S, Solazzi M, Frisoli A, Hayward V, Prattichizzo D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Transactions on Haptics, 2017, 10(4): 580–600 DOI:10.1109/toh.2017.2689006

51.

CyberGlove Systems LLC.http://www.cyberglovesystems. com/

52.

HGlove-HAPTIONSA.https://www.haption.com/en/products-en/hglove-en.html

53.

http://www.dextarobotics.com

54.

HaptX|HapticglovesforVR training, simulation, anddesign. http://haptx.com/

55.

Plexus/High-performanceVR/AR Gloves.http://plexus.im/

56.

Sense Glove | Sense Glove.https://www.senseglove.com/

57.

Avatar VR-NeuroDigital Technologies.http://www.neurodigital.es/avatarvr/

58.

Contact CI.https://contactci.co/

59.

Senso|Senso Glove|Senso Suit-probably the best controller for Virtual and Augmented reality.https: //senso. me/

60.

Ultra-light gloves let users “touch” virtual objects". https://actu.epfl.ch/news/ultra-light-gloves-let-users-touch-virtual-objects/

61.

VRgluv-Force Feedback Gloves for Virtual Reality. https://vrgluv.com/#contact

62.

Zheng Y K, Wang D X, Wang Z Q, Zhang Y, Zhang Y R, Xu W L. Design of a lightweight force-feedback glove with a large workspace. Engineering, 2018, 4(6): 869–880 DOI:10.1016/j.eng.2018.10.003

63.

Zhang Y, Wang D X, Wang Z Q, Wang Y P, Wen L, Zhang Y R. A two-fingered force feedback glove using soft actuators. Haptics Symposium (HAPTICS), San Francisco, CA, USA: IEEE, 2018: 186–191 DOI:10.1109/HAPTICS.2018.8357174

64.

Gabardi M, Solazzi M, Leonardis D, Frisoli A. A new wearable fingertip haptic interface for the rendering of virtual shapes and surface features. Haptics Symposium (HAPTICS). Philadelphia, PA, USA: IEEE, 2016: 140–146 DOI:10.1109/HAPTICS.2016.7463168

65.

Schorr S B, Okamura A M. Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE Transactions on Haptics, 2017, 10(3): 418–430 DOI:10.1109/toh.2017.2672969

66.

Prattichizzo D, Chinello F, Pacchierotti C, Malvezzi M. Towards wearability in fingertip haptics: A 3-DoF wearable device for cutaneous force feedback. IEEE Transactions on Haptics, 2013, 6(4): 506–516 DOI:10.1109/toh.2013.53

67.

Talvas A, Marchal M, Duriez C, Otaduy M A. Aggregate constraints for virtual manipulation with soft fingers. IEEE Transactions on Visualization and Computer Graphics, 2015, 21(4): 452–461 DOI:10.1109/tvcg.2015.2391863

68.

Jacobs J, Stengel M, Froehlich B. A generalized God-object method for plausible finger-based interactions in virtual environments. IEEE Symposium on 3D User Interfaces (3DUI). Costa Mesa, CA, USA: 2012, 43–51 DOI:10.1109/3DUI.2012.6184183

69.

Cui T, Song A G, Xiao J. Modeling global deformation using circular beams for haptic interaction. RSJ International Conference on Intelligent Robots and Systems. St. Louis, MO, USA: IEEE, 2009, 1743–1748 DOI:10.1109/IROS.2009.5354710

70.

Pacchierotti C, Salvietti G, Hussain I, Meli L, Prattichizzo D. The hRing: A wearable haptic device to avoid occlusions in hand tracking. Haptics Symposium (HAPTICS), Philadelphia, PA, USA, IEEE, 2016: 134–139 DOI:10.1109/HAPTICS.2016.7463167

71.

Maisto M, Pacchierotti C, Chinello F, Salvietti G, de Luca A, Prattichizzo D. Evaluation of wearable haptic systems for the fingers in augmented reality applications. IEEE Transactions on Haptics, 2017, 10(4): 511–522 DOI:10.1109/toh.2017.2691328

72.

Zachmann G, Rettig A. Natural and robust interaction in virtual assembly simulation. In: Eighth ISPE International Conference on Concurrent Engineering: Research and Applications (ISPE/CE2001), 2001, 425–434

73.

Moehring M, Froehlich B. Pseudo-physical interaction with a virtual car interior in immersive environments. In: the 11th Eurographics conference on Virtual Environments. Aalborg, Denmark: 2005, 181−–189

74.

Holz D, Ullrich S, Wolter M, Kuhlen T. Multi-Contact Grasp Interaction for Virtual Environments. Journal of Virtual Reality and Broadcasting, 2008, 5

75.

Borst C W, Indugula A P. Realistic virtual grasping. In: VR 2005, Virtual Reality. Bonn, Germany: IEEE, 2005, 91–98 DOI:10.1109/VR.2005.1492758

76.

Ott R, de Perrot V, Thalmann D, Vexo F. MHaptic: a haptic manipulation library for generic virtual environments. In: International Conference on Cyberworlds. Hannover, Germany, 2007: 338–345 DOI:10.1109/CW.2007.54

77.

Gourret J P, Thalmann N M, Thalmann D. Simulation of object and human skin formations in a grasping task. ACM SIGGRAPH Computer Graphics, 1989, 23(3): 21–30 DOI:10.1145/74334.74335

78.

Garre C, Hernández F, Gracia A, Otaduy M A. Interactive simulation of a deformable hand for haptic rendering. In: IEEE World Haptics Conference. Istanbul, Turkey: IEEE, 2011, 239−244 DOI:10.1109/WHC.2011.5945492

79.

Okamoto S, Nagano H, Yamada Y. Psychophysical dimensions of tactile perception of textures. IEEE Transactions on Haptics, 2013, 6(1): 81–93 DOI:10.1109/toh.2012.32

80.

Lederman S J, Klatzky R L. Haptic perception: A tutorial. Attention, Perception & Psychophysics, 2009, 71(7): 1439–1459 DOI:10.3758/app.71.7.1439

81.

Rossignac J, Allen M, Book W J, Glezer A, Ebert-Uphoff I, Shaw C, Rosen D, Askins S, Bai J, Bosscher P, Gargus J, Kim B M, Llamas I, Nguyen A, Yuan G, Zhu H H. Finger sculpting with Digital Clay: In: 3D shape input and output through a computer-controlled real surface. 2003 Shape Modeling International. Seoul, South Korea: 2003, 229–231 DOI:10.1109/SMI.2003.1199620

82.

Stanley A A, Okamura A M. Deformable model-based methods for shape control of a haptic jamming surface. IEEE Transactions on Visualization and Computer Graphics, 2017, 23(2): 1029–1041 DOI:10.1109/tvcg.2016.2525788

83.

Lu L, Zhang Y, Guo X. A Surface Texture Display for Flexible Virtual Objects. In: AsiaHaptics Conference, 2018

84.

Fani S, Ciotti S, Battaglia E, Moscatelli A, Bianchi M. W-FYD: A wearable fabric-based display for haptic multi-cue delivery and tactile augmented reality. IEEE Transactions on Haptics, 2018, 11(2): 304–316 DOI:10.1109/toh.2017.2708717

85.

Tong Q, Yuan Z, Liao X, Zheng M, Yuan T, Zhao J. Magnetic Levitation Haptic Augmentation for Virtual Tissue Stiffness Perception. IEEE Transactions on Visualization and Computer Graphics, 2018, 24(12): 3123−3136

86.

Adel A, Micheal M M, Self M A, Abdennadher S, Khalil I S M, Rendering of Virtual Volumetric Shapes Using an Electromagnetic-Based Haptic Interface. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid, Spain: 2018, 1–9

87.

Mosadegh B, Polygerinos P, Keplinger C, Wennstedt S, Shepherd R F, Gupta U, Shim J, Bertoldi K, Walsh C J, Whitesides G M. Pneumatic networks for soft robotics that actuate rapidly. Advanced Functional Materials, 2014, 24(15): 2163–2170 DOI:10.1002/adfm.201303288

88.

Ge Q, Sakhaei A H, Lee H, Dunn C K, Fang N X, Dunn M L. Multimaterial 4D printing with tailorable shape memory polymers. Scientific Reports, 2016, 6: 31110 DOI:10.1038/srep31110

89.

Ware T H, McConney M E, Wie J J, Tondiglia V P, White T J. Voxelated liquid crystal elastomers. Science, 2015, 347(6225): 982–984 DOI:10.1126/science.1261019

90.

Culbertson H, López Delgado J J, Kuchenbecker K J. One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects. IEEE Haptics Symposium (HAPTICS), Houston, TX, USA: 2014, 319–325 DOI:10.1109/HAPTICS.2014.6775475

91.

Strese M, Lee J Y, Schuwerk C, Han Q F, Kim H G, Steinbach E. A haptic texture database for tool-mediated texture recognition and classification. In: International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings. Richardson, TX, USA: IEEE, 2014, 118–123 DOI:10.1109/HAVE.2014.6954342

92.

Strese M, Boeck Y, Steinbach E. Content-based surface material retrieval. World Haptics Conference (WHC). Munich, Germany: IEEE, 2017, 352–357 DOI:10.1109/WHC.2017.7989927

93.

Zheng H T, Fang L, Ji M Q, Strese M, Ozer Y, Steinbach E. Deep learning for surface material classification using haptic and visual information. IEEE Transactions on Multimedia, 2016, 18(12): 2407–2416 DOI:10.1109/tmm.2016.2598140

94.

Stevens J C. Thermal intensification of touch sensation: Further extensions of the Weber phenomenon. Sensory Processes, 1979, 3(3), 240–248

95.

Charpentier A. Analyse experimentale de quelques elements de la sensation de poids. Archive de Physiologie normale et pathologiques, 1981, 3, 122–135

96.

Ellis R R, Lederman S J. The material-weight illusion revisited. Perception & Psychophysics, 1999, 61(8): 1564–1576 DOI:10.3758/bf03213118

97.

Ellis R R, Lederman S J. The golf-ball illusion: Evidence for top-down processing in weight perception. Perception, 1998, 27(2): 193–201 DOI:10.1068/p270193

98.

Craig A, Bushnell M. The thermal grill illusion: Unmasking the burn of cold pain. Science, 1994, 265(5169): 252–255 DOI:10.1126/science.8023144

99.

Leung A Y, Wallace M S, Schulteis G, Yaksh T L. Qualitative and quantitative characterization of the thermal grill. Pain, 2005, 116(1): 26–32 DOI:10.1016/j.pain.2005.03.026