A Generalized Tracking Wall Approach to the Haptic Simulation of Tip Forces During Needle Insertion
Haptic simulation of needle insertion requires both a needle-tissue interaction model and a method to render the outputs of this model into real-time force feedback for the user. In comparison with interaction models, rendering methods in the literature have seen little development and are either oversimplified or too computationally complex. Therefore, this study introduces the Generalized Tracking Wall (GTW) approach, a haptic rendering method inspired by the proxy approach. It aims to accurately simulate the interaction between a needle tip and soft tissues without the complex calculations of tissue deformations. The essence of the proposed method is that it associates an algorithm based on the energetic analysis of cutting with a contact model capable of simulating viscoelasticity and nonlinearity. This association proved to be a potent tool to faithfully replicate the different phases of needle insertion while adhering to the underlying physics. Multi-layered-tissue insertions are also considered. The performance and genericity of the GTW are first evaluated through simulations. Then, the GTW is experimentally compared to empirical methods inspired by the literature.
A Novel Ungrounded Haptic Device for Generation and Orientation of Force and Torque Feedbacks
To provide deeper immersion for the user in virtual environments, both force and torque feedback are required, rather than the mere use of visual and auditory cues. In this paper, we develop a novel propeller-based Ungrounded Handheld Haptic Device (UHHD) that delivers both force and torque feedback in a single device to help the user experience a realistic sensation of immersion in three-dimensional (3D) space. The proposed UHHD uses only a pair of propellers and a set of sliders to continuously generate the desired force and torque feedback, up to 15 N and 1 N·m in magnitude, respectively, in less than 370 ms. The produced force and torque are oriented in a desired direction using a gimbal mechanism, inside which the propellers are mounted, yielding a simple structure. These features facilitate the control of the proposed UHHD and enhance its practicality in various applications. To prove the capability of the system, we model it and elaborate on the force and torque analyses. Next, we develop a robust parallel force/position controller to tackle structured and unstructured uncertainties. Finally, a measurement setup is manufactured to experimentally evaluate the performance of the UHHD and the controller. The implementation of the controller on the developed UHHD prototype shows that satisfactory control performance is achievable in terms of delivering the desired force and torque feedback.
A Visuo-Haptic System for Nodule Detection Training: Insights from EEG and behavioral analysis
Medical palpation is a key skill for clinicians. It is typically trained using animal and synthetic models, which, however, raise ethical concerns and produce high volumes of consumables. An alternative could be visuo-haptic simulations, although their training efficacy has not yet been proven. The assessment of palpatory skills requires objective methods, which can be achieved by combining performance metrics with electroencephalography (EEG). The goals of this study were to: (i) develop a visuo-haptic system to train nodule detection, combining a Geomagic Touch haptic device with a visuo-haptic simulation of a skin patch and a nodule, implemented using the SOFA framework; (ii) assess whether this system could be used for training and evaluation. To do so, we collected performance and EEG data from 19 subjects performing multiple repetitions of a nodule detection task. Results revealed that participants could be divided into low and high performers: the former applied greater pressure when looking for the nodule and showed a higher EEG alpha (8.5-13 Hz) peak at rest; the latter explored the skin while remaining on its surface and were characterized by low alpha power. Furthermore, alpha power correlated positively with error and negatively with palpation depth. Altogether, these results suggest that alpha power might be an indicator of performance, denoting an increase in vigilance, attention, information processing, cognitive processes, and engagement, ultimately affecting strategy and performance. Moreover, the combination of EEG with performance data can provide an objective measure of the user's palpation ability.
Perceptual Constancy in the Speed Dependence of Friction During Active Tactile Exploration
Fingertip friction is a key component of tactile perception. In active tactile exploration, friction forces depend on the applied normal force and on the chosen sliding speed. We investigated whether humans perceive the speed dependence of friction for textured material surfaces whose friction coefficient either increases or decreases with speed. Participants perceived the decrease or increase when the relative difference in friction coefficient between fast and slow sliding speeds was more than 20%. The fraction of comparison judgments in agreement with the measured difference in friction coefficient did not depend on variations in the applied normal force. The results indicate a perceptual constancy for fingertip friction with respect to self-generated variations of sliding speed and applied normal force.
HM-Array: A Novel Haptic Magnetism-based Leader-follower Platform for Minimally Invasive Robotic Surgery
Ensuring the safety and authenticity of haptic feedback is crucial in the domain of surgical operations, particularly in procedures like Natural Orifice Transluminal Endoscopic Surgery (NOTES) and Minimally Invasive Robotic Surgery (MIRS). To enhance the control efficiency of the robotic operating console, we propose a haptic magnetism-based array (HM-Array). This system employs a solenoid array and a detection stylus to achieve controller localization without the need for additional sensors, while simultaneously generating haptic effects. The device effectively controls the surgical robot's pose through a localization-haptic combined loopback. The entire system is scheduled on a finite state machine (FSM), seamlessly fusing localization and haptic generation. Psychometric evaluations conducted through user studies have demonstrated the device's precision and accuracy. Teleoperation experimental results further confirm its potential value in surgical treatments and broader medical haptic applications.
Multichannel Vibrotactile Glove: Validation of a new device designed to sense vibrations
There is a growing interest in using the tactile modality as a compensation or sensory augmentation tool in various fields. The Multichannel Vibrotactile Glove was designed to meet the needs of these diverse disciplines and overcome the limitations of current sound-to-touch technologies. Using 12 independent haptic exciters on the back of each finger and on the palm, the device can convey acoustic information to cutaneous vibrotactile receptors with precise control of location, frequency, timing, and intensity. A staircase method was used to model vibration detection thresholds at six frequencies (100, 200, 250, 500, 800, 1000 Hz) for each actuator position (All, Thumb, Index, Major, Middle, Pinky, Palm) and both hands (Right, Left). No between-hand difference was observed, and all finger actuators provided consistent thresholds, except for the palm, which exhibited higher thresholds. Spatial summation effects were observed when all actuators were activated simultaneously. Detection thresholds significantly increased at 100 Hz and above 500 Hz. These findings confirm that the system provides uniform stimulation across hands and actuators. Overall, the Multichannel Vibrotactile Glove provides the freedom to send various acoustic features to individual actuators, offering a versatile tool for research and a potential technology to substitute, compensate, or extend sensory perception.
Two rapid alternatives compared to the staircase method for the estimation of the vibrotactile perception threshold
Wearable vibrotactile devices now seem mature enough to enter the daily lives and practices of more and more users. However, vibrotactile perception can differ greatly between individuals, in terms of psychophysics and physiology, not to mention higher levels (cognitive or affective, for example). Broadly distributed and affordable vibrotactile devices must therefore be adapted to each user's own perception, first of all by delivering intensity levels that lie within the user's perceptible range. This implies determining the user's own perception thresholds and then adapting the devices' output levels. Classical methods for threshold estimation entail overly long procedures, and little is known about the reliability of other methods in the vibrotactile domain. This article focuses on two alternative methods for estimating amplitude thresholds in the vibrotactile modality (the "increasing-intensity" and "decreasing-intensity" methods) and compares their estimates to those of a staircase method. Both rapid methods result in much shorter test durations and are found to be less stressful and tiring than the classic method, while yielding threshold estimates that never differ by more than 1.5 JND from those of the classic method.
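The staircase procedure that the two abstracts above use as a reference baseline can be sketched in a few lines. The following is a minimal, illustrative simulation, not the authors' protocol: the simulated observer, noise level, step size, and stopping rule are all assumptions. It implements a one-up/two-down staircase, which converges near the 70.7%-detection level of the psychometric function.

```python
import random

def simulated_observer(level, threshold):
    """Toy observer: detects the stimulus when its level exceeds a noisy threshold."""
    return level >= threshold + random.gauss(0, 0.5)

def staircase_1up2down(threshold, start=10.0, step=1.0, reversals_needed=8):
    """One-up/two-down staircase: two detections -> step down, one miss -> step up.
    The threshold estimate is the mean stimulus level at the reversal points."""
    level, correct_streak, direction = start, 0, -1
    reversals = []
    while len(reversals) < reversals_needed:
        if simulated_observer(level, threshold):
            correct_streak += 1
            if correct_streak == 2:
                correct_streak = 0
                if direction == +1:        # descending after ascending: reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:
            correct_streak = 0
            if direction == -1:            # ascending after descending: reversal
                reversals.append(level)
            direction = +1
            level += step
    return sum(reversals) / len(reversals)

random.seed(0)
estimate = staircase_1up2down(threshold=5.0)
print(round(estimate, 2))  # typically lands near the simulated threshold of 5.0
```

The rapid "increasing-intensity" and "decreasing-intensity" alternatives are essentially single ascending or descending passes, which is why they trade some precision for a much shorter test duration.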
Investigating the Kappa Effect Elicited Through Concurrent Visual and Tactile Stimulation
The experience of time and space in subjective perception is closely connected. The Kappa effect refers to the phenomenon whereby the perceived duration of the interval between stimuli is influenced by the spatial distance between them. In this study, we aimed to explore the Kappa effect from a psychophysical perspective. We investigated participants' perception of temporal duration in the sub-second range by delivering visual and tactile inputs through wearable devices placed on both the palm and the forearm. We compared the impact of unimodal sensory stimulation, involving either visual or tactile stimuli, with different bimodal stimulation conditions. Our results revealed that the illusory effect on inter-stimulus duration perception can be observed in both unimodal conditions, although the distortions were significantly more pronounced in vision. In the multimodal stimulation condition, where visual stimuli were presented at non-equidistant spatial locations, the integration of tactile input did not reduce the Kappa effect, regardless of the spatial location of the tactile stimuli. However, when the visual stimuli were equidistant in space, regardless of the spatial location of the tactile stimuli, the Kappa effect disappeared. These results shed light on the role played by multimodality in the perception of space and time.
Neuromuscular Interfacing for Advancing Kinesthetic and Teleoperated Programming by Demonstration of Collaborative Robots
This study addresses the challenges of Programming by Demonstration (PbD) in the context of collaborative robots, focusing on the need to provide additional degrees of programming without hindering the user's ability to demonstrate trajectories. PbD enables intuitive programming of robots through demonstrations, allowing non-expert users to teach robot skills without coding. The two main PbD modalities, observational and kinesthetic, have limitations when it comes to programming the diverse functionalities offered by modern collaborative robots. To overcome these limitations, the study proposes a wearable human-robot interface based on surface electromyography (sEMG) to measure the forearm's muscle co-contraction level, enabling additional programming inputs through modulation of the hand stiffening level without interfering with voluntary movements. Vibrotactile feedback enhances the operator's understanding of the additional programming inputs during PbD tasks. The proposed approach is demonstrated through experiments involving a collaborative robot performing an industrial wiring task. The results showcase the effectiveness and intuitiveness of the interface, allowing simultaneous programming of robot compliance and gripper grasping. The framework, applicable to both teleoperation and kinesthetic teaching, was demonstrated effectively in the industrial wiring task with a 100% success rate over the group of subjects. Furthermore, the presence of vibrotactile feedback yielded an average 33% decrease in programming errors, and statistical analyses confirmed the subjects' ability to correctly modulate co-contraction levels. This innovative framework augments programming by demonstration by integrating neuromuscular interfacing and introducing structured programming logics, providing intuitive human-robot interaction for programming both the gripper and compliance in teleoperation and kinesthetic teaching.
Design and Characterisation of Multi-cavity, Fluidic Haptic Feedback System for Mechano-tactile Feedback
Numerous studies have indicated that the use of a closed-loop haptic feedback system, which offers various mechano-tactile stimulus patterns with different actuation methods, can improve the performance and grasp control of prosthetic hands. Purely mechanically driven feedback approaches for such stimulus patterns, however, have not been explored. In this paper, a multi-cavity fluidic haptic feedback system is introduced, with details of design, fabrication, and validation. The system can detect physical touch, including its direction, at the fingertip sensor. The direction of the force is reflected as a pressure deviation across the multi-cavity fingertip sensor. The feedback actuator generates various mechano-tactile stimulus patterns according to the pressure deviation from the fingertip sensor. Hence, users can identify the force and its direction from the stimulus patterns. The haptic feedback system is validated through two experiments. The first characterises the system and establishes the relationship between the fingertip sensor and the feedback actuator. The second, a human interaction test, confirms the system's capability to detect forces with direction and to generate corresponding tactile stimuli in the feedback actuator. The outcomes corroborate the idea that participants are generally capable of discerning changes in angle.
Perceiving Synchrony: Determining Thermal-tactile Simultaneity Windows
In cutaneous displays in which both tactile and thermal signals are presented, it is important to understand the temporal requirements associated with presenting these signals so that they are perceptually synchronous. Such synchrony is important to provide realistic touch experiences in applications involving object recognition and social touch interactions. In the present experiment, the temporal window within which tactile and warm thermal stimuli are perceived to occur at the same time was determined. A Simultaneity Judgment Task was used in which pairs of tactile and thermal stimuli were presented on the hand at varying stimulus onset asynchronies, and participants determined whether the stimuli were simultaneous or not. The results indicated that the average simultaneity window width was 1041 ms. The average point of subjective simultaneity (PSS) was -569 ms, indicating that participants perceived simultaneity best when the warm thermal stimulus preceded the tactile stimulus by 569 ms. These findings indicate that thermal and tactile stimuli do not need to be displayed simultaneously to be perceived as synchronous; the timing of such stimuli can therefore be adjusted to maximize the likelihood that both will be perceived.
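As a rough illustration of how a point of subjective simultaneity (PSS) and a simultaneity window can be extracted from simultaneity-judgment data, the sketch below fits a Gaussian-shaped simultaneity curve by its moments. The SOA values and response proportions are invented for illustration and are not the study's data; the moment-based fit is one simple choice among several (maximum-likelihood fits are also common).

```python
# SOAs in ms: negative = the warm thermal stimulus leads the tactile stimulus.
soas = [-1200, -900, -600, -300, 0, 300]
p_simult = [0.20, 0.55, 0.90, 0.80, 0.40, 0.10]  # proportion judged "simultaneous"

# Moment-based fit of a Gaussian-shaped simultaneity curve:
# PSS = probability-weighted mean SOA (peak of the curve),
# window = full width taken here as 2 standard deviations.
total = sum(p_simult)
pss = sum(s * p for s, p in zip(soas, p_simult)) / total
var = sum(p * (s - pss) ** 2 for s, p in zip(soas, p_simult)) / total
window = 2 * var ** 0.5

print(round(pss), round(window))
```

With these illustrative numbers, the PSS comes out negative, mirroring the study's qualitative finding that simultaneity is perceived best when the thermal stimulus leads.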
VT-SGN: Spiking Graph Neural Network for Neuromorphic Visual-Tactile Fusion
Current issues with neuromorphic visual-tactile perception include the limited representational power of training networks and inadequate cross-modal fusion. To address these two issues, we propose a dual network called the visual-tactile spiking graph neural network (VT-SGN), which combines graph neural networks and spiking neural networks to jointly exploit neuromorphic visual and tactile source data. First, the neuromorphic visual-tactile data are expanded spatiotemporally to create a taxel-based tactile graph in the spatial domain, enabling full exploitation of the irregular spatial structure of tactile information. Subsequently, a method for converting images into graph structures is proposed, allowing the visual stream to be trained alongside graph neural networks and graph-level visual features to be extracted for fusion with tactile data. Finally, the data are expanded into the time domain using a spiking neural network to train the model via backpropagation. This framework effectively exploits the structural differences between sample instances in the spatial dimension to improve the representational power of spiking neurons, while preserving the biodynamic mechanism of the spiking neural network. Additionally, it effectively addresses the morphological variance between the two modalities and further exploits complementary information between vision and touch. To demonstrate that our approach can improve the learning of neuromorphic perceptual information, we conducted comprehensive comparative experiments on three datasets, validating the benefits of the proposed VT-SGN framework against state-of-the-art studies.
Tactile Weight Rendering: A Review for Researchers and Developers
Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this aim by applying forces on the limbs, tactile interfaces acting on the skin have recently offered potential solutions to enhance or substitute kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches based on their type of stimulation into asymmetric vibration and skin stretch, further divided according to the working mechanism of the devices. Then, we compared these approaches using various criteria, including the physical, mechanical, and perceptual characteristics of the reported devices. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discuss the rationale behind the proposed device categorization, together with limitations and opportunities for future research. We hope this study guides the development and use of tactile interfaces to achieve more naturalistic object interaction and manipulation in virtual environments.
Passive Realizations of Series Elastic Actuation: Effects of Plant and Controller Dynamics on Haptic Rendering Performance
We introduce minimal passive physical realizations of series (damped) elastic actuation (S(D)EA) under closed-loop control to determine the effect of different plant parameters and controller gains on the closed-loop performance of the system and to establish an intuitive understanding of the passivity bounds. Furthermore, we explicitly derive the feasibility conditions for these passive physical equivalents and compare them to the necessary and sufficient conditions for the passivity of S(D)EA under velocity-sourced impedance control (VSIC) to establish their relationship. Through the passive physical equivalents, we rigorously compare the effect of different plant dynamics (e.g., SEA and SDEA) on the system performance. We demonstrate that passive physical equivalents make the effect of controller gains explicit and establish a natural means for effective impedance analysis. We also show that passive physical equivalents promote co-design thinking by enforcing simultaneous and unbiased consideration of (possibly negative) controller gains and plant parameters. We demonstrate the usefulness of negative controller gains when coupled with properly designed plant dynamics. Finally, we provide experimental validations of our theoretical passivity results and comprehensive characterizations of the haptic rendering performance of S(D)EA under VSIC.
Validation of Snaptics: A Modular Approach to Low-Cost Wearable Multi-Sensory Haptics
Wearable haptic devices provide touch feedback to users for applications including virtual reality, prosthetics, and navigation. When these devices are designed for experimental validation in research settings, they are often highly specialized and customized to the specific application being studied. As such, it can be difficult to replicate device hardware due to the associated high costs of customized components and the complexity of their design and construction. In this work, we present Snaptics, a simple and modular platform designed for rapid prototyping of fully wearable multi-sensory haptic devices using 3D-printed modules and inexpensive off-the-shelf components accessible to the average hobbyist. We demonstrate the versatility of the modular system and the salience of haptic cues produced by wearables constructed with Snaptics modules in two human subject experiments. First, we report on the identification accuracy of multi-sensory haptic cues delivered by a Snaptics device. Second, we compare the effectiveness of the Snaptics Vibrotactile Bracelet to the Syntacts Bracelet, a high-fidelity wearable vibration feedback bracelet, in assisting participants with a virtual reality sorting task. Results indicate that participant performance in perceiving cue sets and completing tasks was comparable when interacting with low-cost Snaptics devices and with similar research-grade haptic wearables.
Tactile Perception in Upper Limb Prostheses: Mechanical Characterization, Human Experiments, and Computational Findings
Tactile feedback is essential for upper-limb prostheses functionality and embodiment, yet its practical implementation presents challenges. Users must adapt to non-physiological signals, increasing cognitive load. However, some prosthetic devices transmit tactile information through socket vibrations, even to untrained individuals. Our experiments validated this observation, demonstrating a user's surprising ability to identify contacted fingers with a purely passive, cosmetic hand. Further experiments with advanced soft articulated hands revealed decreased performance in tactile information relayed by socket vibrations as hand complexity increased. To understand the underlying mechanisms, we conducted numerical and mechanical vibration tests on four prostheses of varying complexity. Additionally, a machine-learning classifier identified the contacted finger based on measured socket signals. Quantitative results confirmed that rigid hands facilitated contact discrimination, achieving 83% accuracy in distinguishing index finger contacts from others. While human discrimination decreased with advanced hands, machine learning surpassed human performance. These findings suggest that rigid prostheses provide natural vibration transmission, potentially reducing the need for tactile feedback devices, which advanced hands may require. Nonetheless, the possibility of machine learning algorithms outperforming human discrimination indicates potential to enhance socket vibrations through active sensing and actuation, bridging the gap in vibration-transmitted tactile discrimination between rigid and advanced hands.
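The contacted-finger classification described above can be illustrated with a minimal nearest-centroid classifier on toy vibration features. The feature definitions, class centers, and noise level below are assumptions made for illustration; they are not the study's measured socket signals, and the study's actual machine-learning model is not specified here.

```python
import math
import random

random.seed(1)

# Toy features per contact event: (RMS amplitude, dominant-frequency proxy)
# of the socket vibration. Class centers are invented, well-separated values.
CENTERS = {"index": (1.0, 0.2), "middle": (0.6, 0.5), "ring": (0.3, 0.8)}

def make_sample(finger):
    cx, cy = CENTERS[finger]
    return (cx + random.gauss(0, 0.05), cy + random.gauss(0, 0.05))

train = [(make_sample(f), f) for f in CENTERS for _ in range(20)]

# Nearest-centroid classifier: average each class's training features,
# then assign a new sample to the class with the closest centroid.
centroids = {}
for f in CENTERS:
    pts = [x for x, label in train if label == f]
    centroids[f] = tuple(sum(c) / len(pts) for c in zip(*pts))

def classify(sample):
    return min(centroids, key=lambda f: math.dist(sample, centroids[f]))

test_set = [(make_sample(f), f) for f in CENTERS for _ in range(10)]
accuracy = sum(classify(x) == y for x, y in test_set) / len(test_set)
```

On these cleanly separated toy features the classifier performs near-perfectly; the study's point is that real socket vibrations from rigid hands are similarly separable, whereas advanced articulated hands damp and mix the signals, making both human and machine discrimination harder.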
Optimized Sandwich and Topological Structures for Enhanced Haptic Transparency
Humans rely on multimodal perception to form representations of the world. This implies that environmental stimuli must remain consistent and predictable throughout their journey to our sensory organs. When it comes to vision, electromagnetic waves are minimally affected when passing through air or glass treated for chromatic aberrations. Similar conclusions can be drawn for hearing and acoustic waves. However, tools that propagate elastic waves to our cutaneous afferents tend to color tactual perception due to parasitic mechanical attributes such as resonances and inertia. These issues are often overlooked, despite their critical importance for haptic devices that aim to faithfully render or record tactile interactions. Here, we investigate how to optimize this mechanical transmission with sandwich structures made from rigid, lightweight carbon fiber sheets arranged around a 3D-printed lattice core. Through a comprehensive parametric evaluation, we demonstrate how this design paradigm provides superior haptic transparency, regardless of the lattice types. Drawing an analogy with topology optimization, our solution approaches a foreseeable technological limit. It offers a practical way to create high-fidelity haptic interfaces, opening new avenues for research on tool-mediated interactions.
Effect of Finger Moisture on Tactile Perception of Electroadhesion
We investigate the effect of finger moisture on the tactile perception of electroadhesion with 10 participants. Participants with moist fingers exhibited markedly higher threshold levels. Our electrical impedance measurements show a substantial reduction in impedance magnitude when sweat is present at the finger-touchscreen interface, indicating increased conductivity. Supporting this, our mechanical friction measurements show that the relative increase in electrostatic force due to electroadhesion is lower for a moist finger.
3D-Printed Models for Optimizing Tactile Braille & Shape Display
Existing market-available refreshable Braille displays (RBDs) offer limited functionality at a high cost, hindering accessibility for individuals with blindness and visual impairment for teaching and learning purposes. This motivates us to develop a multi-functional, compact, and affordable RBD tailored for educational institutes to enhance teaching and learning experiences. We propose BLISS (Braille Letters and Interactive Shape Screen), a novel RBD that presents a unique arrangement of Braille cells accommodating up to six letters at a time, as well as shapes, by reusing the Braille pins. To determine the optimal specifications, including size, Braille cell spacing, and pin configuration, we fabricated and evaluated 3D-printed sets mimicking how BLISS would display letters and shapes. We tested 36 variants of 3D-printed sets with 8 individuals with blindness and visual impairment and found that conventional Braille spacing is insufficient for accurately representing shapes. Hence, BLISS will introduce a novel design that raises extra pins to present shapes and lowers them for Braille letters, providing dual-mode operation. Our findings show the potential of BLISS to display both Braille letters and shapes on the same refreshable display, offering a novel, compact, and cost-effective solution.
Effects of Wall and Freespace Damping Levels on Virtual Wall Stiffness Classification
Virtual damping is often employed to improve stability in virtual environments, but it has previously been found to bias the perception of stiffness, with its effects differing when it is introduced locally within a wall/object or globally in both the wall and in freespace. Since many potential applications of haptic rendering involve not only comparisons between two environments but also the ability to recognize rendered environments as belonging to different categories, it is important to understand the perceptual impacts of freespace and wall damping on stiffness classification ability. This study explores the effects of varying levels of freespace and wall damping on users' ability to classify virtual walls by their stiffness. Results indicate that freespace damping improves classification of damped walls but impairs classification of undamped walls. These findings suggest that, in situations where users are expected to recognize and classify various stiffnesses, freespace damping can be a factor in narrowing or widening the gaps in extended rate-hardness between softer and stiffer walls.
Characterization, Experimental Validation and Pilot User Study of the Vibro-Inertial Bionic Enhancement System (VIBES)
This study presents the characterization and validation of the VIBES, a wearable vibrotactile device, embedded in a prosthetic socket, that provides high-frequency tactile information. A psychophysical characterization involving ten able-bodied participants is performed to compute the Just Noticeable Difference (JND) for discriminating vibrotactile cues delivered on the skin at two forearm positions, with the goal of optimizing vibrotactile actuator placement to maximize perceptual response. Furthermore, system performance is validated and tested with ten able-bodied participants and one prosthesis user across three tasks. More specifically, in the Active Texture Identification, Slippage, and Fragile Object experiments, we investigate whether the VIBES can enhance users' roughness discrimination, manual usability, and dexterity. Finally, we test the effect of the vibrotactile system on prosthetic embodiment in a Rubber Hand Illusion (RHI) task. Results show the system's effectiveness in conveying contact and texture cues, making it a potential tool to restore sensory feedback and enhance embodiment in prosthesis users.