Cooperative Navigation for Mixed Human–Robot Teams Using Haptic Feedback
In this paper, we present a novel cooperative navigation control strategy for human-robot teams. Assuming that a human wants to reach a target location in a large environment with the help of a mobile robot, the robot must steer the human from the initial to the target position. The challenges posed by cooperative human-robot navigation are typically addressed by providing haptic feedback through direct physical interaction. In contrast, we describe an approach in which the human-robot interaction is achieved via wearable vibrotactile armbands. In the proposed work, the subject is free to choose her/his own pace. The haptic armbands generate a warning vibrational signal whenever the robot detects a large deviation from the desired pose. The proposed method has been evaluated in a large indoor environment, where fifteen blindfolded human subjects were asked to follow the haptic cues provided by the robot. The participants had to reach a target area while avoiding static and dynamic obstacles. Experimental results revealed that the blindfolded subjects were able to avoid the obstacles and safely reach the target in all of the performed trials. A comparison is provided between the results obtained with blindfolded users and experiments performed with sighted people.
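The deviation-triggered warning described in this abstract can be sketched as follows; the threshold value, function names, and the left/right cue mapping are illustrative assumptions, not details from the paper:

```python
import math

# Assumed threshold on heading deviation before a warning cue is issued.
DEVIATION_THRESHOLD = math.radians(20.0)  # illustrative value

def heading_error(subject_theta, subject_xy, waypoint_xy):
    """Signed angular deviation between the subject's heading and the
    direction toward the next waypoint, wrapped to (-pi, pi]."""
    dx = waypoint_xy[0] - subject_xy[0]
    dy = waypoint_xy[1] - subject_xy[1]
    desired = math.atan2(dy, dx)
    err = desired - subject_theta
    return math.atan2(math.sin(err), math.cos(err))

def warning_cue(subject_theta, subject_xy, waypoint_xy):
    """Return which armband should vibrate, or None when the subject is
    on course (no stimulus, so the subject keeps her/his own pace)."""
    err = heading_error(subject_theta, subject_xy, waypoint_xy)
    if abs(err) <= DEVIATION_THRESHOLD:
        return None
    return "left" if err > 0 else "right"  # turn toward the target
```

The key design point reflected here is that the interface stays silent while the subject walks freely, and vibrates only when the deviation grows large.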
Evaluation of a Predictive Approach for Steering Human Locomotion via Haptic Feedback
In this paper, we present a haptic guidance policy to steer the user along predefined paths, and we evaluate a predictive approach that compensates for the actuation delays humans exhibit when guided along a given trajectory via sensory stimuli. The proposed navigation policy exploits the nonholonomic nature of human locomotion along goal-directed paths, which leads to a very simple guidance mechanism. The proposed method has been evaluated in a real scenario in which seven human subjects were asked to walk along a set of predefined paths while being guided via vibrotactile cues. Their poses, as well as their distances from the path, were recorded using an accurate optical tracking system. Results revealed that an average error of 0.24 m is achieved with the proposed haptic policy, and that the predictive approach does not bring significant improvements in terms of distance error for the path-following problem. However, the predictive approach did achieve a markedly lower activation time of the haptic interfaces.
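A minimal sketch of the predictive idea, under the assumptions stated in the comments: rather than computing the steering cue from the user's current pose, the pose is projected forward over the human actuation delay using a unicycle (nonholonomic) model, and the cue is computed from the predicted pose. The delay value and function names are illustrative, not taken from the paper:

```python
import math

ACTUATION_DELAY = 0.6  # seconds; illustrative value, not from the paper

def predict_pose(x, y, theta, v, omega, dt=ACTUATION_DELAY):
    """Forward-integrate a unicycle model (constant linear velocity v and
    angular velocity omega) over the human reaction delay dt."""
    if abs(omega) < 1e-9:  # straight-line motion
        return (x + v * dt * math.cos(theta),
                y + v * dt * math.sin(theta),
                theta)
    # exact integration for constant v and omega
    xn = x + (v / omega) * (math.sin(theta + omega * dt) - math.sin(theta))
    yn = y - (v / omega) * (math.cos(theta + omega * dt) - math.cos(theta))
    return xn, yn, theta + omega * dt
```

The guidance policy would then evaluate the path error at `predict_pose(...)` instead of at the measured pose, so a cue issued now takes effect roughly where it is needed.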
Haptic Guidance in Dynamic Environments Using Optimal Reciprocal Collision Avoidance
Guiding humans who cannot rely on their main sensory modalities, as in assistive or search-and-rescue scenarios, is a challenging task. In this paper, we address the problem of guiding such users along collision-free paths in dynamic environments. In order to safely guide the subjects, we adapt Optimal Reciprocal Collision Avoidance (ORCA) to our specific problem. The proposed algorithm takes into account the stimuli that can be displayed to the users and the users' motion uncertainty when reacting to the provided stimuli. The algorithm was evaluated in three different dynamic scenarios: a total of 18 blindfolded human subjects were asked to follow haptic cues in order to reach a target area while avoiding real static obstacles and moving users. Three metrics (time to reach the goal, trajectory length, and minimum distance from the obstacles) are used to compare the results obtained with this approach against experiments performed by subjects without visual impairments. Experimental results reveal that the blindfolded subjects were able to avoid collisions and safely reach the targets in all the performed trials. Although in this paper we display directional cues via haptic stimuli, we believe that the proposed approach is general and can be tuned to work with different haptic interfaces and/or feedback modalities.
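The abstract's core constraint, that the planner may only command directions the haptic interface can display, can be illustrated with a much-simplified stand-in for the full ORCA formulation: candidate cues are restricted to a small set of displayable headings, and a cue is accepted only if a constant-velocity rollout keeps a safety margin from every moving obstacle. All names and parameter values are illustrative assumptions:

```python
import math

# Displayable cues relative to the current heading: straight, left, right.
CUE_ANGLES = [0.0, math.pi / 2, -math.pi / 2]
SPEED, HORIZON, DT, MARGIN = 1.0, 3.0, 0.1, 0.8  # assumed values

def safe(user_xy, heading, obstacles):
    """Check a constant-velocity rollout against obstacles, each given as
    a ((x, y), (vx, vy)) pair, over the planning horizon."""
    x, y = user_xy
    steps = int(HORIZON / DT) + 1
    for i in range(steps):
        t = i * DT
        ux = x + SPEED * t * math.cos(heading)
        uy = y + SPEED * t * math.sin(heading)
        for (ox, oy), (vx, vy) in obstacles:
            if math.hypot(ux - (ox + vx * t), uy - (oy + vy * t)) < MARGIN:
                return False
    return True

def choose_cue(user_xy, goal_xy, heading, obstacles):
    """Among the displayable cues, pick the collision-free one whose
    resulting heading is closest to the goal direction (None if none)."""
    goal_dir = math.atan2(goal_xy[1] - user_xy[1], goal_xy[0] - user_xy[0])
    best = None
    for off in CUE_ANGLES:
        h = heading + off
        d = abs(math.atan2(math.sin(goal_dir - h), math.cos(goal_dir - h)))
        if safe(user_xy, h, obstacles) and (best is None or d < best[0]):
            best = (d, off)
    return None if best is None else best[1]
```

The actual method additionally models the user's motion uncertainty when reacting to a stimulus, which this sketch omits.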
Haptic Wrist Guidance Using Vibrations for Human-Robot Teams
Human-robot teams can operate efficiently in several scenarios, including Urban Search and Rescue (USAR). Robots can access areas too small or too deep for a person, can survey large areas that people are not permitted to enter, and can carry sensors and instruments. One important aspect of this cooperative framework is the way robots and humans communicate during rescue operations. Visual and auditory modalities may be ineffective in cases of reduced visibility or high noise. A promising way to guarantee effective communication between robot and human in a team is the use of haptic signals. In this work, we present a possible solution that lets a robot guide the position of a human operator's hand using vibrations. We demonstrate that an armband embedding four vibration motors is sufficient to guide the wrist of an operator along a predefined path or to a target location. These results can be exploited in human-robot teams: for instance, when the robot detects the position of a target of interest, it can guide the operator's wrist to that position along an optimal path.
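With four motors, the natural encoding is to quantize the desired motion direction to the nearest motor. A minimal sketch, assuming the motors sit at 90-degree intervals around the armband (labels and placement are illustrative, not from the paper):

```python
import math

# Assumed motor placement at 0, 90, 180, 270 degrees around the wrist.
MOTORS = ["front", "left", "back", "right"]

def motor_for_direction(dx, dy):
    """Select the motor whose placement best matches the desired wrist
    displacement (dx, dy) by snapping to the nearest 90-degree sector."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    index = int((angle + 45.0) // 90.0) % 4
    return MOTORS[index]
```

Activating the selected motor at each control step pulls the wrist toward the next point of the desired path; richer patterns (e.g. intensity ramps) would need more than this direction-only mapping.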