Connectivity-Maintenance Teleoperation of a UAV Fleet with Wearable Haptic Feedback

2020, IEEE Transactions on Automation Science and Engineering, IF 4.938

This paper presents the design of a decentralized connectivity-maintenance algorithm for the teleoperation of a team of multiple UAVs, together with an extensive human subject evaluation in virtual and real environments. The proposed connectivity-maintenance algorithm enhances earlier works by improving their applicability, safety, effectiveness, and ease of use through: (i) an airflow-avoidance behavior that prevents downwash effects when rotor-based aerial robots fly above one another; (ii) a consensus-based action enabling fast displacements with minimal topology changes by having all follower robots move at the leader's velocity; (iii) an automatic decrease of the minimum degree of connectivity, enabling an intuitive and dynamic expansion/compression of the formation; and (iv) automatic detection and resolution of deadlock configurations, i.e., situations in which the leader robot cannot move because connectivity-related and external inputs counterbalance each other. We also devised and evaluated different interfaces for teleoperating the team, as well as different ways of receiving information about the connectivity force acting on the leader. Results of two human subject experiments show that the proposed algorithm is effective in a variety of situations. Moreover, using haptic feedback to convey information about the team connectivity outperforms both providing no feedback at all and sensory substitution via visual feedback.
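The connectivity notion underlying this line of work can be illustrated with a minimal sketch (not the authors' implementation): the team's communication graph stays connected as long as its algebraic connectivity, i.e. the second-smallest eigenvalue of the graph Laplacian (the Fiedler value), remains strictly positive, and a connectivity-maintenance controller acts to keep it above a minimum threshold.

```python
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue (Fiedler value) of the graph Laplacian.

    A strictly positive value means the communication graph is connected.
    """
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian L = D - A
    eigvals = np.sort(np.linalg.eigvalsh(L))
    return eigvals[1]

# Line graph of 3 robots: 0 -- 1 -- 2 (connected)
line = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
# Robot 2 out of communication range (disconnected)
split = [[0, 1, 0],
         [1, 0, 0],
         [0, 0, 0]]

print(algebraic_connectivity(line))   # positive -> connected
print(algebraic_connectivity(split))  # ~0 -> disconnected
```

A decentralized controller cannot compute this eigenvalue globally; schemes of this kind rely on distributed estimation of the Fiedler value, which the sketch above deliberately omits.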

Author = {Aggravi, M. and Pacchierotti, C. and {Robuffo Giordano}, P.},
Title = {{Connectivity-Maintenance Teleoperation of a UAV Fleet with Wearable Haptic Feedback}},
Journal = {{IEEE Transactions on Automation Science and Engineering}},
Year = {2020},
Pages = {0},
Doi = {10.1109/TASE.2020.3000060}


Cooperative Navigation for Mixed Human–Robot Teams Using Haptic Feedback

2016, IEEE Transactions on Human-Machine Systems, IF 3.374

In this paper, we present a novel cooperative navigation control for human-robot teams. Assuming that a human wants to reach a final location in a large environment with the help of a mobile robot, the robot must steer the human from the initial to the target position. The challenges posed by cooperative human-robot navigation are typically addressed by using haptic feedback via physical interaction. In contrast, we describe an approach in which the human-robot interaction is achieved via wearable vibrotactile armbands. In the proposed work, the subject is free to decide her/his own pace. The haptic armbands generate a warning vibrational signal whenever the robot detects a large deviation from the desired pose. The proposed method was evaluated in a large indoor environment, where fifteen blindfolded human subjects were asked to follow the haptic cues provided by the robot. The participants had to reach a target area while avoiding static and dynamic obstacles. Experimental results revealed that the blindfolded subjects were able to avoid the obstacles and safely reach the target in all of the performed trials. A comparison is provided between the results obtained with blindfolded users and experiments performed with sighted people.
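The warning-signal logic described above can be sketched as a simple threshold rule (an illustrative assumption, not the paper's exact controller): the armband stays silent while the user's heading is close enough to the desired one, and fires a left or right cue when the deviation grows too large.

```python
import math

def warning_cue(user_heading, desired_heading, threshold=math.radians(20)):
    """Return which armband cue to fire, if any, given a heading deviation.

    Angles in radians; returns 'left', 'right', or None (no vibration).
    The 20-degree threshold is an illustrative value, not from the paper.
    """
    # Wrap the error to (-pi, pi] so large angles compare correctly
    err = (desired_heading - user_heading + math.pi) % (2 * math.pi) - math.pi
    if abs(err) <= threshold:
        return None            # close enough to the desired pose: no cue
    return 'left' if err > 0 else 'right'

print(warning_cue(0.0, 0.1))               # small deviation -> None
print(warning_cue(0.0, math.radians(45)))  # large deviation -> 'left'
```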

Author = {Scheggi, S. and Aggravi, M. and Prattichizzo, D.},
Title = {{Cooperative Navigation for Mixed Human-Robot Teams Using Haptic Feedback}},
Journal = {{IEEE Transactions on Human-Machine Systems}},
Volume = {47},
Number = {4},
Pages = {462--473},
Year = {2016},
Doi = {10.1109/THMS.2016.2608936}

Evaluation of a predictive approach in steering the human locomotion via haptic feedback

2015, IEEE/RSJ International Conference on Intelligent Robots and Systems

In this paper, we present a haptic guidance policy to steer the user along predefined paths, and we evaluate a predictive approach to compensate for the actuation delays that humans exhibit when guided along a given trajectory via sensory stimuli. The proposed navigation policy exploits the nonholonomic nature of human locomotion along goal-directed paths, which leads to a very simple guidance mechanism. The proposed method has been evaluated in a real scenario where seven human subjects were asked to walk along a set of predefined paths, guided via vibrotactile cues. Their poses, as well as the related distances from the path, were recorded using an accurate optical tracking system. Results revealed that an average error of 0.24 m is achieved using the proposed haptic policy, and that the predictive approach does not bring significant improvements to the path-following problem in terms of distance error. On the other hand, the predictive approach achieved a markedly lower activation time of the haptic interfaces.
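The predictive idea above can be sketched as follows (a minimal illustration, assuming a unicycle model of human locomotion and illustrative values for speed and delay): instead of computing cues on the user's current pose, the system forward-simulates the nonholonomic model over the estimated actuation delay and computes cues on the predicted pose.

```python
import math

def predict_pose(x, y, theta, v, omega, delay, dt=0.01):
    """Forward-simulate a unicycle model for `delay` seconds (Euler steps).

    (x, y, theta): current pose; v: forward speed; omega: turning rate.
    Cues computed on the returned pose compensate the human actuation delay.
    """
    t = 0.0
    while t < delay:
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        t += dt
    return x, y, theta

# Walking along +x at 1.2 m/s while turning slightly left,
# predicted 0.5 s ahead (illustrative values)
x, y, th = predict_pose(0.0, 0.0, 0.0, 1.2, 0.3, 0.5)
print(round(x, 2), round(y, 2), round(th, 2))
```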

Author = {Aggravi, M. and Scheggi, S. and Prattichizzo, D.},
Title = {Evaluation of a predictive approach in steering the human locomotion via haptic feedback},
BookTitle = {{Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems}},
Address = {Hamburg, Germany},
Year = {2015},
Pages = {597--602},
Doi = {10.1109/IROS.2015.7353433}


Haptic Guidance in Dynamic Environments Using Optimal Reciprocal Collision Avoidance

2018, IEEE Robotics and Automation Letters, IF 3.608

Human guidance in situations where the users cannot rely on their main sensory modalities, such as assistive or search-and-rescue scenarios, is a challenging task. In this paper, we address the problem of guiding users along collision-free paths in dynamic environments, assuming that they cannot rely on their main sensory modalities. In order to safely guide the subjects, we adapt Optimal Reciprocal Collision Avoidance (ORCA) to our specific problem. The proposed algorithm takes into account the stimuli which can be displayed to the users and the motion uncertainty of the users when reacting to the provided stimuli. The proposed algorithm was evaluated in three different dynamic scenarios. A total of 18 blindfolded human subjects were asked to follow haptic cues in order to reach a target area while avoiding real static obstacles and moving users. Three metrics, namely time to reach the goal, length of the trajectories, and minimal distance from the obstacles, are considered to compare the results obtained using this approach with experiments performed without visual impairment. Experimental results reveal that blindfolded subjects are successfully able to avoid collisions and safely reach the targets in all the performed trials. Although in this paper we display directional cues via haptic stimuli, we believe that the proposed approach is general and can be tuned to work with different haptic interfaces and/or feedback modalities.
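At the core of reciprocal collision-avoidance schemes such as ORCA is a velocity-obstacle test: a candidate velocity is unsafe if, extrapolated over a time horizon tau, it brings two agents closer than their combined radius. The sketch below shows only this building block (not the full ORCA half-plane construction or the paper's adaptation):

```python
import math

def collision_within_horizon(rel_pos, rel_vel, combined_radius, tau):
    """Velocity-obstacle test: would two agents at the given relative
    position and relative velocity come closer than `combined_radius`
    within the time horizon `tau`?
    """
    px, py = rel_pos
    vx, vy = rel_vel
    vv = vx * vx + vy * vy
    # Time of closest approach, clamped to the horizon [0, tau]
    t = 0.0 if vv == 0 else max(0.0, min(tau, -(px * vx + py * vy) / vv))
    cx, cy = px + vx * t, py + vy * t
    return math.hypot(cx, cy) < combined_radius

# Head-on approach: other agent 2 m ahead, closing at 1 m/s
print(collision_within_horizon((2.0, 0.0), (-1.0, 0.0), 0.6, 5.0))  # True
# Passing safely 2 m to the side at the same speed
print(collision_within_horizon((2.0, 2.0), (-1.0, 0.0), 0.6, 5.0))  # False
```

In full ORCA, each unsafe velocity induces a half-plane constraint, and each agent picks the velocity closest to its preferred one that satisfies all constraints; the haptic adaptation must additionally restrict candidates to directions the stimuli can actually convey.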

Title = {Haptic guidance in dynamic environments using optimal reciprocal collision avoidance},
Author = {Baldi, Tommaso Lisini and Scheggi, Stefano and Aggravi, Marco and Prattichizzo, Domenico},
Journal = {{IEEE Robotics and Automation Letters}},
Volume = {3},
Number = {1},
Pages = {265--272},
Year = {2018},
Publisher = {{IEEE}},
Doi = {10.1109/LRA.2017.2738328},

Haptic Wrist Guidance Using Vibrations for Human-Robot Teams

2016, IEEE International Symposium on Robot and Human Interactive Communication

Human-robot teams can operate efficiently in several scenarios, including Urban Search and Rescue (USAR). Robots can access areas too small or too deep for a person, can survey large areas that people are not permitted to enter, and can carry sensors and instruments. One important aspect of this cooperative framework is the way robots and humans communicate during rescue operations. Vision and audio modalities may not be effective in cases of reduced visibility or high noise. A promising way to guarantee effective communication between robot and human in a team is the exploitation of haptic signals. In this work, we present a possible solution for letting a robot guide the position of a human operator's hand using vibrations. We demonstrate that an armband embedding four vibrating motors is sufficient to guide the wrist of an operator along a predefined path or to a target location. The proposed results can be exploited in human-robot teams. For instance, when the robot detects the position of a target of interest, it can guide the operator's wrist to that position along an optimal path.
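With only four motors, a natural cue strategy is to fire the motor nearest to the desired motion direction. The sketch below assumes motors at the front, left, back, and right of the wrist; this layout and the nearest-motor rule are illustrative assumptions, not the paper's exact design.

```python
def motor_for_direction(angle_deg):
    """Map a desired wrist-motion direction to one of four armband motors.

    Motors are assumed at 0 (front), 90 (left), 180 (back), and 270 (right)
    degrees around the wrist; the nearest one fires.
    """
    motors = {0: 'front', 90: 'left', 180: 'back', 270: 'right'}
    # Angular distance wrapped to [-180, 180) so 350 deg is near 0 deg
    nearest = min(motors, key=lambda m: abs((angle_deg - m + 180) % 360 - 180))
    return motors[nearest]

print(motor_for_direction(10))    # 'front'
print(motor_for_direction(100))   # 'left'
```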

Author = {Aggravi, M. and Salvietti, G. and Prattichizzo, D.},
BookTitle = {{Proc. IEEE International Symposium on Robot and Human Interactive Communication (Ro-Man)}},
Title = {{Haptic Wrist Guidance Using Vibrations for Human-Robot Teams}},
Year = {2016},


Crowd Navigation in VR: Exploring Haptic Rendering of Collisions

2020, IEEE Transactions on Visualization and Computer Graphics, IF 4.558

Virtual reality (VR) is a valuable experimental tool for studying human movement, including the analysis of interactions during locomotion tasks for developing crowd simulation algorithms. However, these studies are generally limited to distant interactions in crowds, due to the difficulty of rendering realistic sensations of collisions in VR. In this work, we explore the use of wearable haptics to render contacts during virtual crowd navigation. We focus on the behavioural changes occurring with or without haptic rendering during a navigation task in a dense crowd, as well as on potential after-effects introduced by the use of haptic rendering. Our objective is to provide recommendations for designing VR setups to study crowd navigation behaviour. To this end, we designed an experiment (N=23) where participants navigated in a crowded virtual train station without, then with, and then again without haptic feedback of their collisions with virtual characters. Results show that providing haptic feedback improved the overall realism of the interaction, as participants more actively avoided collisions. We also noticed a significant after-effect in the users' behaviour when haptic rendering was once again disabled in the third part of the experiment. Nonetheless, haptic feedback did not have any significant impact on the users' sense of presence and embodiment.

Title = {{Crowd Navigation in VR: exploring haptic rendering of collisions}},
Author = {Berton, Florian and Grzeskowiak, Fabien and Bonneau, Alexandre and Jovane, Alberto and Aggravi, Marco and Hoyet, Ludovic and Olivier, Anne-H{\'e}l{\`e}ne and Pacchierotti, Claudio and Pettr{\'e}, Julien},
Journal = {IEEE Transactions on Visualization and Computer Graphics},
Year = {2020},
Publisher = {IEEE}

Haptic-Enabled Decentralized Control of a Heterogeneous Human-Robot Team for Search and Rescue in Partially-known Environments

2021, IEEE Robotics and Automation Letters, IF 3.608

Teams of coordinated robots have been proven useful in several high-impact applications, including urban search and rescue (USAR) and disaster response. In this context, we present a decentralized haptic-enabled connectivity-maintenance control framework for heterogeneous human-robot teams. The proposed framework controls the coordinated motion of a team consisting of mobile robots and one human, for collaboratively achieving various exploration and SAR tasks. The human user physically becomes part of the team, moving in the same environment as the robots, while receiving rich haptic feedback about the team connectivity and the direction toward a safe path. We carried out two human subjects studies, in both simulated and real environments. Results show that the proposed approach is effective and viable in a wide range of SAR scenarios. Moreover, providing haptic feedback led to increased performance with respect to providing visual information only. Finally, conveying distinct feedback regarding the team connectivity and the path to follow performed better than providing the same information combined.
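One simple way to render "feedback about the team connectivity" haptically is to map the margin between the current algebraic connectivity and its minimum allowed value onto a vibration amplitude: the closer the team gets to disconnection, the stronger the cue. The mapping below, including its threshold and decay gain, is an illustrative assumption rather than the paper's rendering law.

```python
import math

def vibration_intensity(lambda2, lambda2_min=0.2, decay=5.0):
    """Map connectivity margin onto a [0, 1] vibration amplitude.

    lambda2: current algebraic connectivity of the team graph.
    lambda2_min: minimum allowed connectivity (illustrative value).
    The amplitude saturates at 1.0 when the limit is reached or crossed.
    """
    margin = lambda2 - lambda2_min
    if margin <= 0:
        return 1.0                         # at or past the limit: full cue
    return min(1.0, math.exp(-decay * margin))

print(round(vibration_intensity(0.25), 2))  # near the limit -> strong cue
print(round(vibration_intensity(2.00), 4))  # well connected -> faint cue
```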

Title = {{Haptic-Enabled Decentralized Control of a Heterogeneous Human-Robot Team for Search and Rescue in Partially-known Environments}},
Author = {Aggravi, Marco and Elsherif, A Alaaeldin Said and Giordano, Paolo Robuffo and Pacchierotti, Claudio},
Journal = {IEEE Robotics and Automation Letters},
Publisher = {IEEE},
Pages = {0},
Year = {2021},
DOI = {}