The Robotics group at DIAG, and the associated DIAG Robotics Lab, were established in the late 1980s with a commitment to develop innovative planning and control methods for industrial and service robots.
The main research topics are: nonlinear control of robots; control of manipulators with flexible elements (in particular, with Variable Stiffness Actuation); hybrid force/velocity and impedance control of manipulators interacting with the environment; optimization schemes in kinematically redundant robots; motion planning for high-dimensional systems; motion planning and control of wheeled mobile robots and other nonholonomic mechanical systems; control-based motion planning for mobile manipulators; motion planning and control of locomotion in humanoid robots; stabilization of underactuated robots; control of locomotion platforms for VR immersion; sensor-based navigation and exploration in unknown environments; image-based visual servoing; control and visual servoing for unmanned aerial vehicles (UAVs); multi-robot coordination and mutual localization; unsupervised continuous calibration of mobile robots; actuator/sensor fault detection and isolation in robots; safe control of physical human-robot collaboration; sensory supervision of human-robot interaction.
Most of our research activities undergo experimental validation in the DIAG Robotics Lab. The current equipment consists of three articulated manipulators (a 6R Universal Robots UR10, a 7R lightweight KUKA LBR4+ with FastResearchInterface, and a 6R KUKA KR5 industrial robot), two haptic interfaces with 3D force feedback (Geomagic Touch), an underactuated system (Pendubot by Quanser), and several mobile robots, including wheeled (a MagellanPro by iRobot, a team of five Khepera III by K-Team), legged (three NAO humanoid robots by Aldebaran), and flying (two quadrotor UAVs by AscTec, a Hummingbird and a Pelican) platforms. These robots are equipped with sensing devices of varying complexity, ranging from ultrasonic/laser range finders to cameras and stereo vision systems. We have multiple RGB-D sensors, two 6D F/T sensors (Mini45 by ATI), and an HMD (Oculus Rift). We also have a sensorized platform (Cyberith Virtualizer) for locomotion and VR immersion. In the past, we designed and built a two-link flexible manipulator (FlexArm) and a differentially-driven wheeled mobile robot (SuperMARIO).