Developing Advanced Control Software for the iCub Humanoid Robot
By Daniele Pucci, Diego Ferigo, and Silvio Traversaro, Istituto Italiano di Tecnologia (IIT)
The iCub project was launched in 2004 as part of the RobotCub European Project, whose main aim was to study embodied cognition—the theory that an organism develops cognitive skills as it interacts with its environment (see sidebar). The main outcome of the iCub project is a one-meter-tall, 53-degree-of-freedom humanoid currently being developed at the Italian Institute of Technology (IIT). Over the years, the iCub robot has been used as a research platform for diverse fields of applied robotics, including balancing, teleoperated walking, and robot-human collaboration (Figure 1).
iCub is equipped with more than 50 motors, as well as force-torque sensors, inertial measurement units, and dozens of encoders and accelerometers. Developing control algorithms for a robot this complex is a difficult challenge. Our team at IIT—the Dynamic Interaction Control team—has created a development workflow based on Simulink® and Simulink Coder™ that makes it possible for even inexperienced team members to rapidly implement new control features, validate them through simulation, and run them on an iCub robot without writing any low-level code.
iCub Research at IIT
The IIT iCub project includes several lines of research. The Dynamic Interaction Control team focuses on three areas: telexistence, agent-robot collaboration (with the agent being either a human or another humanoid robot), and aerial humanoid robotics.
Telexistence enables a human to exist virtually in another location via a robotic avatar. In our telexistence experiments, the iCub walks and manipulates objects in the real world while the human walks and manipulates objects in a virtual environment. When the human takes a step, we process the human's motion and send a reference signal to the iCub, causing it to take a step. Similarly, when the human closes a hand, a signal is sent that causes the iCub's hand to close.
Our research into agent-robot collaboration centers on ways that humans and robots can work together. In one of our experiments, a human helps the iCub to stand from a sitting position. The human wears a specially designed suit that provides real-time kinematics and dynamics data, enabling us to track and model the human's movements and muscular stresses alongside the robot's movements in Simulink. We recently extended the human-robot collaboration experiment with a robot-robot collaboration version in which one iCub helps a second iCub to stand (Figure 2).
Aerial humanoid robotics is one of our most active research areas. We are working on a version of iCub that will be equipped with jetpacks. It will be able to fly to a specified location, land, and begin walking and interacting with the environment (Figure 3). A robot with these capabilities would be useful in high-risk disaster and search-and-rescue scenarios, such as earthquakes, floods, or wildfires. In these situations, the value of having robots able to fly from one building to another looking for survivors, open doors, close gas valves, and enter buildings is incalculable.
Prototyping iCub Controllers
We prototype our control software using Simulink and the open-source Whole Body Toolbox, developed at IIT. Whole Body Toolbox is based on the BlockFactory dataflow framework, which we created to provide C++ interfaces for dataflow programming. We begin by modeling the controller in Simulink, incorporating sensors, actuators, and commonly used robotic algorithms. The Whole Body Toolbox then creates interfaces to either the real or the simulated iCub (Figure 4).
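BlockFactory itself exposes C++ interfaces, but the dataflow idea it implements can be sketched briefly. The class and method names below are illustrative only, not the actual BlockFactory API: each block exposes a life cycle (initialize, per-step output, terminate) that an engine such as Simulink drives once per simulation step.

```python
# Minimal sketch of a dataflow block, in the spirit of a
# configure/initialize/output life cycle. All names here are
# hypothetical, not the real BlockFactory C++ interfaces.

class GainBlock:
    """A block that multiplies its input signal by a constant gain."""

    def __init__(self, gain):
        self.gain = gain

    def initialize(self):
        # Allocate resources, open device handles, etc.
        return True

    def output(self, u):
        # Called once per engine step with the current input samples.
        return [self.gain * x for x in u]

    def terminate(self):
        # Release resources.
        return True

# A toy "engine" steps the block once per cycle.
block = GainBlock(gain=2.0)
assert block.initialize()
y = block.output([1.0, -0.5])
block.terminate()
```

In the real toolbox, blocks of this kind wrap sensors, actuators, and whole-body algorithms, and the same compiled block can be driven by Simulink or by the generated standalone code.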
We use the Whole Body Toolbox most frequently on projects that involve dynamic balancing: The robot controller regulates the contact forces between the robot and the environment, enabling the robot to maintain its balance even if pushed or otherwise perturbed by a human. As a secondary task, the robot tries to maintain a posture chosen by the operator, filtering out any posture that alters its balance and stability.
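The priority between the two tasks can be illustrated as a weighted least-squares problem: the contact force task dominates, and the operator's posture reference only shapes the leftover redundancy. The two-variable toy problem below stands in for the real whole-body optimization; all numbers are illustrative, not iCub dynamics.

```python
# Schematic of the two-level objective behind dynamic balancing:
# minimize (a.x - b)^2 + w * ||x - x_ref||^2, where a.x = b is the
# "balance" constraint and x_ref is the operator's posture reference.
# A small weight w keeps the posture task from disturbing balance.

def balance_posture(a, b, x_ref, w=1e-3):
    """Closed-form minimizer for x in R^2 via the normal equations."""
    # (a a^T + w I) x = a*b + w*x_ref
    A = [[a[0] * a[0] + w, a[0] * a[1]],
         [a[1] * a[0],     a[1] * a[1] + w]]
    rhs = [a[0] * b + w * x_ref[0], a[1] * b + w * x_ref[1]]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x0 = (rhs[0] * A[1][1] - rhs[1] * A[0][1]) / det
    x1 = (rhs[1] * A[0][0] - rhs[0] * A[1][0]) / det
    return [x0, x1]

x = balance_posture(a=[1.0, 1.0], b=2.0, x_ref=[0.0, 0.0])
# The balance constraint x0 + x1 = 2 is met almost exactly; among all
# such solutions, the posture term picks the one closest to x_ref.
print(x)
```

The real controller solves a much larger constrained problem over joint torques and contact wrenches, but the ordering is the same: balance first, posture second.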
We cosimulate the control model in Simulink with a model of iCub in the Gazebo robotics simulator (Figure 5). Cosimulation enables us to fix defects before testing on the actual robot, minimizing the risk of damaging the robot or endangering a human.
For example, with our cosimulation setup we can determine whether a given set of gains creates unstable behavior in the robot in the event of unplanned fast movements.
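The idea behind such a gain check can be sketched with a toy plant: simulate a closed loop after a sudden reference jump and flag gain pairs that diverge. The 1-DoF double integrator below is a stand-in for the Gazebo model, not the iCub dynamics.

```python
# Toy stand-in for the cosimulation gain check: simulate a point mass
# under PD control and flag gains whose closed loop diverges after a
# fast, unplanned 1-unit position error.

def diverges(kp, kd, dt=0.01, steps=2000):
    x, v = 1.0, 0.0  # sudden position error ("unplanned fast movement")
    for _ in range(steps):
        u = -kp * x - kd * v          # PD control law
        v += u * dt                   # semi-implicit Euler integration
        x += v * dt
        if abs(x) > 1e3:              # clearly unstable
            return True
    return False

print(diverges(kp=50.0, kd=5.0))    # well-damped gains: stable
print(diverges(kp=50.0, kd=-1.0))   # negative damping: unstable
```

In practice we sweep candidate gain sets against the full Gazebo model rather than a scalar plant, but the pass/fail logic is analogous.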
iCub streams sensor measurements at 100 Hz. These measurements are published over the network by YARP, our robotics middleware. We typically sync the update rate of our controllers to the rate of the input measurements.
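Rate synchronization of this kind can be sketched as a loop locked to absolute 10 ms deadlines, so that timing drift does not accumulate across steps. The stub functions below stand in for YARP port reads and the control computation; in our setup this role is played by the Whole Body Toolbox real-time synchronizer block.

```python
import time

PERIOD = 0.01  # 100 Hz, matching the sensor stream

def run(n_steps, read_sensors, compute_control):
    """Run a fixed-rate control loop locked to absolute deadlines."""
    next_deadline = time.monotonic()
    for _ in range(n_steps):
        measurements = read_sensors()     # stub for a YARP port read
        compute_control(measurements)     # stub for the controller step
        next_deadline += PERIOD
        # Sleep until the next 10 ms boundary; skip if already late.
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)

samples = []
run(5, read_sensors=lambda: 0.0, compute_control=samples.append)
```

Advancing the deadline by a fixed period (rather than sleeping a fixed duration after each step) keeps the loop at 100 Hz on average even when individual steps vary in cost.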
Once we have validated the controller via simulations, we test it on a real iCub in the lab. In our test setup, Simulink runs on a PC with a standard x86 processor and communicates with the iCub via YARP middleware and TCP/UDP. The controller running in Simulink sends torque commands to the robot, which is able to follow a trajectory while maintaining its balance. A real-time synchronizer block, developed for the Whole Body Toolbox, synchronizes the robot moving in the real world with the control model.
Deploying the Controller
Our over-the-network configuration is convenient for rapid design iterations, but it leaves iCub reliant on the TCP/UDP communication link. To break this reliance and enable iCub to operate more independently, we deploy our controller to an x86 processor inside the robot's head, eliminating the network link along with its latency and risk of communication errors.
We generate C++ code from our control models with Simulink Coder, compile it, and validate it using the same over-the-network configuration that we used earlier when running the controller in Simulink. Then, we run the code on the x86 processor mounted inside iCub's head, enabling iCub to operate as a self-contained unit rather than relying on a separate control PC.
Portable Control Design
Today, about 40 iCub robots are in use by research groups around the world. Although the hardware designs vary, every version can use the same controller design. This portability is made possible by a configuration block within our Simulink control model that loads a Unified Robot Description Format (URDF) file describing the kinematics and dynamics of each robot. Using this configuration file, we can run the same controller on IIT’s 120-kilogram, 1.85-meter-tall simulated Walkman as on a 33-kilogram, 1-meter-tall iCub.
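The portability mechanism can be illustrated in miniature: rather than hard-coding kinematic and dynamic parameters, the controller reads them from the robot's URDF description at startup. The fragment below is a minimal hand-written URDF, not a real robot model, and the parsing shown is only a sketch of the idea.

```python
import xml.etree.ElementTree as ET

# A tiny hand-written URDF fragment: one revolute joint between two
# links, with an effort (torque) limit. Real robot models have dozens
# of joints plus inertial and geometric data.
URDF = """
<robot name="mini">
  <link name="base"/>
  <link name="arm"/>
  <joint name="shoulder" type="revolute">
    <parent link="base"/>
    <child link="arm"/>
    <limit effort="10.0" lower="-1.57" upper="1.57" velocity="2.0"/>
  </joint>
</robot>
"""

root = ET.fromstring(URDF)
# Map each joint name to its torque limit; swapping in a different
# URDF file re-sizes the controller for a different robot.
joints = {
    j.get("name"): float(j.find("limit").get("effort"))
    for j in root.findall("joint")
}
print(joints)
```

Because only this description changes between robots, the same controller logic scales from a 33-kilogram iCub to a 120-kilogram Walkman.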
We are confident that our research will result in many real-world applications. For example, in the future, telexistence could be used to help the physically disabled perform tasks requiring strength or dexterity. In these cases, a robotic avatar would move and act in the physical world, remotely controlled by the disabled human.
Our research on human-robot collaboration is fundamental to creating robots to help humans at home and in the workplace—for example, assisting the elderly with activities of daily living and helping factory workers perform tasks involving musculoskeletal stress. Finally, the new branch of robotics that we are pioneering, aerial humanoid robotics, has a multitude of applications. Flying humanoid robots can give rise to jet-powered, heavy-payload aerial platforms for medicine and food delivery during disaster response, heavy-payload last-mile delivery, rescue support for firefighters, and high-voltage pylon inspection.
To translate the results of our research into real applications, we work closely with the iCub facility, the IIT group that develops, maintains, and continuously updates the iCub humanoid robot.