Robotics research and development has come a long way in the past several decades, but not necessarily as science fiction writers imagined. Instead of cyborgs taking over the world, modern research focuses on designing robots to excel in situations that are hazardous or tedious to humans, such as performing disaster reconnaissance or reliably completing repetitive tasks. Although these robots have proved to be effective in completing physical tasks, learning to interact naturally with their human charges remains a challenge.
To gain a new perspective on the problem, roboticists from Drexel University’s Expressive and Creative Interactions Technology (ExCITe) Center have formed an interdisciplinary collaboration with the university’s Center for Functional Fabrics (CFF). This effort provided their in-house robot, Hubo, with a unique advantage when it comes to understanding humans: clothing.
While still in an early phase of development and far from a fashion statement, the clothing gives Hubo a sense of touch. The team is working to develop touch-sensitive clothing for Hubo that would allow it to distinguish something as nuanced as a tap on the shoulder from something as potentially dangerous as an aggressive push.
The combination of robotics and textile technology may sound counterintuitive, but the labs’ lead researchers say that for them, the collaboration couldn’t have been more natural.
“There was a pretty long period of time when one team was working on robotics, and just 20 feet away, another team was working with smart fabrics. Then some innovative people asked, ‘What would happen if we put some fabric on the robot?’” says Dr. Youngmoo Kim, director of the ExCITe Center. “ExCITe is about having an interdisciplinary space and the chance for lots of different types of research efforts to intersect.”
The researchers say that this kind of collaboration was exactly what the university had in mind when designing the ExCITe Center, which houses researchers who explore everything from expressive robotics (like Hubo) to entrepreneurial game design and computerized knitting. The ExCITe Center’s mission to “maintain an environment encouraging the serendipitous exchange of knowledge and ideas” is a fitting description in the case of Hubo’s wardrobe: The collaboration began as a search to find Hubo a protective covering that was less brittle and more protective than its current PVC shell.
“These fancy plastic shells would make it look very futuristic, but they are very brittle and don’t protect the robot very well,” says Kim.
CFF has been an integral part of ExCITe since 2012 when Japanese technology company Shima Seiki donated 16 workstations and three computerized knitting systems to the university. These computerized knitting machines assemble yarn into textiles row by row, similar to how a 3D printer layers plastic. CFF’s machines use conductive yarn to create smart, flexible garments without the need for additional embedded electronics. To date, the lab has used this technique to manufacture garments for energy storage and Wi-Fi power harvesting, and even to design a smart “bellyband” for expecting mothers that monitors contractions without additional components. But when it came time to dress Hubo, CFF’s director, Genevieve Dion, says both teams had some obstacles to overcome.
In addition to designing Hubo’s clothing to protect against collisions and mechanical wear and tear, the researchers strove to design a fabric that could improve Hubo’s interactions with humans. For the first task, the team combined padded textiles and high-tensile yarn to combat daily wear and tear. The result was a bulky set of matching dark gray sleeves, pant legs, and a shirt.
Dion says that this achievement was important not only for the functionality of their project, but also for the understanding of smart textiles as a whole. While we might view textiles as disposable or flimsy, Hubo’s armor stands as a testament to how durable these fabrics can be—even more so than industrial-strength plastic in this case.
In addition to protecting Hubo, the team also had to design touch-sensitive clothing in the form of capacitive touch sensors. Like a laptop trackpad, these touch sensors are designed to sense the pressure and location of a touch and translate that into usable data for the machine, such as where to move the cursor or, in Hubo’s case, the answer to the question, “Is somebody touching me?”
For Hubo’s new clothing, the sensors were essentially circuits knitted from two kinds of yarn: conductive yarn acting as the wiring and standard, non-conductive yarn forming the clothing’s structure. The circuit works like a smartphone touchscreen: When a person’s skin contacts the bare conductive yarn, the circuit registers the location and pressure of the touch, giving the robot a sense of touch.
Pressure and location tell humans a lot about the contact: If someone lightly taps you on the shoulder, they want to gain your attention. A hard shove from behind communicates a very different, and obvious, message.
While the goal was fairly straightforward, the emerging nature of this field meant that the team had to work from the ground up to determine how these sensors would work on a robot. It’s difficult to translate an inflexible, complex circuit design into one that can be knitted as a flexible sensor. Unlike printed circuit boards, which can stack many layers of individual wires, the knitted design had to use as few layers and wires as possible. Hubo’s flexibility meant that its clothes, and their circuitry, would need to fold and flex as Hubo moved. These constraints raised many questions: How would wiring integrate into the fabric and connect to external sensing controllers? How would the knitted circuit survive repeated bending and stretching? How would the sensor distinguish an external touch from one triggered by folding or bending? And how would the sensor hold up to practical demands such as laundering?
“We wanted the sensor to be handled like a textile and to be capable of being folded, stretched, and washed without degradation or adverse effects,” said Richard Vallett, Ph.D. candidate and lead researcher on the project.
The designers selected self-capacitance as the sensing method. Self-capacitance sensing registers touch only from something conductive, such as our fingertips. Most smartphones use this same method, which is why we must take off our mittens to text. Using self-capacitance sensing for Hubo’s wardrobe meant that inadvertent pressure on the sensor from folds or stretching wouldn’t register as touch.
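The principle behind self-capacitance sensing can be sketched in a few lines of code. The following is an illustrative charge-transfer model, not the team’s implementation, and every component value in it is hypothetical: a controller repeatedly delivers a fixed packet of charge to the sensor line and counts how many packets are needed to reach a threshold voltage. A conductive fingertip adds capacitance, so the count rises; a fold of non-conductive fabric adds none, so it is ignored.

```python
# Illustrative charge-transfer model of a self-capacitance sensor.
# All component values are hypothetical, chosen only to make the idea concrete.
C_SENSOR = 10e-12   # baseline capacitance of the knitted trace (farads)
C_FINGER = 4e-12    # extra capacitance added by a conductive fingertip
Q_PACKET = 5e-14    # charge delivered per measurement cycle (coulombs)
V_THRESH = 1.0      # threshold voltage that ends a measurement (volts)

def cycles_to_threshold(extra_capacitance=0.0):
    """Count the charge packets needed to raise the line to V_THRESH."""
    c_total = C_SENSOR + extra_capacitance
    return round(c_total * V_THRESH / Q_PACKET)

def is_touched(count, baseline, margin=1.2):
    """Declare a touch only when the count clearly exceeds the baseline."""
    return count > baseline * margin

baseline = cycles_to_threshold()           # untouched sensor
fingertip = cycles_to_threshold(C_FINGER)  # conductive skin adds capacitance
fold = cycles_to_threshold(0.0)            # a non-conductive fold adds none

print(is_touched(fingertip, baseline), is_touched(fold, baseline))
```

This is why mittens defeat the sensor: without a conductive path, the capacitance, and therefore the cycle count, never changes.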
To explore this new territory, the team turned to MathWorks. Vallett says they used both MATLAB® and Simulink® to model touch detection in the flexible sensors. As the sensor’s circuits became more complex, they used Simscape™ to model expected outputs and performance as well as to mitigate any measurement uncertainties.
The touch sensors did not operate like traditional capacitive sensors, which rely on a grid of many low-resistance wires to measure touch position. Instead, the sensors use a single conductive yarn with high resistance that serpentines across the textile’s surface. The low current through the yarn allows fine changes in voltage to be measured from both endpoints of the yarn path, and touch is then inferred from the voltage differential as a linear distance along the path. The team was able to tune electrical behavior by altering the sensor’s resistance through physical means: knitted loops of bare conductive yarn form complex electrical networks, so by changing the knitted pattern and the number of individual fibers in the yarn’s composition, the team could maximize sensitivity without losing performance. Finding the right combination of these two factors was key.
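One way to picture the single-yarn scheme is to treat the serpentine yarn as a potentiometer. The sketch below is a simplified resistive model with hypothetical component values, not the team’s actual circuit: each yarn endpoint is pulled to a supply through a known resistor, a touch at fractional position `x` along the yarn draws a small current to ground through the body, and the two endpoint voltages shift by different amounts depending on how much yarn lies on each side of the touch.

```python
# Simplified resistive model of the single-yarn touch sensor.
# All component values are hypothetical.
VCC = 3.3       # drive voltage (V)
R_YARN = 100e3  # total resistance of the serpentine yarn (ohms)
R_PULL = 10e3   # known pull-up resistor at each yarn endpoint (ohms)

def endpoint_voltages(x, r_touch):
    """Forward model: voltages at yarn endpoints A and B for a touch at
    fraction x (0..1) conducting to ground through r_touch (the body path)."""
    g_a = 1.0 / (R_PULL + x * R_YARN)        # supply -> touch via the A side
    g_b = 1.0 / (R_PULL + (1 - x) * R_YARN)  # supply -> touch via the B side
    g_f = 1.0 / r_touch                      # touch point -> ground
    v_touch = VCC * (g_a + g_b) / (g_a + g_b + g_f)  # node voltage at the touch
    i_a = (VCC - v_touch) * g_a
    i_b = (VCC - v_touch) * g_b
    return VCC - i_a * R_PULL, VCC - i_b * R_PULL

def locate(v_a, v_b):
    """Infer the touch position from the two endpoint voltages alone."""
    i_a = (VCC - v_a) / R_PULL
    i_b = (VCC - v_b) / R_PULL
    return (v_a - v_b + i_b * R_YARN) / (R_YARN * (i_a + i_b))
```

Because the serpentine folds back and forth across the fabric, a single linear distance along the yarn maps to a two-dimensional location on the garment.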
“The parameters of the experiments run through Simulink models before we conduct any physical testing,” said Vallett.
The ability to visualize the model’s output enabled the team to design a set of decoupling equations that track the location and pressure of a single touch from two coupled voltage signals. Because the virtual model let them make sense of the virtual sensor’s nonstandard output and identify trends in the data, they could design and test a network of sensors that accurately detected touch location using a single flexible wire, confident that the folded and overlapping wire would not obscure the input location.
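The idea of decoupling two coupled voltage signals can be illustrated with a simplified resistive model. This is a toy version with hypothetical component values, not the team’s actual formulation: from the two endpoint voltages, first solve for the touch position, then back out the contact resistance, which drops as a press gets firmer and so serves as a pressure proxy.

```python
# Toy decoupling of (location, pressure) from two coupled endpoint voltages.
# All component values are hypothetical.
VCC = 3.3       # drive voltage (V)
R_YARN = 100e3  # total resistance of the serpentine yarn (ohms)
R_PULL = 10e3   # known pull-up resistor at each yarn endpoint (ohms)

def simulate(x, r_touch):
    """Forward model: endpoint voltages for a touch at fraction x with
    contact resistance r_touch (a firmer press -> lower r_touch)."""
    g_a = 1.0 / (R_PULL + x * R_YARN)
    g_b = 1.0 / (R_PULL + (1 - x) * R_YARN)
    v_t = VCC * (g_a + g_b) / (g_a + g_b + 1.0 / r_touch)
    return VCC - (VCC - v_t) * g_a * R_PULL, VCC - (VCC - v_t) * g_b * R_PULL

def decouple(v_a, v_b):
    """Solve the two coupled voltages for (position, contact resistance)."""
    i_a = (VCC - v_a) / R_PULL               # current entering endpoint A
    i_b = (VCC - v_b) / R_PULL               # current entering endpoint B
    x = (v_a - v_b + i_b * R_YARN) / (R_YARN * (i_a + i_b))
    v_t = v_a - i_a * x * R_YARN             # voltage at the touch point
    return x, v_t / (i_a + i_b)              # contact resistance = pressure proxy
```

Round-tripping the model, simulating a touch and then decoupling the resulting voltages, recovers both the position and the contact resistance, which is the same kind of sanity check a simulation environment makes possible before any physical testing.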
Through the design process, the team also came to realize that their all-or-nothing approach to measuring touch location might need a revision as well.
“We switched from trying to be extremely precise with our location to sensing tapping and gesture,” says Dion. “And that was really interesting because it’s not like we were not thinking about it; we were just trying to pursue the best solution.”
The team is still working on perfecting Hubo’s new duds, but they hope to improve the fabric’s touch accuracy and sensitivity, expand the technology beyond the lab, and help integrate these robots with the communities they aim to assist.
“We envision a future where our robots are truly assistive devices,” says Kim. “If you want a robot around your house who can take out the garbage or do the dishes, it needs to interact with people on a much more natural level than we’re capable of right now. Sensing in all forms, but particularly touch, is a huge part of that.”
So, when Hubo is wearing his new clothing and you tap him on the shoulder, don’t be too surprised when he turns around to see what you want.