A Robotic Skin for Collision Avoidance and Affective Touch Recognition
We describe a flexible robotic skin module that can measure proximity, contact, and force, together with algorithms for detecting obstacles, human hands, and affective touch gestures. The computational requirements of the proposed architecture are in line with the capabilities of a small microcontroller, allowing a majority of the data to be processed co-located with the skin. The resulting system communicates only low-bandwidth information of interest, thereby addressing the challenge of routing high-bandwidth information and reducing the burden on a robot's central processing unit. We describe the design and manufacturing of a 10.8 × 10.8 cm² skin patch containing 64 sensors, classification results for six different affective touch gestures, and a method that allows the skin to differentiate between an approaching human hand and various obstacles.
Tactile sensing is an important sensing modality for a variety of robotic applications, including grasping and manipulation, assessing object properties, navigating cluttered environments where vision is heavily occluded, safely interacting with humans, and determining the underlying intent or emotion associated with a touch gesture. Creating robotic skins that mimic human-like tactile sensing has been explored extensively over the last several decades, investigating a large variety of possible single and multiple sensing modalities. Deploying touch-sensitive skin across a robot's entire body poses a series of system-level challenges, chief among them the selection of application-specific sensing modalities and the routing and processing of the large amount of data arising from high-bandwidth sensing across a large area. Co-locating sensing and computing elements has been proposed to mitigate challenges with high-bandwidth sensing
and communication in materials such as robotic skins. This approach allows significant in-skin processing to be performed, so that only high-level, low-bandwidth information is communicated to the host robot, shifting significant autonomy into the skin itself.

We present a skin design that is based on a simple, low-cost sensor that can measure proximity, contact, and force, and that co-locates a microcontroller with a 64-element sensor array. This allows the skin to detect and avoid collisions with arbitrary obstacles, recognize whether an approaching object is a human hand or not, and classify six social touch gestures made by a human, building on previously developed algorithms.

Providing tactile sensing capabilities in a full-body robotic skin is an active area of research. Pressure-sensitive skins have been achieved by measuring the capacitive, resistive, or optical properties of the skin, while optical sensors have been used to provide proximity sensing. Cellular skins with multimodal sensing (pressure, vibration, temperature, and proximity) have also been developed. For large robots, high-bandwidth communication is needed to transmit sensor values to the host processor; local processing of sensor measurements in individual skin modules can reduce this bandwidth requirement. The microcontrollers needed to read individual taxels in a cell can be programmed to average measurements, increasing the sampling frequency. Event-based sampling transmits sensor values only when they exceed a threshold, reducing host CPU usage. Similarly, dynamically selecting which sensors are active reduces the computational cost of feature extraction. Proximity-sensitive skins have been employed to avoid collisions with individuals or objects in the environment. Proximity sensing has also been used to actively explore an environment and identify the locations of objects in a workspace, which are then classified using force, vibration, and temperature sensing modalities.
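The event-based sampling idea described above can be sketched in a few lines. The following is a minimal, hypothetical example (the threshold value, normalization, and flat 64-element frame layout are illustrative assumptions, not the paper's actual firmware) in which only taxels whose reading exceeds a threshold are reported to the host:

```python
# Hypothetical sketch of event-based sampling for an 8x8 taxel array:
# only taxels whose reading crosses a threshold are transmitted,
# reducing the bandwidth sent from the skin module to the host robot.

THRESHOLD = 0.15  # assumed activation threshold (normalized units)

def events_from_frame(frame):
    """Return (taxel index, value) pairs for readings above THRESHOLD.

    `frame` is a flat list of 64 normalized sensor readings.
    """
    return [(i, v) for i, v in enumerate(frame) if v > THRESHOLD]

# Example: a frame in which only two taxels are activated.
frame = [0.0] * 64
frame[10] = 0.6
frame[42] = 0.2
print(events_from_frame(frame))  # -> [(10, 0.6), (42, 0.2)]
```

An idle skin patch then transmits nothing at all, which is what keeps the per-module bandwidth low enough for a small microcontroller and a shared bus.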
Using robotic skin to classify touch gestures provides a useful communication channel when interacting with humans. Identifying a set of static and dynamic gestures (e.g., horizontal, vertical, and diagonal lines, or numerals) allows a force-sensitive skin to act as a tactile user input. Recognizing social touch gestures (e.g., pat, rub) is also of interest in human-robot interaction, and several touch-sensitive skins and classification approaches have been described in the literature. The skins used in a majority of these works sense only force, though force and vibration are used in some. Proximity sensing has been used to determine the active cells for feature generation, though proximity itself does not enter the features.
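To make the gesture-classification pipeline concrete, the sketch below classifies a window of pressure frames using simple hand-crafted features and a nearest-centroid rule. The features (mean pressure, mean contact area), the contact threshold, and the centroid values are purely illustrative assumptions and not the classifier or features used in the paper:

```python
# Hypothetical sketch: classify a touch gesture from a sequence of
# flattened 8x8 pressure frames using two hand-crafted features and
# a nearest-centroid rule. Feature choices and centroid values are
# illustrative only.
import math

def features(frames):
    """Compute (mean pressure, mean contact area) over a gesture window."""
    n = len(frames)
    mean_pressure = sum(sum(f) for f in frames) / (n * 64)
    mean_area = sum(sum(1 for v in f if v > 0.1) for f in frames) / n
    return (mean_pressure, mean_area)

# Illustrative per-gesture centroids, as if learned offline.
CENTROIDS = {"pat": (0.05, 4.0), "press": (0.30, 10.0)}

def classify(frames):
    """Return the gesture label whose centroid is nearest in feature space."""
    f = features(frames)
    return min(CENTROIDS, key=lambda g: math.dist(f, CENTROIDS[g]))
```

A real pipeline would add temporal features (duration, movement of the contact centroid) to separate dynamic gestures such as rub from static ones, and, as noted above, proximity readings could be folded into the feature vector as well.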
We described the design, manufacturing, and algorithms of a novel robotic skin that combines pressure and proximity sensing for obstacle avoidance and affective touch detection. We showed that proximity information can be used not only for obstacle detection, but also to increase the accuracy of recognizing complex gestures. Obstacle detection and gesture recognition are tied together by a Bayesian reasoning framework, which allows for system-level analysis and parameter tuning using a principled approach. The resulting system is likely to scale well to large areas and has the potential to take over significant computation, augmenting the robot's autonomy. For future work, we are interested in increasing the number of affective gestures, investigating more advanced computational tools such as convolutional neural networks for their classification, and better understanding how models can become invariant to curvature.
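The kind of Bayesian reasoning referred to above can be illustrated with a minimal recursive belief update over a stream of binary proximity detections, deciding whether an approaching object is a human hand or some other obstacle. The likelihood values and the binary-detection sensor model are assumptions chosen for illustration, not the paper's measured parameters:

```python
# Hypothetical sketch of a recursive Bayes-rule update of the belief
# that an approaching object is a human hand, given a stream of binary
# proximity detections. Likelihoods below are illustrative assumptions.

P_DETECT_GIVEN_HAND = 0.8      # assumed detection rate for a hand
P_DETECT_GIVEN_OBSTACLE = 0.4  # assumed detection rate for other objects

def update_belief(p_hand, detected):
    """One Bayes-rule update of P(hand) after a detection event."""
    l_hand = P_DETECT_GIVEN_HAND if detected else 1 - P_DETECT_GIVEN_HAND
    l_obs = P_DETECT_GIVEN_OBSTACLE if detected else 1 - P_DETECT_GIVEN_OBSTACLE
    num = l_hand * p_hand
    return num / (num + l_obs * (1 - p_hand))

belief = 0.5  # uninformative prior over {hand, obstacle}
for detected in [True, True, True, False, True]:
    belief = update_belief(belief, detected)
```

Because each update is a few multiplications and one division, such a filter fits comfortably on the in-skin microcontroller, and its output (a single probability) is exactly the kind of low-bandwidth, high-level information the architecture is designed to communicate.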