For robotic agents to perform useful tasks in environments that are not specifically designed for their operation, dexterous manipulation guided by some form of tactile perception is crucial. While visual perception provides a large-scale understanding of the environment, tactile perception enables a fine-grained understanding of objects and textures. Truly useful robotic agents therefore require a tightly coupled system comprising both visual and tactile perception.
Tactile sensing hardware can be placed along a spectrum, with form factor at one end and sensing accuracy and robustness at the other. Most off-the-shelf sensors available today trade one of these qualities for the other. The tactile sensor used in this research, the BioTac SP, was selected for its anthropomorphic qualities, such as its shape and sensing mechanism, at the cost of some quality in its sensory outputs. The sensor provides a sensing surface and returns readings from 24 taxels at each timestep, along with pressure values.
We first present a novel method for contact and motion estimation from visual perception, in which we perform non-rigid registration of a human performing actions and compute dense motion-estimation trajectories. These are used to compute topological scene changes and are refined to obtain object and contact segmentations. We then ground the contact points and motion trajectories in an intermediate action graph, which can then be executed by a robot agent.
Second, we introduce the concept of computational tactile flow, inspired by fMRI studies on humans showing that the same parts of the brain that respond to optical motion stimuli also respond to tactile stimuli. We mathematically model the BioTac SP sensor and interpolate surfaces in two and three dimensions, on which we compute tactile flow fields. We demonstrate these flow fields on various surfaces and suggest several useful applications of tactile flow.
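As a rough illustration only (not the dissertation's actual implementation), tactile flow can be sketched as optical-flow-style estimation on an interpolated taxel grid: scattered taxel readings are resampled onto a regular 2-D surface, and a gradient-based least-squares step recovers a flow vector between consecutive frames. The grid resolution, the single-window Lucas-Kanade-style solve, and all function names below are illustrative assumptions:

```python
import numpy as np
from scipy.interpolate import griddata

def taxels_to_grid(xy, values, res=16):
    """Interpolate scattered taxel readings (positions in [0,1]^2)
    onto a regular res x res grid; zeros outside the taxel hull."""
    gx, gy = np.mgrid[0:1:complex(res), 0:1:complex(res)]
    return griddata(xy, values, (gx, gy), method="linear", fill_value=0.0)

def tactile_flow(frame_prev, frame_next):
    """Single global flow estimate from spatio-temporal gradients:
    a least-squares solve of Ix*u + Iy*v = -It over the whole grid
    (i.e. Lucas-Kanade with one window covering the surface)."""
    Ix, Iy = np.gradient(frame_prev)
    It = frame_next - frame_prev
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # (u, v) in grid cells per frame
```

A per-window version of the same solve would yield a dense flow field rather than a single vector; the global solve is kept here only to keep the sketch short.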
Next, we apply tactile feedback in a novel controller that can grasp objects without any prior knowledge of their shape, material, or weight. We use tactile flow to detect slippage during grasping and adjust the finger forces to maintain a stable grasp during motion. We demonstrate success on transparent and soft, deformable objects, alongside other regularly shaped samples.
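The force-adjustment idea can be sketched as a simple slip-triggered update rule: when the measured tactile-flow speed exceeds a slip threshold, the commanded fingertip force is increased proportionally, capped at a safe maximum. The threshold, gain, cap, and function name are all hypothetical values for illustration, not the controller's actual parameters:

```python
def update_grip_force(force, slip_speed, slip_thresh=0.5, gain=0.2, f_max=10.0):
    """Increase the commanded fingertip force when tactile-flow slip
    exceeds a threshold; otherwise hold the current force.
    All gains and limits here are illustrative, not calibrated values."""
    if slip_speed > slip_thresh:
        force = min(force + gain * (slip_speed - slip_thresh), f_max)
    return force
```

In a closed loop, increasing force raises friction at the contact, which in turn reduces the measured slip speed, so the rule settles at the lightest force that holds the object, which is what makes it safe for soft, deformable items.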
Lastly, we take a different approach to processing tactile data and compute tactile events, taking inspiration from the neuromorphic computing literature. We compute spatio-temporal gradients on the raw tactile data to generate event surfaces, which are more robust and reduce sensor noise. This intermediate representation is then used to track contact regions over the BioTac SP sensor skin, allowing us to detect slippage, track spatial edge contours, and estimate the magnitude of applied forces.
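The event idea can be sketched, in the style of neuromorphic ON/OFF events, as thresholding the temporal gradient of each taxel: a signed event fires only where the signal changed by more than a noise threshold, so small fluctuations are suppressed. The threshold and function name below are illustrative assumptions, not the dissertation's actual pipeline:

```python
import numpy as np

def tactile_events(prev, curr, thresh=2.0):
    """Per-taxel ON/OFF events from the temporal gradient: +1 where the
    signal rose by more than `thresh`, -1 where it fell by more than
    `thresh`, and 0 otherwise. Thresholding suppresses sensor noise."""
    dt = curr - prev
    events = np.zeros_like(dt, dtype=int)
    events[dt > thresh] = 1
    events[dt < -thresh] = -1
    return events
```

Accumulating such events over a short window yields a sparse event surface on which contact regions and edge contours can be tracked far more cheaply than on the raw signal.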
Kanishka Ganguly is a PhD candidate in Computer Science, advised by Prof. Yiannis Aloimonos and Dr. Cornelia Fermuller in the Perception and Robotics Group. Prior to joining the PhD program, Kanishka received his Master of Engineering in Robotics from UMD in 2017, and his Bachelor's degree in Computer Science and Engineering from the Birla Institute of Technology, Mesra in 2015.
His research focuses on robot grasping with tactile feedback, using the Shadow Dexterous Hand equipped with BioTac SP sensors. He has published in robotics venues including EuCog, RA-L, Frontiers, and IROS, as well as in Springer journals.
Kanishka is a recipient of The Dean's Fellowship, The Graduate Student Summer Research Fellowship, and the Maryland Robotics Center Graduate Fellowship. Apart from his academic endeavours, Kanishka has interned at Logitech (2018), Amazon Robotics (2019) and Magic Leap (2022).