UCSD Engineers Develop Low-Cost Robotic Hand That Rotates Objects by Touch Alone

Researchers at the University of California San Diego (UCSD) have built a robotic hand that manipulates objects using touch alone, with no need for vision. The hand can gently rotate a variety of objects, including small toys, cans, and even fruits and vegetables, without damaging them, relying entirely on tactile information.

The UCSD engineers believe the work could lead to robots that can operate in darkness or other low-light conditions. To build the system, the team attached 16 touch sensors, each costing about $12, to the palm and fingers of a four-fingered robotic hand. Each sensor detects only whether an object is in contact with it.

The UCSD team’s approach is distinctive in its use of low-cost, low-resolution touch sensors that generate simple binary signals across a large area of the robotic hand. This contrasts sharply with other methods, which rely on expensive, high-resolution touch sensors concentrated at the fingertips.

The research was led by Xiaolong Wang, a professor of electrical and computer engineering at UCSD. Wang pointed out several problems with the high-cost approach: with only a small number of sensors, the chance that any of them touches the object is low, which limits the system's ability to sense it, and the texture information that high-resolution sensors produce is difficult and expensive to simulate.

Wang explained the team's reasoning: “We demonstrate that intricate details about an object’s texture aren’t necessary for this task. Simple binary signals indicating whether the sensors have made contact with the object or not are sufficient, and these are much easier to simulate and implement in real-world applications.”
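To make that concrete, the binary signal from each sensor can be thought of as a simple threshold on whatever raw quantity the sensor measures. The Python sketch below illustrates the idea; the threshold value and the use of raw force readings are illustrative assumptions, not details reported by the team.

```python
import numpy as np

NUM_SENSORS = 16          # sensors on the palm and fingers, per the article
CONTACT_THRESHOLD = 0.1   # hypothetical force threshold; not from the article

def binarize_touch(raw_forces: np.ndarray) -> np.ndarray:
    """Collapse raw readings into binary contact signals:
    1 = sensor is touching the object, 0 = no contact."""
    return (raw_forces > CONTACT_THRESHOLD).astype(np.float32)

# Simulated raw readings from the 16 sensors
raw = np.random.uniform(0.0, 0.5, size=NUM_SENSORS)
print(binarize_touch(raw))  # e.g. [1. 0. 0. 1. ...], a 16-dim contact vector
```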

The UCSD robotic hand stands out due to its extensive coverage of binary touch sensors. This provides enough information about an object’s 3D structure and orientation to successfully rotate it without requiring visual input. The UCSD team trained the system by running simulations of a virtual robotic hand manipulating various objects, including those with irregular shapes.
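The article does not describe the training procedure in detail, but simulation-based training of this kind generally means rolling out many episodes of a simulated hand and recording observations, actions, and rewards for a learning algorithm. The sketch below is a toy stand-in, with a placeholder environment and a random policy in place of the real physics simulator and learned controller; every class, reward, and dimension here is hypothetical and included only to show the structure of a rollout.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyRotationEnv:
    """Placeholder for the physics simulator: observations pack
    16 binary contacts, 16 joint angles, and the previous action."""
    def reset(self):
        self.joints = np.zeros(16, dtype=np.float32)
        return self._obs(np.zeros(16, dtype=np.float32))

    def step(self, action):
        self.joints = np.clip(self.joints + action, -1.0, 1.0)
        reward = float(rng.uniform())  # placeholder for rotation progress
        return self._obs(action), reward, False

    def _obs(self, prev_action):
        contacts = rng.integers(0, 2, 16).astype(np.float32)  # fake contact pattern
        return np.concatenate([contacts, self.joints, prev_action])

def rollout(policy, env, steps=50):
    """Collect one simulated trajectory of (obs, action, reward) tuples."""
    obs, traj = env.reset(), []
    for _ in range(steps):
        action = policy(obs)
        obs, reward, done = env.step(action)
        traj.append((obs, action, reward))
        if done:  # e.g. object dropped or rotation target reached
            break
    return traj

# One episode with a random policy standing in for the trained controller
traj = rollout(lambda obs: 0.01 * rng.standard_normal(16), ToyRotationEnv())
```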

As the hand rotates an object, the system tracks which sensors are in contact with it, along with the current positions of the hand's joints and its previous actions. Based on this data, the system commands the hand's next joint movements. The team then tested the system on a real-life robotic hand with objects the system had not previously encountered. The hand successfully rotated a wide range of objects, including a tomato, a pepper, a can of peanut butter, and a toy rubber duck, without losing its grip.
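A minimal sketch of that observation-to-action mapping, assuming a 16-joint hand (a plausible figure for a four-fingered hand, but not stated in the article) and substituting a random linear map for the trained controller, could look like this:

```python
import numpy as np

NUM_SENSORS = 16  # binary contact sensors, per the article
NUM_JOINTS = 16   # assumed joint count; not stated in the article

rng = np.random.default_rng(0)
# Hypothetical stand-in for a trained policy network: one random linear layer.
W = rng.normal(scale=0.01, size=(NUM_JOINTS, NUM_SENSORS + 2 * NUM_JOINTS))

def policy_step(contacts, joint_positions, prev_action):
    """One control step: map (binary contacts, joint angles, previous action)
    to the next joint-position targets, mirroring the inputs the article lists."""
    obs = np.concatenate([contacts, joint_positions, prev_action])
    return W @ obs  # a real system would apply the trained network here

contacts = rng.integers(0, 2, NUM_SENSORS).astype(np.float32)
joints = rng.uniform(-0.5, 0.5, NUM_JOINTS).astype(np.float32)
next_targets = policy_step(contacts, joints, np.zeros(NUM_JOINTS))
```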

While objects with more complex shapes took longer to rotate, the work marks a significant step forward for robotic manipulation. The team is now focused on extending the approach to more complex manipulation tasks such as catching, throwing, and juggling.

“In-hand manipulation is a very common skill that we humans possess, but it is extremely complex for robots to master,” said Wang. “If we can equip robots with this skill, it will significantly broaden the range of tasks they can perform.” Beyond robotics research, the capability could benefit industries such as electronics manufacturing, where intricate tasks demand delicate manipulation.