Jul 6, 2017 | By Benedict
Fingervision, a partially 3D printed robotic sensing system, uses a medley of affordable off-the-shelf parts to create an artificial sense of touch. This sense of touch could allow a larger robotic system to perform delicate manual tasks such as peeling a banana.
Akihiko Yamaguchi and Christopher G. Atkeson’s title for their 2016 robotics research paper on “Fingervision” might be one of the best I’ve ever heard. And “Combining Finger Vision and Optical Tactile Sensing: Reducing and Handling Errors While Cutting Vegetables” does exactly what it says on the tin.
Yamaguchi and Atkeson, two Carnegie Mellon roboticists, have developed a robotic system that uses camera-equipped fingers to “feel” objects. It might not sound like much—and the robot sure doesn’t look like much—but this sensing power allows Fingervision to do some pretty amazing things—chopping vegetables included.
Using a combination of off-the-shelf parts, the two Carnegie Mellon experts attached their Fingervision system to a Baxter industrial robot, augmenting the machine with a pair of 3D printed grippers covered in clear silicone wrap.
Inside these grippers, $50 RGB cameras keep an eye on a pattern of black dots marked onto the gripper’s artificial skin, which is pulled tight over the hollow 3D printed frame. When the robot grips an object, these black dots distort, providing the “eyes” of the system with vital information about the object being handled.
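The core idea is simple enough to sketch in a few lines. Here is a minimal illustration of how displacement of the tracked dots could be summarized into tactile cues, assuming the dot centers have already been extracted from the camera frames (the rest positions, units, and the force/shear interpretation below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical rest positions of the marker dots (in pixels), as seen by
# the in-finger camera when nothing is being gripped.
REST = np.array([[10.0, 10.0], [30.0, 10.0], [10.0, 30.0], [30.0, 30.0]])

def grip_signal(current, rest=REST):
    """Summarize how far the dots have shifted from their rest pattern.

    The mean displacement magnitude is a rough proxy for normal force;
    the mean displacement vector hints at tangential (shear) load,
    which is what a slipping object would produce.
    """
    disp = current - rest                                 # per-dot shift
    magnitude = float(np.linalg.norm(disp, axis=1).mean())
    shear = disp.mean(axis=0)                             # net drag direction
    return magnitude, shear

# Example: every dot dragged 2 px sideways, as if the object were slipping.
pressed = REST + np.array([2.0, 0.0])
mag, shear = grip_signal(pressed)
print(round(mag, 2), shear.round(2).tolist())  # 2.0 [2.0, 0.0]
```

In the real system the dot extraction itself would be done with standard blob detection on each camera frame; the point here is only that the distorted dot pattern carries enough information to estimate both how hard and in what direction the skin is being pushed.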
According to Yamaguchi and Atkeson, this finger sensing ability allows Fingervision to determine when it is holding a fragile object, as well as when the object in its grip is slipping. In the former case, the robot relaxes its grip, squeezing only as hard as it needs to hold the object securely; in the latter case, it tightens its hold to stop the slide.
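That two-way behavior amounts to a small feedback rule. The toy function below shows one way such a rule could look; the signal names, thresholds, and step size are all hypothetical, not values from the paper:

```python
def adjust_grip(force, slip_rate,
                fragile_threshold=0.5, slip_threshold=0.1, step=0.05):
    """Return a grip-force adjustment from two tactile cues.

    force: estimated normal force on the skin (arbitrary units)
    slip_rate: how fast the dot pattern is drifting (slip proxy)
    All names and thresholds are illustrative.
    """
    if slip_rate > slip_threshold:
        return +step   # object is slipping: squeeze harder
    if force > fragile_threshold:
        return -step   # pressing too hard on a delicate object: ease off
    return 0.0         # grip is fine: hold steady
```

Run each control cycle, a rule like this converges on the lightest grip that still keeps the object from slipping, which is exactly the behavior the researchers describe.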
Other cool features of the Fingervision system include an automatic “handover” process, in which the system can sense when a human is trying to retrieve an object from the grasp of the grippers. When this happens, the system will relinquish its grip just at the moment the human takes the object.
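One plausible way to implement that handover trigger is to release only when the slip signal persists over several readings, which distinguishes a deliberate human pull from a momentary jostle. This is a hypothetical sketch, not the authors' method:

```python
def should_release(slip_history, window=5, threshold=0.1):
    """Release the gripper when slip has persisted for `window`
    consecutive readings, suggesting someone is deliberately
    pulling the object away (values are illustrative)."""
    recent = slip_history[-window:]
    return len(recent) == window and all(s > threshold for s in recent)
```

A brief bump in the slip signal leaves the grip closed; only a sustained pull opens it, so the object changes hands exactly when the human takes hold of it.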
Yamaguchi and Atkeson think Fingervision has big potential for bringing robots into the realm of tactility. And excitingly, the Carnegie Mellon roboticists say that Fingervision could eventually scale up to something like full-body touch sensing, as long as they can afford it.
“We want to cover the whole body with this kind of sensor, so the price is very important,” Yamaguchi told TechCrunch. The roboticist added that improved sensitivity could be vital in getting Fingervision implemented in human work environments.
(Images: TechCrunch)
Excitingly for all the tinkerers out there, the Carnegie Mellon duo plans to make the Fingervision project open source. Until then, you can read about the robot’s full capabilities here.