HERB Gets a New Eye

NREC CHIMP

Background

HERB 2.0, also known as the “robot butler”, is an ongoing project in the Personal Robotics Lab at Carnegie Mellon University (CMU). Velodyne’s Frank Bertini recently had the opportunity to visit CMU and experiment with integrating the VLP-16 Puck LiDAR sensor onto HERB’s head.

David Butterworth, a graduate student at the CMU Robotics Institute, has been developing and testing new methods of robotic perception on the HERB (Home Exploring Robot Butler) platform. The Velodyne LiDAR was used to scan a standard set of household items, including a box of cereal, a bottle of laundry detergent, and a toy airplane. These everyday items belong to a common set of test objects used across many robotics programs. Because the research community works from a shared set of test products, it is easier for teams to collaborate and exchange data, algorithms, and code.

Working with the Puck

In the first scenario, the Puck was mounted on HERB with a fixed, rigid screw mount. This configuration did not yield very interesting results because HERB itself had to move to create a scanning effect. The LiDAR unit was then mounted on an external encoder, which added one more axis of rotation.

The encoder can be seen at the base of the structure in the accompanying picture and video. LEGO Mindstorms pieces were used to build the structure, while a specially designed 3D-printed case held the Puck in place. By rotating the sensor on its vertical axis, David and Frank were able to scan a full spherical, 360-degree view of the room. The implementation used standard ROS (Robot Operating System) drivers and libraries to control the rotation and fuse the data.
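To illustrate how that fusion step can work, the sketch below accumulates Puck scans into a single cloud while reading the mount's rotation angle. It assumes the standard /velodyne_points topic from the ROS velodyne driver; the /encoder_angle topic, the puck_mount frame, and the node itself are illustrative placeholders rather than the lab's actual code, and it ignores any offset between the sensor origin and the rotation axis.

#!/usr/bin/env python
# Minimal sketch: accumulate Puck scans into one room-sized cloud while an
# external encoder rotates the sensor. Topic and frame names are assumptions.
import math
import numpy as np
import rospy
from std_msgs.msg import Float32
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

class RotatingPuckFuser(object):
    def __init__(self):
        self.encoder_angle = 0.0     # latest encoder reading, radians
        self.accumulated = []        # fused points in the fixed mount frame
        rospy.Subscriber('/encoder_angle', Float32, self.on_angle)   # hypothetical topic
        rospy.Subscriber('/velodyne_points', PointCloud2, self.on_cloud)
        self.pub = rospy.Publisher('/fused_cloud', PointCloud2, queue_size=1)

    def on_angle(self, msg):
        self.encoder_angle = msg.data

    def on_cloud(self, cloud_msg):
        # Rotate the scan about the mount's vertical (z) axis by the current
        # encoder angle, mapping sensor-frame points into the fixed frame.
        c, s = math.cos(self.encoder_angle), math.sin(self.encoder_angle)
        rot = np.array([[c, -s, 0.0],
                        [s,  c, 0.0],
                        [0.0, 0.0, 1.0]])
        pts = np.array(list(pc2.read_points(cloud_msg,
                                            field_names=('x', 'y', 'z'),
                                            skip_nans=True)))
        if pts.size == 0:
            return
        self.accumulated.extend(pts.dot(rot.T).tolist())

        # Republish everything gathered so far (grows without bound; a real
        # node would downsample or cap the buffer).
        header = cloud_msg.header
        header.frame_id = 'puck_mount'   # assumed fixed frame name
        self.pub.publish(pc2.create_cloud_xyz32(header, self.accumulated))

if __name__ == '__main__':
    rospy.init_node('rotating_puck_fuser')
    RotatingPuckFuser()
    rospy.spin()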

HERB with base

Preliminary results were promising. The Velodyne Puck produces 300,000 points of data per second, and the data is transmitted in real time. HERB was able to capture rich detail about his surroundings. The hope is that, with this enhanced data, HERB will be able to perceive and identify items much more easily. Accurate perception is a critical first step, because it dictates the manipulation programming sequence that follows. Once HERB can recognize a cereal box, he will know how to grasp it and then to start looking for a bowl and some milk.
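As a rough illustration of why that recognition step matters, the sketch below matches a segmented cluster of Puck points against the standard test objects using nothing more than bounding-box extents. This is not HERB's actual perception pipeline, and the object dimensions are illustrative guesses; a real system would use far richer features, but the flow is the same: the perception result selects the grasp and the behavior that follow.

# Illustrative sketch only (not HERB's actual pipeline): match a segmented
# cluster of Puck points to a standard test object by bounding-box size.
import numpy as np

# Approximate object dimensions in meters (illustrative values, not measured).
KNOWN_OBJECTS = {
    'cereal box':       (0.06, 0.19, 0.28),
    'detergent bottle': (0.10, 0.17, 0.26),
    'toy airplane':     (0.08, 0.25, 0.30),
}

def identify(cluster_points):
    """Return the known object whose sorted bounding-box extents best match
    the cluster's extents, along with the matching error."""
    extents = np.sort(cluster_points.max(axis=0) - cluster_points.min(axis=0))
    best_name, best_err = None, float('inf')
    for name, dims in KNOWN_OBJECTS.items():
        err = np.linalg.norm(extents - np.sort(np.array(dims)))
        if err < best_err:
            best_name, best_err = name, err
    return best_name, best_err

if __name__ == '__main__':
    # Fake cluster roughly the size of a cereal box, for demonstration only.
    rng = np.random.default_rng(0)
    cluster = rng.uniform([0, 0, 0], [0.06, 0.19, 0.28], size=(500, 3))
    print(identify(cluster))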

Collision Detection

Along with perception analysis, the Velodyne LiDAR can also act as a collision detection sensor. HERB currently uses a few different sensors and cameras, all facing forward. With that implementation, the robot's arms are not entirely within the FoV (field of view). This could create a potentially unsafe situation if a human approaches HERB from the back or the side while the arms are in motion. With the Puck's 360-degree FoV, HERB is able to see in all directions, including where his arms are moving and whether they might strike something in the environment.
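A minimal sketch of how such a check could look is below, assuming the arm is approximated by line segments between joint positions and the Puck's returns are available as an Nx3 array in the same frame. The safety margin and the arm model are illustrative assumptions, not HERB's actual safety logic.

# Minimal sketch of a LiDAR-based safety check (an assumption, not HERB's
# controller): flag motion if any point in the 360-degree scan comes within
# a safety margin of the arm, modeled here as line segments between joints.
import numpy as np

SAFETY_MARGIN = 0.15  # meters; illustrative value

def point_to_segment_distance(points, a, b):
    """Distance from each point (Nx3) to the segment from a to b."""
    ab = b - a
    t = np.clip(((points - a) @ ab) / (ab @ ab), 0.0, 1.0)
    closest = a + t[:, None] * ab
    return np.linalg.norm(points - closest, axis=1)

def arm_is_clear(scan_points, joint_positions):
    """joint_positions: ordered 3D joint positions from shoulder to hand."""
    for a, b in zip(joint_positions[:-1], joint_positions[1:]):
        if point_to_segment_distance(scan_points, a, b).min() < SAFETY_MARGIN:
            return False
    return True

if __name__ == '__main__':
    scan = np.random.uniform(-2.0, 2.0, size=(10000, 3))   # fake Puck returns
    arm = [np.array([0.0, 0.0, 1.0]),                       # shoulder
           np.array([0.4, 0.0, 1.0]),                       # elbow
           np.array([0.8, 0.0, 0.9])]                       # hand
    print('clear to move' if arm_is_clear(scan, arm) else 'possible collision')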

HERB is just one of many applications that could benefit from a 360-degree LiDAR sensor. While in Pittsburgh, Frank also visited the National Robotics Engineering Center (NREC) and spoke with the team responsible for developing CHIMP (CMU Highly Intelligent Mobile Platform). CHIMP competed in the 2015 DARPA Robotics Challenge Finals, where it was tasked with navigating a search-and-rescue scenario.

 


Please contact Frank Bertini (fbertini@velodyne.com) for additional information.
