DARPA Fast Lightweight Autonomy Program

A DARPA program with the goal of developing quadrotors that can fly at 20 m/s and navigate challenging outdoor forest and cluttered indoor environments. Our team primarily consists of researchers at the University of Pennsylvania, the University of Zurich and the Open Source Robotics Foundation. Currently, I am working on algorithms for event-based cameras that could allow for high-speed flight in high-dynamic-range conditions (indoor-outdoor transitions, for example); in the past I worked on real-time robust target detection.

Coursera Robotics Specialization Capstone

The Capstone course for the Robotics Specialization on Coursera is split into two tracks. I developed the hardware track, where students who purchase a basic robot kit (wheeled base, Raspberry Pi, IMU, camera and battery) learn to program a basic position controller for the robot, detect AprilTags, implement an EKF to fuse the AprilTag measurements with IMU information, and finally autonomously navigate a user-generated world, populated with a known map of AprilTags, given a set of waypoints. My work involved spec'ing parts, recording lectures, creating assignments and code, and handling grading and support.
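The fusion step students implement can be sketched roughly as follows. This is a minimal illustration, not the course's actual assignment code: it assumes a planar state [x, y, yaw], IMU-derived body velocity and yaw rate as the control input, and AprilTag detections that directly observe the full pose.

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Propagate the state [x, y, yaw] using IMU-derived body-frame
    velocity v and yaw rate w (a simple unicycle motion model)."""
    v, w = u
    yaw = x[2]
    x_pred = x + dt * np.array([v * np.cos(yaw), v * np.sin(yaw), w])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -dt * v * np.sin(yaw)],
                  [0.0, 1.0,  dt * v * np.cos(yaw)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the state with an AprilTag pose measurement z = [x, y, yaw]."""
    H = np.eye(3)                  # the tag detection observes the full pose
    y = z - H @ x                  # innovation
    S = H @ P @ H.T + R            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S) # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P
```

In the course the measurement model is more involved (the tag is observed in the camera frame and transformed through the known map), but the predict/update structure is the same.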

Final project from a student of the course. Thanks to Francois Delgove for the video.

Laboratory Automation with Baxter

This is a joint long-term project with GlaxoSmithKline to use the Baxter robot from Rethink Robotics to automate repetitive laboratory tasks such as pouring liquids and mixing chemicals. At present, Baxter is able to autonomously detect transparent flasks and beakers, and perform pick-and-place and peg-in-hole procedures.

Our pipeline is, in summary:

  1. Given an action (such as place a test tube in a rack, or pick and place a flask), detect the desired object.
  2. Plan a safe trajectory to a neighborhood above the object using the MoveIt! Motion Planning Framework.
  3. Using eye-to-hand visual servoing, accurately navigate the gripper to a pre-defined grasping pose relative to the detected object.
  4. Grasp the object, and detect the desired target pose.
  5. Plan a safe trajectory to a neighborhood above the target pose using MoveIt!.
  6. Visually servo the object to the desired pose, and gently set it down.
Note that the video is from more than a year ago, and the latest pipeline has improved significantly.
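The core of the servoing steps (3 and 6) can be sketched as a simple proportional position-based visual-servoing law. This is an illustrative sketch, not our actual controller: it assumes 4x4 homogeneous transforms for the current and goal gripper poses (as estimated from the detected object) and returns a commanded twist.

```python
import numpy as np

def pbvs_step(T_goal, T_current, gain=0.5):
    """One position-based visual-servoing step: compute linear and
    angular velocity commands that drive the current gripper pose
    toward the goal pose, both given as 4x4 homogeneous transforms."""
    # translation error in the base frame
    t_err = T_goal[:3, 3] - T_current[:3, 3]
    # rotation error as an axis-angle vector, from R_err = R_goal R_cur^T
    R_err = T_goal[:3, :3] @ T_current[:3, :3].T
    angle = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        w_err = np.zeros(3)
    else:
        axis = np.array([R_err[2, 1] - R_err[1, 2],
                         R_err[0, 2] - R_err[2, 0],
                         R_err[1, 0] - R_err[0, 1]]) / (2.0 * np.sin(angle))
        w_err = angle * axis
    # proportional control on both errors
    return gain * t_err, gain * w_err
```

Running this in a loop, with the goal pose re-estimated from vision at each step, converges on the grasping pose even with moderate calibration error, which is why we servo the final approach rather than relying on the planned trajectory alone.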

DARPA Robotics Challenge Finals

The DARPA Robotics Challenge simulated a natural disaster scenario, tasking teams with semi-autonomously controlling a humanoid robot through a series of challenging tasks:

  1. Drive a vehicle around a number of obstacles
  2. Egress the vehicle
  3. Open a door and enter through the door
  4. Turn a valve
  5. Drill a hole in a wall
  6. Perform a mystery task
  7. Walk over a field of uneven cinder blocks or clear a path of debris
  8. Climb a set of stairs

My main role was to provide accurate obstacle detection and visualization for the driving task. Using a stereo camera mounted on the head of the ATLAS robot, we generated a depth map of the road in front of the vehicle and performed ground-plane segmentation to extract the obstacles above the ground. From the resulting 3D map, we generated a top-down view of the course, allowing for accurate driving and obstacle avoidance even in extremely high-latency situations.
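The ground-plane segmentation can be illustrated with a standard RANSAC plane fit. This is a minimal sketch of the idea, not our actual pipeline: it assumes the stereo depth map has already been back-projected into an Nx3 point cloud, and splits it into ground inliers and everything above the plane.

```python
import numpy as np

def segment_ground(points, n_iters=200, dist_thresh=0.05, seed=0):
    """RANSAC plane fit on an Nx3 point cloud. Returns boolean masks
    (ground, obstacles) where ground marks inliers of the best plane."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        # fit a candidate plane through 3 random points
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (nearly collinear) sample
        n = n / norm
        # point-to-plane distances; keep the plane with most inliers
        dist = np.abs((points - p0) @ n)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers, ~best_inliers
```

Projecting the obstacle points onto the fitted plane then yields the top-down occupancy view used for driving.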