Multipolicy Decision-Making for Autonomous Driving via Changepoint-based Behavior Prediction

ICRA 2015
MPDM: Multipolicy Decision-Making in Dynamic, Uncertain Environments for Autonomous Driving

ICRA 2015
M3RSM: Many-to-Many Multi-Resolution Scan Matching

Online mapping and perception algorithms for multi-robot teams operating in urban environments


IROS 2014
Locally-weighted Homographies for Calibration of Imaging Systems

Project Spotlight



UM:SmartCarts is an MTC-funded project to develop a prototype transportation-as-a-service system, allowing students to summon an autonomous car via their phones. SmartCarts will leverage state-of-the-art sensing and reasoning technology to provide a safe and reliable means of transport around the UM campus. With this system in place, researchers will be able to study human factors in autonomous transportation systems.

Solar Pink Pong

We are currently collaborating on Solar Pink Pong, a hybrid street and video game. Players can interact with an animated pink sunlight reflection on the street using their bodies and shadows. The device that makes this game possible runs autonomously and completely off the grid. Solar Pink Pong aims to push the boundaries of video game culture and technology beyond the living room, changing how people interact with outdoor environments and see daylight through the lens of technology.

Check out these videos to see Solar Pink Pong in action in Ann Arbor and Dubai.

AprilTags in Space

(Photo credit: NASA) NASA and Microsoft have collaborated to develop Sidekick, a system designed to aid astronauts working aboard the International Space Station. The key technology in Sidekick is Microsoft’s HoloLens, an augmented reality headset that uses AprilTags for localization within the crew cabin. This marks the first time APRIL Lab technology has been used in outer space!

Second-Generation MAGIC Robot Platform

Magic 2 Robot

In the past year, we have been building the next generation of the robot platform we used to win the MAGIC 2010 competition. The latest iteration has a number of improvements over the previous model:

  • An upgraded drivetrain allows the robot to drive more quickly and quietly.

  • All on-board electronics have been integrated into a single circuit board, de-cluttering the interior of the robot and making maintenance easier.

  • With new all-terrain tires, the robot can drive over small obstacles and operate in muddy areas.

  • We are developing several additions to the existing sensor suite, including an omnidirectional camera, a high-frequency short-range obstacle detector, and an LED display panel.

Muscatatuck Urban Training Center

Tunnel exploration

We recently visited the Muscatatuck Urban Training Center to field-test the latest advances in our multi-robot exploration system. Our robots traversed underground tunnels, searched ramshackle villages, and mapped concrete labyrinths.

AprilTag 2.0 released

Robots with AprilTags

A new version of the AprilTag library (2014-10-20) has been released! It’s written in pure C, is dramatically faster than the old version, and generally achieves both a lower false-positive rate and a higher true-positive rate.

MAEBot mobile robot platform

Maebot front view

MAEBot is a mobile robot with rich sensor capabilities designed to provide students and researchers with a robust, open, affordable platform on which to explore the concepts of robot control, localization, kinematics, and machine vision.

We are in the process of releasing our design files and code in order to benefit the wider robotics community. If you are interested, drop us a line — we’d love to hear from you!

Next-Generation Vehicle

NGV teaser

In partnership with Ford and State Farm Insurance, we have begun development of a next-generation automated vehicle. On the University of Michigan side, the principal investigators are Ryan Eustice and Edwin Olson. Michigan is taking a leading role in sensing and decision-making.