Extracting Utility-Scale Energy from Municipal Waste
APRIL Lab is collaborating with faculty from the UM Department of Civil and Environmental Engineering under an NSF Cyber-Innovation for Sustainability Science and Engineering (CyberSEES) grant to engineer a transformative means of extracting utility-scale energy from waste using next-generation facilities to be termed Sustainable Energy Reactor Facilities (SERFs). Current landfills leak biogases such as methane and carbon dioxide into the atmosphere, where they act as greenhouse gases. These emissions have a significant environmental impact, yet careful management could both reduce their harm and yield a source of renewable energy. In this project, we seek a better understanding of the physical, chemical, and biological processes that cause landfill methane leaks, using environmental sensing and modeling. We are currently deploying distributed sensor networks and autonomous robotic systems in existing landfills to monitor methane production. With better models of landfill methane production, engineers can design SERFs that maximize energy recovery and minimize environmental impact.
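To give a flavor of the kind of modeling involved, the sketch below computes methane generation for a single batch of waste using a simple first-order-decay model (the form used in estimation tools such as EPA's LandGEM). The function name and the default parameter values are illustrative assumptions, not the project's actual model or code.

```python
import math

def methane_rate(mass_tonnes, t_years, k=0.05, L0=100.0):
    """Estimated methane generation rate (m^3/year) from one batch of
    waste of age t_years, under a first-order decay model.

    k:  decay rate constant (1/year)       -- illustrative default
    L0: methane generation potential       -- illustrative default,
        in m^3 of methane per tonne of waste
    """
    return k * L0 * mass_tonnes * math.exp(-k * t_years)

# Example: 1000 tonnes of freshly placed waste generates methane fastest
# at t = 0, and the rate decays exponentially as the waste ages.
rate_now = methane_rate(1000.0, 0.0)
rate_in_10y = methane_rate(1000.0, 10.0)
```

Sensor-network measurements of actual surface emissions can then be compared against model predictions like these to locate leaks and refine the decay parameters.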
Solar Pink Pong
We are currently collaborating on Solar Pink Pong, a hybrid of a street game and a video game. Players interact with an animated pink sunlight reflection on the street using their bodies and shadows. The device that makes this game possible works autonomously and completely off the grid. Solar Pink Pong aims to push the boundaries of video game culture and technology beyond the living room, changing the way humans interact with outdoor environments and see daylight through the lens of technology.
AprilTags in Space
(Photo Credit: NASA) NASA and Microsoft have collaborated to develop Sidekick, a system designed to aid astronauts working aboard the International Space Station. The key technology in Sidekick is Microsoft’s HoloLens, an augmented reality headset that uses AprilTags for localization within the crew cabin. This marks the first time APRIL Lab technology has been used in outer space!
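The core idea behind fiducial-based localization is simple: if a tag's pose in the world is known and a detector reports the tag's pose relative to the camera, composing the first with the inverse of the second yields the camera's pose in the world. The sketch below shows this in 2D (SE(2)) with pure Python; the function names, frame names, and example poses are illustrative, not Sidekick's or AprilTag's actual API.

```python
import math

def se2_compose(a, b):
    """Compose two SE(2) poses a . b, each given as (x, y, theta)."""
    ax, ay, at = a
    bx, by, bt = b
    c, s = math.cos(at), math.sin(at)
    return (ax + c * bx - s * by, ay + s * bx + c * by, at + bt)

def se2_inverse(p):
    """Invert an SE(2) pose: se2_compose(p, se2_inverse(p)) is identity."""
    x, y, t = p
    c, s = math.cos(t), math.sin(t)
    return (-c * x - s * y, s * x - c * y, -t)

# A tag at a known pose in the world frame, and the same tag as seen
# by the camera (here: camera coincides with the tag, same heading):
tag_in_world = (2.0, 1.0, math.pi / 2)
tag_in_camera = (0.0, 0.0, 0.0)
camera_in_world = se2_compose(tag_in_world, se2_inverse(tag_in_camera))
```

In practice the same composition is carried out in 3D (SE(3)), with the tag-in-camera pose recovered from the detected corner points of the tag.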
Second-Generation MAGIC Robot Platform
In the past year, we have been building the next generation of the robot platform we used to win the MAGIC 2010 competition. The latest iteration has a number of improvements over the previous model:
An upgraded drivetrain allows the robot to drive more quickly and quietly.
All on-board electronics have been integrated into a single circuit board, de-cluttering the interior of the robot and making maintenance easier.
With new all-terrain tires, the robot can drive over small obstacles and operate in muddy areas.
We are developing several additions to the existing sensor suite, including an omnidirectional camera, a high-frequency short-range obstacle detector, and an LED display panel.
Muscatatuck Urban Training Center
We recently visited the Muscatatuck Urban Training Center to field-test the latest advances in our multi-robot exploration system. Our robots traversed underground tunnels, searched ramshackle villages, and mapped concrete labyrinths.
AprilTag 2.0 released
MAEBot mobile robot platform
MAEBot is a mobile robot with rich sensor capabilities designed to provide students and researchers with a robust, open, affordable platform on which to explore the concepts of robot control, localization, kinematics and machine vision.
We are in the process of releasing our design files and code in order to benefit the wider robotics community. If you are interested, drop us a line — we’d love to hear from you!
APRIL Camera Calibration Suite
The APRIL camera calibration suite, AprilCal, is now available as part of the APRIL Robotics Toolkit. This interactive tool uses the current calibration state to suggest where to position the target in the next image.
AprilCal yields more reliable and accurate camera calibrations than alternatives such as OpenCV. A 2D grid of AprilTags serves as the calibration target, so the entire target does not need to be visible in every image. In addition to single-camera calibration, AprilCal also supports multi-camera calibration (without interactive suggestions).
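For context, calibration estimates the parameters of a camera model such as the pinhole-plus-distortion sketch below: focal lengths, principal point, and distortion coefficients. This is a generic textbook model written for illustration, not AprilCal's actual code, and the single radial term `k1` is a simplifying assumption.

```python
def project_pinhole(X, Y, Z, fx, fy, cx, cy, k1=0.0):
    """Project a 3D point (camera frame, Z forward) to pixel coordinates
    using a pinhole model with one radial distortion coefficient.

    fx, fy: focal lengths in pixels   cx, cy: principal point
    k1:     radial distortion coefficient (0.0 = ideal pinhole)
    """
    x, y = X / Z, Y / Z              # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2                # radial distortion factor
    u = fx * d * x + cx
    v = fy * d * y + cy
    return u, v

# A point on the optical axis projects to the principal point:
u0, v0 = project_pinhole(0.0, 0.0, 1.0, 500.0, 500.0, 320.0, 240.0)
```

A calibration tool searches for the parameter values that make projections like this best agree with the detected target corners across all images.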
Check out this demo video for more details.