Current Projects

Ground Texture Localization and SLAM

Reliable simultaneous localization and mapping (SLAM) in novel environments is essential for safe and effective navigation of mobile robots, enabling applications such as autonomous delivery, hospitality, and inspection. However, localization using LiDAR or outward-facing cameras can struggle in dynamic environments due to occlusions from moving objects and lighting variations. To overcome these limitations, ground texture localization uses a downward-facing camera to achieve high-precision localization based on the ground’s visual appearance. For this project, we are researching novel techniques that exploit the inherent geometric and lighting constraints of ground texture SLAM to increase the accuracy, speed, and robustness of these algorithms. By incorporating these constraints, we aim to achieve more precise and reliable localization for autonomous ground robots operating in challenging, unstructured environments, enabling robust navigation even in visually degraded conditions.
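
As a rough illustration of the core idea (not our actual pipeline), the sketch below registers a downward-facing camera frame against a stored map patch using ORB features and a RANSAC-fit transform in OpenCV; the file names and parameters are placeholders. Because the camera looks straight down at a locally planar floor, the relative pose reduces to an in-plane translation and rotation (plus a near-constant scale set by the camera height), which is exactly the kind of geometric constraint mentioned above.

```python
# Minimal sketch: register a downward-facing camera frame against a stored
# ground-texture map patch to recover a 2D pose offset (x, y, theta).
# Illustrative only; file names and parameters are placeholders.
import cv2
import numpy as np

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)          # current camera image
map_patch = cv2.imread("map_patch.png", cv2.IMREAD_GRAYSCALE)  # patch from the texture map

orb = cv2.ORB_create(nfeatures=2000)
kp_f, des_f = orb.detectAndCompute(frame, None)
kp_m, des_m = orb.detectAndCompute(map_patch, None)

# Brute-force Hamming matching with cross-checking to reject weak matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_f, des_m), key=lambda m: m.distance)

src = np.float32([kp_f[m.queryIdx].pt for m in matches])
dst = np.float32([kp_m[m.trainIdx].pt for m in matches])

# With a planar floor, the motion between images is approximately a similarity
# transform: in-plane rotation + translation, with scale fixed by camera height.
M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC,
                                         ransacReprojThreshold=3.0)
theta = np.arctan2(M[1, 0], M[0, 0])      # heading offset in radians
tx, ty = M[0, 2], M[1, 2]                 # translation in pixels
print(f"dx={tx:.1f}px dy={ty:.1f}px dtheta={np.degrees(theta):.2f}deg, "
      f"{int(inliers.sum())} inliers")
```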

High-Level Robot Synthesis and Programming

This research area focuses on automating and optimizing robot design, an inherently challenging task that often involves cross-domain dependencies and complex trade-offs among performance, cost, efficiency, and robustness. Our approach formulates robot design as a constrained optimization problem. In combination with other techniques, we use constraint programming to efficiently navigate design spaces with more than 10^25 candidate solutions and identify globally optimal Pareto fronts within seconds. Our methodology has been demonstrated in complex applications such as optimally co-designing the component selection and task scheduling of a quadcopter fleet performing package delivery. Through this project, we aim to enable rapid design iteration, streamline decision-making, and deliver reliable, high-performance robotic solutions.
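
As a toy illustration of the formulation (with a made-up component catalog and constraints, not our actual models), the sketch below uses Google OR-Tools CP-SAT to select quadcopter components under thrust and endurance constraints while minimizing cost:

```python
# Toy sketch of component selection as constraint optimization (CP-SAT).
# Catalog numbers and constraints are hypothetical, chosen only to
# illustrate the formulation; the real design space is far larger.
from ortools.sat.python import cp_model

# Hypothetical catalog: battery = (cost $, capacity Wh, mass g),
#                       motor   = (cost $, max thrust g, hover W, mass g)
batteries = [(35, 40, 350), (55, 75, 600), (90, 120, 950)]
motors = [(18, 600, 55, 60), (30, 900, 80, 80), (45, 1400, 120, 110)]
FRAME_MASS = 500  # g, fixed frame + payload mass (hypothetical)

model = cp_model.CpModel()
b = [model.NewBoolVar(f"bat_{i}") for i in range(len(batteries))]
m = [model.NewBoolVar(f"mot_{j}") for j in range(len(motors))]
model.Add(sum(b) == 1)   # pick exactly one battery type
model.Add(sum(m) == 1)   # pick exactly one motor type (x4 on the vehicle)

mass = FRAME_MASS + sum(b[i] * batteries[i][2] for i in range(len(batteries))) \
                  + 4 * sum(m[j] * motors[j][3] for j in range(len(motors)))
thrust = 4 * sum(m[j] * motors[j][1] for j in range(len(motors)))
hover_w = 4 * sum(m[j] * motors[j][2] for j in range(len(motors)))
cap_wh = sum(b[i] * batteries[i][1] for i in range(len(batteries)))

model.Add(thrust >= 2 * mass)     # thrust-to-weight ratio of at least 2
model.Add(6 * cap_wh >= hover_w)  # >= 10 min hover: cap_wh * 60 >= 10 * hover_w
cost = sum(b[i] * batteries[i][0] for i in range(len(batteries))) \
     + 4 * sum(m[j] * motors[j][0] for j in range(len(motors)))
model.Minimize(cost)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print("battery:", [i for i in range(len(b)) if solver.Value(b[i])][0],
          "motor:", [j for j in range(len(m)) if solver.Value(m[j])][0],
          "cost: $", solver.ObjectiveValue())
```

A Pareto front over several objectives can then be traced, for example, by re-solving with bounds imposed on the remaining objectives.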

Bio Collaborations

We are helping observe bumblebee colonies with customized video traps that can monitor the comings and goings of individual bees. We are also developing models of comb construction in honey bees, which help us understand how colonies of tens of thousands of bees collaborate effectively to build their magnificent combs.

Building in Unstructured Environments

This project is about finding tractable models of unstructured environments that help robots reason about the long-term outcomes of different modification strategies. By relying on representations designed to capture uncertainty, we can build robotic systems that work with a variety of challenging building materials, such as deformable bags and polyurethane foam, and still have a high degree of certainty that they will succeed. The trick is that we trade predictability of a structure's exact final shape for a better chance of success. One or more robots continually re-scan and evaluate their environment and then make incremental modifications toward a common goal, e.g., building a large ramp or a level surface. We focus on mid-level abstractions that both let us formulate plans during execution and serve as a design tool. For example, we can answer questions like: Can a particular robot platform use material X to build structure Y? What combinations of materials and robots make sense if I want to build a specific structure within a certain tolerance?
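
A minimal sketch of this scan-evaluate-modify loop on a discretized 1D heightmap is shown below; the goal profile, deposition noise, and tolerance are hypothetical placeholders, chosen only to illustrate how an unpredictable material can still be driven toward a goal within a tolerance.

```python
# Sketch of the incremental scan -> evaluate -> modify loop on a 1D heightmap.
# The goal profile (a ramp), deposition noise, and tolerance are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 60
goal = np.linspace(0.0, 1.0, n)   # target: a ramp the robot should build up to
height = np.zeros(n)              # current terrain (flat)
TOL = 0.08                        # acceptable shortfall per cell

def scan(h):
    """Simulated re-scan: the robot only sees a noisy estimate of the terrain."""
    return h + rng.normal(0.0, 0.01, size=h.shape)

def deposit(h, i):
    """Drop one unit of amorphous material near cell i; its shape is uncertain."""
    amount = rng.uniform(0.10, 0.20)                 # unpredictable blob volume
    spread = np.exp(-0.5 * ((np.arange(n) - i) / 2.0) ** 2)
    return h + amount * spread

for step in range(500):
    estimate = scan(height)
    deficit = goal - estimate
    if deficit.max() <= TOL:                         # goal reached within tolerance
        print(f"done after {step} depositions")
        break
    target_cell = int(np.argmax(deficit))            # biggest remaining shortfall
    height = deposit(height, target_cell)
```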

Dry Stacking with Robots

We are developing rock stacking algorithms that allow today's robots to rearrange rocks into stable structures. Building with found stones and without mortar is called dry stacking and is among the oldest known human building techniques. However, it is currently quite challenging to replicate with robots. That is unfortunate, since autonomous dry stacking would open the door to a variety of useful applications. Robots could build structures in remote environments without needing a supply of consumable construction materials. Because no construction materials have to be produced, there is no associated environmental impact, which makes dry stacking ideal for building temporary or large-scale structures, e.g., protective structures in disaster areas or erosion barriers in remote locations.
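
As a simplified example of the kind of feasibility test such planners rely on (not our full algorithm), the sketch below accepts a candidate placement only if the center of mass projects inside the support polygon of the contact points with a safety margin; the contact points and center of mass are assumed to come from elsewhere, e.g., scanned rock geometry or a physics engine.

```python
# Simplified stability test for a candidate rock placement: the combined
# center of mass resting on a contact patch must project inside the support
# polygon (with a safety margin). Geometry is assumed given; this is a sketch.
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(map(tuple, points)))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def is_stable(contact_points_xy, com_xy, margin=0.01):
    """True if com_xy lies at least `margin` inside the support polygon."""
    hull = convex_hull(contact_points_xy)
    if len(hull) < 3:                     # point or line contact: reject
        return False
    for i in range(len(hull)):
        ax, ay = hull[i]
        bx, by = hull[(i + 1) % len(hull)]
        ex, ey = bx - ax, by - ay
        # signed distance of the COM from this edge (positive = inside for CCW)
        d = (ex * (com_xy[1] - ay) - ey * (com_xy[0] - ax)) / math.hypot(ex, ey)
        if d < margin:
            return False
    return True

# Hypothetical candidate: four contact points under a rock, COM slightly off-center.
print(is_stable([(0, 0), (0.3, 0.02), (0.28, 0.25), (0.02, 0.27)], (0.15, 0.12)))
```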

Past Projects

Amorphous-Construction

This project aims to create algorithms and hardware that can reliably build structures in unstructured terrain. Some animals, e.g., mound-building termites, are remarkably good at this type of construction. Their skill depends on the tightly coupled interaction of construction strategy and construction material. Termites and other animals often take advantage of goopy, amorphous materials to build in irregularly shaped environments. From a robotics perspective, this approach is appealing because mechanical feedback during construction not only makes the process robust but also relaxes the control and sensing requirements of the construction mechanism. The robots I am working on use polyurethane foam as an amorphous construction material. Depositions are modeled as operator applications to continuous functions, and robust strategies are designed to always reach desirable invariant sets after a finite number of depositions.
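
The operator view can be made concrete with a toy discretized example (a sketch under simplifying assumptions, not the project's actual model): each deposition applies a monotone operator that only adds material, so once the height function dominates the goal profile it stays there, i.e., {h >= goal} is an invariant set reached after finitely many depositions.

```python
# Toy illustration of depositions as operators on a (discretized) height
# function. Each deposition only adds material, so the operator is monotone
# and the set {h >= goal} is invariant once reached. The blob shape and the
# goal profile are made up for illustration.
import numpy as np

n = 80
x = np.linspace(0.0, 1.0, n)
goal = 0.5 * np.exp(-((x - 0.5) / 0.2) ** 2)   # desired mound profile
h = np.zeros(n)                                 # initial (flat) terrain

def deposit(h, center):
    """Operator D_center: add a blob of cured foam on top of the surface."""
    bump = 0.12 * np.exp(-0.5 * ((x - center) / 0.04) ** 2)
    return h + bump                             # never removes material

# Greedy strategy: always deposit where the shortfall below the goal is largest.
steps = 0
while np.any(h < goal):
    h = deposit(h, x[np.argmax(goal - h)])
    steps += 1
print(f"reached the invariant set {{h >= goal}} after {steps} depositions")
```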

Factory Floor

The Factory Floor Testbed is an experiment to explore scalable, robust, multi-robot construction hardware and algorithms. It consists of modular robots that can build arbitrary lattice structures from two types of raw materials. The hardware was built by the ModLab at the University of Pennsylvania. My role in the project was the design of models and robust construction algorithms for the testbed.

Robotic Chemistry

The Robotic Chemistry Testbed is a physical instantiation of a tunable Stochastic Chemical Reaction Network. It consists of three types of simple robots that are randomly stirred on an air table. They are machined from polyurethane foam with embedded magnets. One type of robot has an analog circuit connected to solar cells that allows it to collect energy and break apart the dimers formed by the other two robot types. In this project, I helped with the initial design and advised a team of undergraduate students on the detailed design and construction.
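
The dynamics of such a testbed can be approximated with a standard Gillespie simulation of the underlying reaction network; the sketch below uses a plausible but hypothetical reaction set and rate constants (binding of the two passive robot types into dimers, and energy-driven splitting), not measured values from the hardware.

```python
# Gillespie-style simulation of a toy reaction network resembling the testbed:
#   A + B -> AB        (two passive robots bind into a dimer)
#   AB    -> A + B     (an energy robot breaks a dimer apart)
# Rate constants and initial counts are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
nA, nB, nAB = 20, 20, 0
k_bind, k_split = 0.005, 0.05
t, t_end = 0.0, 200.0

while t < t_end:
    a1 = k_bind * nA * nB     # propensity of binding
    a2 = k_split * nAB        # propensity of splitting
    a0 = a1 + a2
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)       # time until the next reaction
    if rng.uniform() < a1 / a0:          # pick which reaction fires
        nA, nB, nAB = nA - 1, nB - 1, nAB + 1
    else:
        nA, nB, nAB = nA + 1, nB + 1, nAB - 1

print(f"t={t:.1f}: A={nA} B={nB} AB={nAB}")
```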

Programmable Parts

The Programmable Parts Testbed is a stochastic self-assembly experiment. Triangular robots are randomly stirred on an air table so that their motion cannot be directly controlled by the experimenter. However, the robots' binding behavior with respect to other robots can be controlled by specifying local rules as a graph grammar. We showed how to create grammars that build arbitrary shapes by using the testbed as an instantiation of a Stochastic Chemical Reaction Network. I did most of the electrical and mechanical design work for these robots and wrote the embedded control software for interpreting graph grammars.
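
One (hypothetical) way to encode such local rules in software is sketched below: each rule rewrites the labels of two parts that happen to collide and decides whether they latch together. The labels and rules shown are illustrative, not the grammar or firmware actually used on the testbed.

```python
# Sketch of local graph-grammar rules for stochastic self-assembly: when two
# parts collide, look up a rule keyed on their current labels; if one applies,
# rewrite both labels and latch (or release). Labels and rules are hypothetical.

# rule: (label_a, label_b) -> (new_a, new_b, bind?)
RULES = {
    ("seed", "free"): ("seed", "arm", True),   # seed recruits a free part
    ("arm", "free"):  ("arm", "tip", True),    # arm extends by one part
}

def on_collision(part_a, part_b):
    """Called when two parts touch; applies the first matching rule, if any."""
    for (la, lb), (na, nb, bind) in RULES.items():
        if (part_a["label"], part_b["label"]) == (la, lb):
            part_a["label"], part_b["label"] = na, nb
            return bind                        # latch the magnets together
        if (part_b["label"], part_a["label"]) == (la, lb):
            part_b["label"], part_a["label"] = na, nb
            return bind
    return False                               # no rule matched: drift apart

# Example: a random encounter between the seed and a free-floating part.
p1, p2 = {"label": "seed"}, {"label": "free"}
print(on_collision(p1, p2), p1, p2)
```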