I have been involved in a number of different projects at the Robotics Institute during my time here. I am currently working on Trestle and IDSR, as well as infrequently assisting the Roboceptionist team. Below are my current and past projects, in approximate reverse chronological order.
The Trestle project is investigating multi-agent coordinated assembly, with particular focus on inter-agent coordination and the incorporation of remote humans through sliding autonomy. Trestle has previously been known as Space Solar Power (SSP) and Distributed Robot Architectures (DIRA). See Human-Robot Teams for Large-Scale Assembly for an overview of the project. I have been involved with all aspects of Trestle since the fall of 2001.
IDSR is an initiative between Carnegie Mellon, the University of Maryland, and Stanford. Our first project is to assemble the EASE structure in UMD's neutral-buoyancy tank, using the architecture and software from Trestle to interface with the controller of UMD's multi-armed Ranger robot. UMD is providing the hardware and controllers, while we provide task sequencing and automation. See Overcoming Sensor Noise for Low-Tolerance Autonomous Assembly for a discussion of the challenges and results of our first year's experiments.
Microraptor is a process control system that Trey Smith and I developed. Through a network of daemons and clients, Microraptor allows multiple users to easily run and control an arbitrary number of interdependent processes. It is currently in use by the Trestle, Roboceptionist, Grace, LITA, and other projects. See the Microraptor site for further details, documentation, and downloadable packages.
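The heart of any such process-control system is starting interdependent processes in an order that respects their dependencies. A minimal sketch of that idea follows; the process names and dependency graph are hypothetical examples, not Microraptor's actual configuration or API.

```python
# Illustrative sketch of dependency-ordered process startup, the core problem
# a process-control system like Microraptor manages. The processes and
# dependencies below are made up for the example.
from graphlib import TopologicalSorter

# Map each process to the processes it depends on.
deps = {
    "ipc_server": [],
    "robot_driver": ["ipc_server"],
    "planner": ["ipc_server", "robot_driver"],
}

def startup_order(deps):
    """Return a start order in which every dependency precedes its dependents."""
    return list(TopologicalSorter(deps).static_order())

print(startup_order(deps))
# ['ipc_server', 'robot_driver', 'planner']
```

A real system layers more on top of this (monitoring, restarts, multi-user control), but the dependency ordering is what lets one command bring up or tear down a whole constellation of processes safely.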
The Roboceptionist project is a long-term installation in the entranceway to Newell-Simon Hall at Carnegie Mellon. A collaboration between the Robotics Institute and Carnegie Mellon's Drama Department, the Roboceptionist is an experiment in both human-robot interaction and storytelling with a unique robotic actor. I have been involved in the systems and interface side of the project.
I was a teaching assistant for the Fall 2003 section of Mobile Robot Programming Lab (16x62), taught by Illah Nourbakhsh. MRPL is a graduate and undergraduate course that gives the students a grounding in robotics by allowing them to work with Nomad Scout robots. The curriculum is centered around a series of labs in which the students begin with basic commands (wheel velocities, sonar readings, etc.) and build a system that is capable of coordinated planning and exploration with a pair of Scouts by the end of the semester.
Grace was our entry in the AAAI Grand Robot Challenge in 2002 and 2003, in cooperation with the Naval Research Lab, Swarthmore, Northwestern, and Metrica/TRACLabs. The goal of the challenge was to create a robot able to attend the AAAI conference by navigating to the registration desk without a map, standing in line, registering, interacting with other attendees, and giving a talk about itself.
The project was a major integration effort, with each group bringing its own skills and software to the table. The project homepage is here. I was involved in both 2002 and 2003. In 2002, we were able to complete the task, except for interaction with conference attendees. In 2003, we tightened the integration between the components, added a second robot (George), and began to address the free interaction problem. I worked on the integration, task sequencing, and systems aspects of the problem.
I adapted the Grace software to produce Lena, a portion of the Science Museum of Minnesota's traveling Robots and Us exhibit. Lena isn't a robot per se, but includes portions of the Grace and Roboceptionist software to provide an animated talking head that can carry on conversations about the exhibit. Lena went live in late 2003, and has been on tour ever since; check the Robots and Us schedule for where she has been, is now, and will be going next.
As part of Sebastian Thrun's Statistics in Robotics course, Mary Berna, Brad Lisien, and I performed preliminary research into localization via wireless access point signal strength, culminating in an IJCAI '03 poster. We used Monte-Carlo localization with a relatively sparse map of access point signal strengths to localize a moving target to a resolution of approximately 3 meters. We did this with no a priori knowledge of the locations of the access points and with much less map data than other contemporary approaches.
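The machinery involved can be illustrated with a toy particle filter. The sketch below assumes a 1-D corridor, a single access point, and made-up noise parameters; it shows the Monte-Carlo steps (motion update, signal-strength likelihood weighting, resampling), not the actual map or experimental setup from the poster.

```python
# Toy 1-D Monte-Carlo localization from signal strength (illustrative only).
import math
import random

random.seed(0)

def expected_rssi(x):
    # Hypothetical signal-strength map: one access point at position 10.0,
    # with strength falling off linearly with distance (in dB).
    return -40.0 - 2.0 * abs(x - 10.0)

def mcl(measurements, n=1000, motion=1.0, sigma=2.0):
    # Start with particles spread uniformly over the corridor.
    particles = [random.uniform(0.0, 20.0) for _ in range(n)]
    for z in measurements:
        # Motion update: the target is assumed to move +1.0 per step, plus noise.
        particles = [p + motion + random.gauss(0.0, 0.2) for p in particles]
        # Measurement update: weight each particle by the likelihood of the
        # observed signal strength under a Gaussian noise model.
        weights = [math.exp(-((z - expected_rssi(p)) ** 2) / (2 * sigma ** 2))
                   for p in particles]
        # Resample particles in proportion to their weights.
        particles = random.choices(particles, weights=weights, k=n)
    return sum(particles) / n  # position estimate (particle mean)

# Simulate a target walking from 5.0 toward the access point, observing
# noisy signal strength at each step.
true_path = [5.0 + t for t in range(1, 6)]
obs = [expected_rssi(x) + random.gauss(0.0, 1.0) for x in true_path]
print(mcl(obs))  # estimate near the final true position, 10.0
```

Note the symmetric map creates a mirror ambiguity (positions equidistant from the access point look alike); the motion model resolves it over a few steps, which is one reason tracking a moving target with a sparse map can still converge.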
During the summer of 2001, I worked on stereo vision code for a semifinalist entry in the DARPA PerceptOR project. MacArthur was the vehicle built for the project, although I spent most of my time testing vision code on an electric ATV, since the vehicle itself was based in Colorado.
DiME is a project I worked on during my junior and senior years (1999-2001) for Illah Nourbakhsh that investigated an early form of augmented reality. Using display goggles and a head-mounted camera, we experimented with modifying the user's visual stream, with the eventual goal of helping individuals with certain kinds of vision problems.