Research Proposal

   As of now, my research proposal is still somewhat vague. Devon and I are joining the robotics project group at UMN during its fine-tuning stage. The group, MinDART (Minnesota Distributed Autonomous Robotics Team), is wrapping things up over the next month in order to give a presentation in July. The project has been in the works for quite some time, so at first Devon and I were worried that we might not be able to make a significant contribution. After talking with some of the group members, though, we found that we may be able to help them implement a more sophisticated visual localization and homing system, and possibly add implicit communication to their existing robots.

   Currently, the robots perform search-and-retrieval tasks: they search for specific target objects, pick them up, and return them to a home base. To find the targets, the robots use infrared detectors. To find their way back to the home base, they use a light-sensor localization technique, comparing the light readings they gather from three light towers placed around the field. The team has decided that it may be better to use differently colored towers and CMUcams for the localization task. With a 360-degree view from the cameras, the robots could not only localize themselves but also distinguish other robots from random objects by comparing colors. There has also been discussion of using the cameras for implicit communication. For example, if a robot that had found a target lit a light of a specific color, another robot could see that light and "know" that, by following the first robot, it could find targets as well.
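   To make the camera-based localization idea a bit more concrete, here is a minimal sketch (in Python with NumPy, not the MinDART team's actual code) of how a robot might estimate its position from the bearings it measures to three color-coded towers. The tower coordinates, the color names, and the assumption that the robot can convert a camera angle into a global bearing (say, by adding a compass heading) are all made up for illustration.

    # Sketch: bearing-only localization from three color-coded towers.
    # Assumes the panoramic camera + compass gives a global bearing (radians)
    # from the robot toward each tower; all coordinates are hypothetical.
    import numpy as np

    # Hypothetical field layout: (x, y) positions of the three colored towers.
    TOWERS = {
        "red":   np.array([0.0, 0.0]),
        "green": np.array([5.0, 0.0]),
        "blue":  np.array([2.5, 4.0]),
    }

    def localize(bearings):
        """Estimate the robot's (x, y) from global bearings to each tower.

        Each bearing constrains the robot to lie on a line through that
        tower; the estimate is the least-squares intersection of the lines.
        """
        rows, rhs = [], []
        for color, theta in bearings.items():
            tower = TOWERS[color]
            # Unit normal to the sight line from robot to tower.
            n = np.array([-np.sin(theta), np.cos(theta)])
            rows.append(n)          # n . p = n . tower for robot position p
            rhs.append(n @ tower)
        A = np.vstack(rows)
        b = np.array(rhs)
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position

    # Example: bearings a robot sitting near (2, 1) would measure.
    print(localize({"red":   np.arctan2(-1.0, -2.0),
                    "green": np.arctan2(-1.0, 3.0),
                    "blue":  np.arctan2(3.0, 0.5)}))

   With exact bearings, the three sight lines meet at a single point; with noisy camera readings, the least-squares solution simply picks the point closest to all three lines, which is why this sort of formulation tends to be forgiving of imperfect color detection.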

Questions? E-mail me!