
Final Report

Summer 2016

For the summer of 2016, a team of several high school students and I were tasked with autonomously aligning and assembling the Texas A&M University logo out of four cardboard boxes using physical robots. But alas, the project proved too difficult for us. The farthest we got was autonomously pushing a box in two dimensions to a preset point in the room. Unfortunately, the straw that broke the camel's back was that while the programmer would get an accurate reading of where the robot was in the room relative to the origin, the robot itself didn't know where it was due to a lack of support in the Player API. The API provides a SetOdometry() function for setting the robot's x and y position, but that function just wasn't compatible with the Create robots we were using.
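Since SetOdometry() never worked on our Creates, one workaround would have been to track the pose on the client side by integrating the velocity commands ourselves. Below is a minimal dead-reckoning sketch of that idea; the class and all of its names are illustrative, not part of the Player API or our actual code.

```python
import math

class PoseTracker:
    """Client-side dead reckoning for a differential-drive robot.

    Because the Create's internal pose could not be reset through
    SetOdometry(), one option is to integrate the commanded linear
    and angular velocities ourselves.  Illustrative sketch only.
    """

    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta

    def update(self, v, w, dt):
        """Advance the pose by linear speed v (m/s) and angular speed w (rad/s)
        over a timestep dt (s), and return the new (x, y, theta)."""
        self.theta = (self.theta + w * dt) % (2 * math.pi)
        self.x += v * math.cos(self.theta) * dt
        self.y += v * math.sin(self.theta) * dt
        return self.x, self.y, self.theta
```

Drift accumulates quickly with this approach, which is exactly why the Aruco markers mattered: they give an external fix to correct the integrated estimate.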


The IR Team

The IR team was in charge of finding the HomeBase with the help of the Environment team. The Aruco markers would tell the robot its location, and the robot would then steer its way to the HomeBase, but docking to the HomeBase is much more complicated than it sounds. If the robot ended up outside the range of the HomeBase's IR beacon, it would have no idea where it was headed; once within range, however, it would know its location relative to the HomeBase and navigate toward it. Even after reaching the HomeBase, the robot only actually docked and started charging about 80% of the time, an achievement that is still quite impressive given the turnaround we had.
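The search-then-approach behavior described above boils down to a small state machine. Here is a hedged sketch of that logic; the state names and the ir_in_range/charging signals are hypothetical, not the team's actual code.

```python
from enum import Enum, auto

class DockState(Enum):
    SEARCH = auto()    # IR beacon not detected: head toward the marker-given HomeBase position
    APPROACH = auto()  # IR beacon detected: follow the beam in
    DOCKED = auto()    # charging contacts report power

def next_state(state, ir_in_range, charging):
    """One step of the (hypothetical) docking logic: charging beats
    everything, an IR fix beats blind search, otherwise keep searching."""
    if charging:
        return DockState.DOCKED
    if ir_in_range:
        return DockState.APPROACH
    return DockState.SEARCH
```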

The Aruco Team

The Aruco team was in charge of coding in the ability to "see" the world by identifying Aruco markers placed around our particular environment. To do this, they set up a few laptops to run their code and detect the markers. Even though it was a tedious task, the team was able to successfully test a few markers out of the over 100 installed in our environment, solidifying our progress toward our ultimate goal of having the robot navigate itself with no human aid. At the moment, the only problem they are facing is that the robot's movement isn't very accurate: the robot reports that it has gone to a specific place, but that isn't what is actually happening in the real world.
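The idea behind marker-based localization is simple geometry: if a marker's global position is known (from the Environment team's survey) and the camera can measure the marker's range and bearing, the robot's own position falls out by inverting the observation. A minimal sketch with illustrative names follows; the team's real code worked from camera images, not a precomputed range and bearing.

```python
import math

def robot_position_from_marker(marker_x, marker_y, robot_theta, rng, bearing):
    """Recover the robot's global (x, y) from one marker observation.

    marker_x, marker_y : the marker's surveyed global position
    robot_theta        : the robot's current heading estimate (rad)
    rng, bearing       : the marker's measured range and bearing
                         relative to the robot's heading
    Illustrative geometry only, not the team's actual code.
    """
    gx = robot_theta + bearing  # global direction from robot to marker
    return (marker_x - rng * math.cos(gx),
            marker_y - rng * math.sin(gx))
```

Note that this still needs a heading estimate; in practice the marker's apparent orientation in the image can supply that too.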

The Environment Team

The Environment team was placed in charge of printing all of the Aruco markers, placing them in as many locations as possible without interfering with the other teams, and figuring out each marker's exact position relative to the origin. Not only that, but once the distance from the origin was found, the next challenge was to measure and record the angle of the marker from the origin. Even though it was less mentally demanding, it was certainly the most tedious part of the project that could be assigned to someone.
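Once a marker's distance and angle from the origin were measured, turning them into map coordinates is a one-line polar-to-Cartesian conversion. A small sketch, assuming the angles were recorded in degrees (the team's actual convention may have differed):

```python
import math

def marker_xy(distance, angle_deg):
    """Convert a tape-measured distance and angle from the room origin
    into (x, y) map coordinates.  Names are illustrative."""
    a = math.radians(angle_deg)
    return (distance * math.cos(a), distance * math.sin(a))
```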


In the picture above, the team was testing the Create's ability to see the Aruco markers that will eventually tell the robot its own location relative to the origin of the room. As seen in the photo, the Create itself doesn't have the ability to "see" the real world; that ability has to be dedicated to the netbook sitting atop it and its webcam. Each of the markers placed around the room by the Environment team has a unique identification code that separates it from the others, and each was measured to the greatest precision a measuring tape could give.