CJ's home at Brooklyn College

Carlos Jaramillo's
Distributed Research Experience for Undergraduates (Summer 2009)

WEEKLY JOURNAL


Week 2:

Tasks and comments:
Meeting with Mentors and Team
Redefined our short-term goals: 1) feed Mezzanine's location data into Player/Stage; 2) write a Scribbler driver for Player/Stage.
Mezzanine
Mezzanine is a vision-tracking server that tracks the position coordinates of robots (color blobs), or any other fiducial objects with color-coded identities, within the field of view of an overhead camera. I installed the only Mezzanine version publicly available, which uses Video4Linux v.1. However, John Cummins (a former Brooklyn College student) has made Mezzanine work with V4L v.2, which improves frame-grabbing on newer systems that ship with V4L v.2. Mezzanine is, at this point, a moving target: John has revived it after it sat orphaned since its release in 2002.
Learning about the YUV color space that Mezzanine (blob finder/tracker) uses
It is imperative to understand YUV, the color space Mezzanine currently uses to track the color blobs. YUV was mainly used in the early days of color TV, where the "black and white" (luminance) signal was called Y and the two color (chrominance) components U and V. However, there is some delay in the camera's capture at 16-bit RGB (Highcolor) and its conversion into the YUV that Mezzanine operates on. There are plans to get rid of this overhead, since RGB16 can already capture the full 10 bits of the video signal provided.
Calibrating Mezzanine with "Mezzcal"
The original calibration system and interface, Mezzcal, needs to be improved, and John Cummins is trying new implementations of it. The current open issues are: 1) a better way to "tweak" the color identification; this is currently a manually intensive process that takes a long time because look-up tables (LUTs) must be created to categorize the calibrated colors; 2) better pattern identification that accounts for the oddities of each camera (the same colors are not perceived identically by different cameras). Perhaps the Fast Artificial Neural Network Library (FANN) can help with this color-calibration process.
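To make the LUT idea concrete, here is a minimal sketch of the kind of color look-up table a calibration tool like Mezzcal builds. The function names and the quantization step are illustrative, not Mezzanine's actual code:

```python
LUT_SHIFT = 4  # quantize each 8-bit channel into 16 bins -> 16^3 = 4096 cells

def make_lut(samples):
    """Build a LUT from (y, u, v, class_id) calibration samples.

    Each calibrated pixel votes its quantized (Y, U, V) cell into a
    color class; classification later is a single dictionary lookup.
    U and V are treated as signed and offset by 128 before quantizing.
    """
    lut = {}
    for y, u, v, class_id in samples:
        key = (y >> LUT_SHIFT, (u + 128) >> LUT_SHIFT, (v + 128) >> LUT_SHIFT)
        lut[key] = class_id
    return lut

def classify(lut, y, u, v):
    """Return the color class for a pixel, or None if uncalibrated."""
    key = (y >> LUT_SHIFT, (u + 128) >> LUT_SHIFT, (v + 128) >> LUT_SHIFT)
    return lut.get(key)
```

The tedium mentioned above comes from having to supply enough samples to fill all the cells a blob's color can fall into under varying lighting; a learned classifier (e.g. via FANN) would generalize from far fewer samples.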
The connection between Mezzanine and Player/Stage
This integration is still under investigation. The idea is to implement an abstraction that takes the data provided by the blob-finding/tracking system (Mezzanine) and uses it within Player to control the positioning of the robots, in either real or simulated worlds. To achieve this, we have to connect to "libmezz", an IPC (inter-process communication) library for communicating with Mezzanine.
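One way to picture the abstraction layer is as a repackaging step: Mezzanine reports a pose per tracked object, and Player consumes it as position data. The class and field names below are hypothetical stand-ins; libmezz's real C structs and the Player position2d interface differ in detail:

```python
from dataclasses import dataclass
import math

# Hypothetical shape of the data the abstraction layer would carry from
# Mezzanine to Player; libmezz's actual C structs differ.

@dataclass
class TrackedObject:
    object_id: int
    x: float        # world x in meters (after Mezzanine's dewarping)
    y: float        # world y in meters
    heading: float  # orientation in radians

def to_position2d(obj):
    """Repackage a tracked object as a Player position2d-style record."""
    return {"px": obj.x, "py": obj.y,
            "pa": math.degrees(obj.heading) % 360.0}
```

In the real system this translation would happen inside a Player driver that polls libmezz over IPC instead of receiving Python objects.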
Writing a "Scribbler" driver for Player
By looking at how other drivers have been written for Player, such as the Roomba/Create driver, we could model a Scribbler driver to be used abstractly via Player. The Myro library already provides a driver implementation for the Scribbler written in Python, and John and Prof. Sklar have already translated some of these commands into C/C++, which makes this a feasible task.
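At its core, a position driver like this one maps Player's forward/angular velocity command onto the two wheel speeds of a differential-drive robot such as the Scribbler. A sketch of that mapping, with a placeholder wheel-separation value (not the Scribbler's actual spec):

```python
WHEEL_SEPARATION = 0.08  # meters; placeholder, not the Scribbler's real spec

def velocity_to_wheels(vx, va):
    """Map a Player-style velocity command (forward speed in m/s,
    angular speed in rad/s) to left/right wheel speeds for a
    differential-drive robot."""
    left = vx - va * WHEEL_SEPARATION / 2.0
    right = vx + va * WHEEL_SEPARATION / 2.0
    return left, right
```

The actual driver would then scale these speeds into whatever units the Scribbler's serial protocol (as exposed by Myro) expects.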
Using a few software development tools
Git (a source version-control system) and Doxygen (an automated documentation generator). We might need these tools for team collaboration on our Mezzanine project, which we will be hosting on SourceForge.