DMP 2005 Web Site for Alexandra Constantin

Weekly Journal

Week 1

This week was devoted mostly to logistics (getting a USC card, a computer account, a desk, and a computer in the lab), reading, and defining a project. I read papers from the general area of human-robot interaction and became familiar with the projects that Maja and her students had developed in the Interaction Lab. I decided on the narrower topic of learning by imitation, specifically a project in which a robot, or simply a computer, would try to teach arm movements to humans. The hard part of this project is evaluating the learner's performance.

Week 2

This week I read more specific papers on learning by imitation and on comparing limb trajectories, and set out a plan for what to start implementing. The most important part of the project is developing a good metric for comparing the movements of the teacher and the learner. My approach to developing this metric is based on segmenting the motion into basic movements. If the segmentation is satisfactory, then time scaling can be performed on each segment, and the learning error can be isolated to the specific segments and joints where the imitation failed.

Besides laying out a first plan, this week I also worked on the DMP web site, set up the lab desktop I was assigned, read about the USC motion capture suit (which I will be using to capture data for the project), and looked for sources on spline interpolation, which is needed for time scaling.

Week 3

This week I learned how to use the USC motion capture suit. I managed to successfully install the latest version of the motion capture suit applications, hook the suit up to my computer (using a Pioneer robot battery), and capture the first set of data. I became accustomed to the BVH file format and managed to get the motion capture suit to write the data it captured in BVH format. I also finalized my readings on spline interpolation.
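For reference, a BVH file has two sections: a HIERARCHY section defining the skeleton (joint names, offsets from the parent, and rotation channels) and a MOTION section holding the frame count, the frame time, and one line of channel values per frame. The sketch below (simplified Python, not the reader used in the project) shows roughly how the MOTION data can be pulled out:

    # Minimal sketch of reading the MOTION section of a BVH file; it skips the
    # HIERARCHY section and returns one list of channel values per captured frame.
    def read_bvh_motion(path):
        with open(path) as f:
            lines = [line.strip() for line in f if line.strip()]

        start = lines.index("MOTION")                     # skeleton definition ends here
        num_frames = int(lines[start + 1].split()[1])     # "Frames: N"
        frame_time = float(lines[start + 2].split()[2])   # "Frame Time: 0.0333"

        frames = []
        for line in lines[start + 3 : start + 3 + num_frames]:
            frames.append([float(v) for v in line.split()])  # one row of channel values
        return frames, frame_time

Each returned frame holds the root translation followed by the Euler rotation angles (in degrees) for every joint, in hierarchy order.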

This week, I also started implementing my project. The most tedious part was implementing a BVH file reader. I also implemented a very basic application for displaying the motion from BVH files (I was lucky to have taken a graphics course in the spring, so I already knew OpenGL, and the application is very handy for debugging). I also implemented a mean square distance metric between two trajectories, using joint angles instead of Cartesian coordinates, because joint angles are independent of limb lengths. Finally, I implemented time scaling, in order to develop a metric that is independent of the absolute speed of movement and accounts for all the data in both compared trajectories.
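In outline, the metric can be sketched as follows (simplified Python, with linear interpolation standing in for the spline interpolation used in the project): both trajectories are time-scaled to a common number of frames, and the mean squared joint-angle difference is computed over all frames and joints.

    import math

    def resample(traj, num_frames):
        """Time-scale a trajectory (a list of per-frame joint-angle lists) to
        num_frames frames; linear interpolation here, spline in the project."""
        n = len(traj)
        out = []
        for i in range(num_frames):
            t = i * (n - 1) / (num_frames - 1) if num_frames > 1 else 0.0
            lo = int(math.floor(t))
            hi = min(lo + 1, n - 1)
            a = t - lo
            out.append([(1 - a) * x + a * y for x, y in zip(traj[lo], traj[hi])])
        return out

    def mean_square_distance(teacher, learner):
        """Mean squared joint-angle difference after scaling both trajectories
        to the length of the longer one."""
        n = max(len(teacher), len(learner))
        t, l = resample(teacher, n), resample(learner, n)
        total = sum((x - y) ** 2
                    for ft, fl in zip(t, l)
                    for x, y in zip(ft, fl))
        return total / (n * len(teacher[0]))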

Week 4

This week I worked on implementing movement segmentation. I came up with some segmentation ideas that are particularly well suited to my project and make it easy to provide good feedback. First of all, using rotational angles instead of centroid or joint trajectories seems to be a more promising approach, because of the restrictions that can be assumed when dealing with angles. Besides that, treating the arm as a hierarchical structure, and thus evaluating rotational angle offsets from the parent bone, provides a better way of evaluating individual bone movement without the error of the parent bone carrying over to the child bone.
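A toy example of the second point, with made-up numbers: in a two-joint planar arm, composing angles down the chain lets a badly imitated shoulder show up in the elbow's world-frame angle as well, whereas the elbow's offset from its parent stays clean.

    # Hypothetical two-joint planar arm, angles in degrees (made-up values).
    teacher = {"shoulder": 40.0, "elbow_offset": 30.0}
    learner = {"shoulder": 55.0, "elbow_offset": 30.0}   # only the shoulder is imitated badly

    # The world-frame elbow angle composes every rotation above it in the hierarchy.
    teacher_elbow_world = teacher["shoulder"] + teacher["elbow_offset"]   # 70
    learner_elbow_world = learner["shoulder"] + learner["elbow_offset"]   # 85

    print(abs(teacher_elbow_world - learner_elbow_world))                 # 15.0 -> blames the elbow too
    print(abs(teacher["elbow_offset"] - learner["elbow_offset"]))         # 0.0  -> the elbow itself was fine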

I started testing my segmentation algorithm on BVH files available online, as well as files captured with the USC motion suit. Since I've recently become accustomed to the motion suit, I also started writing a manual page describing how to get started using the suit and the basic ways of capturing data.

While testing my algorithm, I discovered there were some apparent problems in the way the motion suit was capturing data. Sometimes the rotational angles were off; at other times, rotating in opposite directions generated the same output; and sometimes the exact same movement of the sensors on the table generated completely different values.

Week 5

This week I realized that the problem with the motion capture suit was an initialization problem. Due to limited memory, each sensor has to be in a specific position when the suit is turned on; otherwise the readings will not be accurate. With this condition met, the suit seems to be capturing data reliably.

I also finished the motion suit manual and wrote a USC project page, entitled Evaluating Arm Imitation.

This week I also wrote the DMP Progress Report, due midway through my internship.

I spent the rest of my time capturing motion data and making small modifications to my program.

Week 6

I spent this week testing my evaluation metric and modifying some details in order to improve its performance.

First of all, I thought about how best to choose the values of the parameters my segmentation algorithm needs, such as the minimum number of frames in a segment, the minimum number of lower-valued frames that must surround a frame on each side for its value to count as a local maximum, etc. Instead of estimating values for these parameters, I decided to systematically perform segmentations with every combination of these values, ranging from 1 up to a maximum value for each parameter. This change greatly improved the performance of the evaluation metric.
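As a rough illustration of one plausible reading of these parameters (simplified Python with hypothetical names, not the project's code): a single joint-angle channel is cut at its local maxima, and the two parameters are swept over every combination of values from 1 up to a small maximum, so that the evaluation metric can be applied to each candidate segmentation rather than to a single hand-tuned one.

    def local_maxima(values, min_neighbors):
        """Frames whose value is strictly greater than the min_neighbors values
        on each side; frames too close to either end never qualify."""
        maxima = []
        for i in range(min_neighbors, len(values) - min_neighbors):
            window = values[i - min_neighbors:i] + values[i + 1:i + 1 + min_neighbors]
            if all(values[i] > v for v in window):
                maxima.append(i)
        return maxima

    def segment(values, min_neighbors, min_frames):
        """Cut a joint-angle channel at its local maxima, skipping cut points
        that would create a segment shorter than min_frames."""
        cuts = [0]
        for i in local_maxima(values, min_neighbors):
            if i - cuts[-1] >= min_frames:
                cuts.append(i)
        cuts.append(len(values))
        return [(cuts[k], cuts[k + 1]) for k in range(len(cuts) - 1)]

    def all_segmentations(values, max_neighbors=5, max_frames=10):
        """Segment with every combination of the two parameters, from 1 up to
        a maximum value for each."""
        return {(mn, mf): segment(values, mn, mf)
                for mn in range(1, max_neighbors + 1)
                for mf in range(1, max_frames + 1)}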

The second problem I noticed is that the motion capture suit provides rotational angles obtained with the arctangent function, so the readings jump abruptly from values such as 89 to -89. This can make the difference between two trajectories seem much greater than it really is: angles of 75 and -89, for example, may in reality differ by only 16 degrees. I came up with a way of transforming the readings into the true rotational angles with unbounded values (such as 370 for a full 360-degree rotation followed by another 10 degrees in the same direction). My method is based on the values and monotonicity of the values in the surrounding frames. It works well in most cases, but it can fail if the movement is too fast, because there is then no way of knowing whether a reading of -70 following a value of 25 was reached by rotating forward past 90 to a true angle of 110, or backward to -70.
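A simplified sketch of this unwrapping step (Python; it only compares adjacent frames, whereas the method described above also uses the values and monotonicity of a wider neighborhood): whenever consecutive readings differ by more than half the 180-degree period of the arctangent, the smaller equivalent step is assumed and a multiple of 180 is accumulated.

    def unwrap(readings, period=180.0):
        """Turn wrapped arctangent readings (which jump from near +90 to near -90)
        into continuous rotation angles by accumulating multiples of the period
        whenever consecutive frames differ by more than half of it. As noted
        above, this fails when the true change between two frames exceeds 90."""
        out = [readings[0]]
        offset = 0.0
        for prev, cur in zip(readings, readings[1:]):
            step = cur - prev
            if step > period / 2:
                offset -= period          # wrapped while rotating backward
            elif step < -period / 2:
                offset += period          # wrapped while rotating forward
            out.append(cur + offset)
        return out

    # A steady forward rotation whose raw readings wrap at +/-90:
    print(unwrap([70, 85, -80, -65]))     # -> [70, 85.0, 100.0, 115.0]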

After these modifications the algorithm recognized most imitations as such, only encountering problems in the case of fast movement, as described above.

Week 7

This week I looked for a graphical interface for demonstrating movement. The problem is that most applications can only be controlled by selecting options with the mouse, and their source code is not available, so my program cannot control what they play. I tried using the LifeForms CD available in the Interaction Lab, but it posed the same problem. After consulting with Professor Mataric and other students from the lab, I decided that the best thing to do is implement an animation of my own. Even though this animation will not have a complicated, human-like appearance, the movement will be clear, so its most important purpose will be met.

Later this week I started implementing the animation, thinking about different ways of representing a skeleton and testing which one looks clearest.

Week 8

This week I completed the animation part of my project. I tested it by playing various BVH files found online and comparing the movement with that shown by other BVH viewers. The application worked well on those files, but the data captured with the sensors did not look very similar to my original movement.

I started looking for errors in the motion capture application. First of all, the offsets between joints were not captured; they were simply set to 30 units apart along the z-axis. But since the coordinate system used for OpenGL is left-handed, this placed all joints at the same height, which could not have happened in reality. Besides this, I noticed that the axes of rotation were switched. None of these errors had posed problems before, because two similar trajectories still appeared similar regardless of the errors in the motion capture.

To run experiments easily and ask for movements to be repeated, I now have to combine my program and the motion suit application into a single application.

Week 9

After I fixed several errors in the motion capture application, the animated data captured with the suit still did not look like the original movement. In addition, there is no way of calculating the real offsets between joints. In light of these newly discovered problems with the suit, which I believe had previously been used only to distinguish up-and-down movement from no movement, Professor Mataric and I decided that in the time I have remaining I should run an experiment testing only my evaluation metric. I spent most of this week designing the experiment and capturing and videotaping the movements to be used during the imitation process.

Week 10

This week I started my experiment. I gathered data from fellow lab members and analyzed my findings, which are described in the final report.

 

2005 CRA-W Distributed Mentoring Program - Alexandra Constantin's Web Site