Weekly Project Journal (5/18 – 8/1)
The flight from Salt Lake City to Pittsburgh was going fine, but like any other trip it had its misfortunes. I arrived in Pittsburgh, but without my luggage. Apparently Delta Airlines wasn't quick enough to transfer my luggage onto the connecting flight in time. I wouldn't have minded the missing luggage so much except that the day I arrived, and the next few days, were extremely cold and I only had a light shirt on. My warm sweater was, of course, in the missing luggage.
Well, I finally got my luggage and began to get used to living in Pittsburgh. The first few days weren't too bad: a group of grad students from the graphics lab invited me to dinner one night, showed me around town a bit, and introduced me to my workplace. There wasn't much to do at work at first; I had to take care of all the technical items such as setting up accounts, getting familiar with the lab and my project, and being introduced to the grad students. So overall my first week turned out just fine.
The first order of business was to understand the nature of my project. I have described my project on the Project page.
My first assignment was to understand what a plugin is and how to write one for Maya. Maya is 3D animation software; it is very powerful and is used for modeling, animation, and creating visual effects. I am somewhat familiar with Maya from last summer's research, but I didn't know anything about writing plugins.
After looking through the help files provided with Maya and studying the Maya manuals, it wasn't very difficult to understand how to write a simple plugin and load it into Maya. There were a few examples of writing plugins, and I found the documentation to be very helpful.
So now that I had figured out how a plugin is created and loaded into Maya, the big question was: how do I write a plugin that will communicate between the motion capture system and Maya? It didn't sound easy, and believe me, it wasn't easy.
There were too many details and features that I needed to learn before starting to write anything. At this point my mentor told me to search and see if a plugin for Maya already existed. This made sense: if somebody else had already done all the work, why waste my time doing it from scratch?
I was able to get hold of Vicon, the company that created the motion capture system, and they said that they actually do have a plugin for Maya. You can't believe how much easier my life was made at that moment: instead of sitting for hours trying to figure out how to write a plugin, I was sitting waiting to be handed one on a silver platter. By the next day I received an e-mail with an attachment… the plugin was here!
My goal this week was to get the plugin set up and running. The first problem was that the Vicon capture system was in a second lab on the first floor, and Maya was running on my machine on the fourth floor. To get the plugin to work, I needed to use both machines simultaneously. Since being in two places at the same time wasn't working out, I had to install Maya on one of the machines in the downstairs lab.
It took about a couple more days to figure out how the data was being read into Maya and how to map the data onto the character in Maya. I spent the rest of the week doing real-time motion capture using a simple stick with 3 markers as my object. By running multiple trials of real-time capture, I was trying to understand the mapping of the data onto the skeleton.
Doing the real-time capture wasn't always fun; most of the time it was frustrating because I didn't understand what was going on. The object in Maya was not moving anything like the real object: if I moved the stick up and down, the Maya object moved diagonally and at a different speed. I decided that the best thing to do was to study the plugin source code and see if I could make sense of things.
By reading through the code I realized that trying to figure out other people's code isn't easy, especially when there are no comments or documentation. To get through the reading I had to guess what some of the functions did. By the end it seemed that some code was missing! There was no function that showed how the data was being communicated between the motion capture system and Maya; I was looking for the function through which the plugin obtains the stream of data from the capture system. This was important because for any future modification of the code (which I was told was probable) we needed access to the actual data.
It turned out that my assumption was correct and the code was missing. Apparently the plugin only handles the initialization of the communication between the capture system and Maya; it then launches a Proxy, a communication program that sits between the capture system and Maya and handles all the data transfer. Again I e-mailed Vicon, this time requesting the source code for the Proxy.
While I was waiting for an answer from Vicon, I went back to doing real-time capture to try to understand why the data was messed up. After several trials the answer was clear. When I created my object in Maya, I had created a hierarchy with three joints to represent the three markers on the stick. However, the data coming from the capture system wasn't hierarchical. I broke up the hierarchy and ended up with 3 independent objects representing the 3 markers on the stick. After that modification, the real-time capture worked.
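A toy example of the mismatch (the numbers and the simple translation-only model are my own illustration, not Vicon's data format): in a joint hierarchy, a child's translation is interpreted relative to its parent, so feeding global marker coordinates straight into hierarchical joints adds the parent's position in twice.

```python
# Toy illustration (my own, not Vicon's format): global marker positions
# fed into a hierarchy get treated as parent-relative offsets, so the
# parent's motion is double-counted.

def world_position(chain):
    """Position of the last joint when each entry is a parent-relative offset."""
    x = y = z = 0.0
    for (dx, dy, dz) in chain:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

marker_global = [(0.0, 1.0, 0.0), (0.0, 2.0, 0.0)]  # two markers, global coords

# Wrong: plug global coordinates into hierarchical joints
wrong = world_position(marker_global)   # ends up at (0.0, 3.0, 0.0), not at the marker

# Right: either break the hierarchy (use each global value directly) ...
flat = marker_global[1]                 # (0.0, 2.0, 0.0)

# ... or convert to parent-relative offsets before feeding the chain
relative = [marker_global[0],
            tuple(c - p for c, p in zip(marker_global[1], marker_global[0]))]
fixed = world_position(relative)        # (0.0, 2.0, 0.0)
```

Breaking the hierarchy (the `flat` case) is exactly what made the stick capture work.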
Even though I was happy that my real-time capture worked, it was actually bad news: in order to do human real-time capture, we need the data to be hierarchical. This was a problem.
Lately the motion capture lab has been very busy. A couple of students work in the lab and help everyone conduct motion capture sessions; they are the ones familiar with positioning the cameras, placing markers on the body, and cleaning up the data after a capture. They were planning to go out of town for three weeks, so everyone started scheduling their capture sessions before the two of them left. This meant I didn't have the freedom to use the lab whenever I wanted. In any case, I finally got my slot, and it was my turn to do a real-time capture, this time using a human as my subject.
This was my first time doing a human real-time capture. It really isn't much different from an object capture; there are only minor changes: the skeleton in Maya changed from an object to a human skeleton, and the default marker files, which describe the number of markers on the subject and the relationships between those markers, were changed from an object marker file to a human marker file.
The hardware wasn't behaving properly: Vicon refused to capture data, so we had to restart the machine. After that the session went fine; however, when I tried to open Maya and map the data onto the human skeleton, I ran into a number of problems. First, Maya was running so slowly that it was reading data at about 5 frames/sec rather than the normal 30 frames/sec. Second, the number of data channels coming from the capture system did not match the number of body segments of the skeleton in Maya: I was expecting about 30 data channels and instead got about 17. That ended my real-time session, and it was back to debugging the source code and figuring out what was going on.
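A sanity check along these lines would have flagged the mismatch before the session started (the six-channels-per-segment figure, 3 position plus 3 orientation, is my assumption for illustration, not something from the Vicon documentation):

```python
# Hypothetical sanity check: compare the channel count delivered by the
# capture stream with what the Maya skeleton needs. The assumption that
# each body segment consumes 6 channels (3 position + 3 orientation)
# is mine, not Vicon's.

CHANNELS_PER_SEGMENT = 6

def check_channels(received_channels, skeleton_segments):
    expected = len(skeleton_segments) * CHANNELS_PER_SEGMENT
    got = len(received_channels)
    if got != expected:
        return f"mismatch: expected {expected} channels, got {got}"
    return "ok"
```

With 5 segments and only 17 incoming channels, `check_channels` would report the 30-vs-17 discrepancy immediately instead of letting the skeleton animate garbage.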
By this time I had received the source code for the Proxy and I spent the rest of the week reading through the code.
Looking back at my spring semester, I remember telling myself that whatever I ended up doing, it would not involve programming. I had taken three CS classes, and they all required programming. Believe me, four months of endless programming wasn't fun. And guess what I'm doing now… programming! Well, at least I'm not writing any code (yet); it's mainly reading code.
After long hours of code debugging, I finally got the general idea of how the Proxy works and how the overall communication between the motion capture system and Maya is done. I was told that it would be nice to print the data values to a file so that they could later be plotted and analyzed. So I modified the code by adding a few output statements, and I was ready to compile the new version of the Proxy and run it with Maya. To my surprise, there were 4 "missing" header files. In other words, a number of header files were used throughout the code, but I didn't have them: they weren't included in the package e-mailed to me, and they weren't standard files either. Another dead end! After a number of e-mails to Vicon, I got an answer saying that they couldn't locate those header files and that they'd keep searching. In other words: "we don't have the files, don't bother us!" I was extremely disappointed. After 5 weeks of building up to where I was, I couldn't go any further because of 4 missing files.
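For the record, the data-dumping change itself was simple; a sketch of what those added output statements amounted to (the file name and the list-of-floats frame layout are my own assumptions, not the Proxy's actual format):

```python
import csv

# Hypothetical sketch of the logging added to the Proxy: append each
# frame's channel values to a CSV file so they can be plotted later.
# The frame layout (a list of floats per frame) is an assumption.

def dump_frames(frames, path="capture_log.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for frame_number, values in enumerate(frames):
            writer.writerow([frame_number] + list(values))
```

Each row is one frame: a frame number followed by that frame's channel values, which plots easily in any spreadsheet or plotting tool.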
At this point, my options were:
1. Do reverse engineering and write 4 new replacement header files.
2. Rewrite the Proxy program (not possible in 5 weeks).
3. Move on to a different, yet related, project.
Option three sounded the easiest, but I wasn't satisfied. I had worked long on the project and I wanted to accomplish something. I went back to the source code and started searching for header files with names similar to the missing ones. It turned out that all of those files were actually standard header files whose names had been modified. My next idea was to go through the code and replace the missing header files with the standard files. This of course required some modification of the code, which wasn't hard but was time-consuming.
It was now time to compile the code . . . 0 errors and 20 warnings! This was a good sign, but the real test was to try the new Proxy with Maya. I managed to find the motion capture lab empty and ran the programs . . . IT WORKED! Well, at least it ran, but more thorough testing is needed. I'm back in business!!!!!
Six weeks have gone by so far, and there's still a lot to do. I've left behind the programming and debugging; now I'm dealing with the math part of the project. So far I have figured out how to capture the data and input it into Maya; now comes the mapping of the data. How do I map the data to the skeleton in Maya? During week 5 I ran into some problems with the number of available body segments: the skeleton generated in Maya has more body segments than the default skeleton in Vicon. I still have to modify the default skeleton to include the extra body segments. I'm leaving this for the upcoming weeks.
The current situation is this:
Tarsus (the Vicon software) outputs the position and orientation of each individual body part relative to the global coordinate system. The skeleton in Maya expects joint angles for the different joints linking the body parts. The problem now is how to convert from position and orientation to joint angles! I needed background information on angle representations and inverse kinematics. The lab had a number of math and graphics books, and I started reading.
I read about the fundamentals of inverse kinematics and the different angle representations (angle/axis, Euler, quaternion). The problem is a bit more complex than I expected, and I need to sit down and discuss it with Alla (a grad student in the lab). Since it's the 4th of July weekend, most people have left early, so we'll have our little talk hopefully on Monday. Also, my mom and my sister are coming to visit for the weekend, so I doubt I'll be tackling this problem before next week.
This whole week was spent modifying the code so that the plugin outputs joint-angle information instead of position/orientation information for the different body parts. It wasn't very difficult: given the current position and orientation of any two linked body parts, represent their orientations as quaternions and find the difference between the two quaternions; the result gives the angle for the joint linking the two body segments (more detailed information is on the Project page).
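The quaternion-difference step can be sketched like this (the (w, x, y, z) tuple convention and the function names are my own illustration, not the plugin's actual code):

```python
import math

# Quaternions are tuples (w, x, y, z). These are illustrative helpers,
# not the plugin's actual code.

def quat_conjugate(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    # Hamilton product of two quaternions
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def joint_angle(q_parent, q_child):
    """Angle (radians) of the rotation taking the parent segment's
    orientation to the child segment's orientation."""
    w, x, y, z = quat_multiply(quat_conjugate(q_parent), q_child)
    # atan2 is numerically safer than acos(w) near 0 and pi;
    # abs(w) picks the shorter of the two equivalent rotations
    return 2.0 * math.atan2(math.sqrt(x*x + y*y + z*z), abs(w))
```

For example, a child segment rotated 90 degrees about the x axis relative to its parent, q_child = (cos 45°, sin 45°, 0, 0) against an identity parent, gives a joint angle of π/2.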
So I modified the code to calculate the joint angles for all the joints in the body and to output the new data to Maya. After the modification I needed to test the code. At first, instead of reading the data coming from Tarsus, I hardcoded values to be sent out to Maya. For example, I would hardcode the knee joint to always be at 90 degrees, and then (assuming the program works correctly) I should see the character in Maya bending its knee at 90 degrees. This type of debugging took about a couple of days. Then it was time to input the real data from Tarsus, and this is where I ran into problems. The motion of the character looked nothing like the captured motion; the joints of the character were just running wild!! I decided it might be a problem with the emulator, a program that takes a motion data file as input, reads through the file cyclically, and outputs the data to client programs as if the data were real-time capture data. At this point the error could be due to two things: 1) my code, or 2) the emulator. My goal next week is to check the correctness of the emulator.
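The hardcoding trick itself is tiny; a sketch (the channel names and the dict-per-frame layout are invented for illustration):

```python
import math

# Debugging sketch: override one joint channel in an outgoing frame
# with a known constant, so the expected pose in Maya is unambiguous.
# Channel names are invented for illustration.

def with_fixed_joint(frame, channel, radians):
    patched = dict(frame)          # leave the original frame untouched
    patched[channel] = radians
    return patched

frame = {"left_knee": 0.21, "right_knee": 0.82}
debug_frame = with_fixed_joint(frame, "left_knee", math.radians(90))
```

If the character's left knee then isn't bent at 90 degrees on screen, the bug is in the mapping code, not in the incoming data.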
Fixing the emulator wasn't going to be fun, especially since I had no information on how it functioned; all I had were the instructions on how to run it. I sat down with Alla to discuss different ways of testing the emulator and what would happen if I couldn't fix it. I decided to go back and re-read the instructions to see if I had missed anything. It turned out that I wasn't performing all the steps correctly. The problem was very simple: I wasn't copying the correct file into Vicon's system directory. This file was the camera calibration file (.cp), which contains information about the calibration of the cameras used during the capture and is needed to process the motion data file correctly and reconstruct the 3D data. After realizing the importance of this file and copying it into the appropriate directory, the emulator ran beautifully.
Now that the emulator was working, I had one more problem to tackle: the hands and the feet. The hands and feet of the character in Maya were not being animated correctly; the values being read into Maya for those body parts looked wrong. I didn't have much time to fix this problem, so I'm hoping to work on it during my last week.
Week 10: SIGGRAPH 2002!
This whole week I was in San Antonio attending the SIGGRAPH conference on graphics and animation. This was no small event: about 20,000–25,000+ people attended this year's conference. It's a wonderful gathering for seeing the latest in computer graphics, and there is a wide range of activities to attend, such as courses, papers, and panels. Also, for 3 days there is an exhibit where various companies display their latest products and innovations. SIGGRAPH is a great place to meet people from different backgrounds and specialties in graphics, and it's an excellent place to build those connections!
Last three days:
These are my last 3 days, and basically I'm just wrapping things up. I have to document everything that I've done and write a simple guide to doing real-time motion capture. I'm also testing my program by doing an actual real-time capture. I tried the capture yesterday, and unfortunately the results weren't that good. I think the problem was with the Vicon capture system, because it kept crashing, and most of the time it didn't capture all of the markers on the body. I'll give the capture one more round and hopefully it will work this time. I hope my code actually works, because otherwise I have no time to fix it!!!
It has been a fun experience, especially since we don't do motion capture at my university (the University of Utah). I wish I had more time to finish things completely, but what can I do? My flight leaves Thursday morning!!!!