In the penultimate week of my summer research, I invested a good amount of time looking into alternative gesture classification methods. Using past work our postdoc, Naomi, had done during her PhD as an example, I wrote a Python script to preprocess our IMU gesture data. Instead of raw data, I now have a general CSV with extracted features as attributes. I then used this processed data to train SVM and decision tree models (using scikit-learn). To our excitement, these models did much better than the previous real-time HMM models! After implementing k-fold cross-validation and plotting the results as confusion matrices, both models classified 8 of our 10 gestures with 100% accuracy (the other two gave them some trouble, but were still above 80%). This is a great basis to work from going into my final week. Hopefully I will collect more data from other lab members to build more robust models.
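For anyone curious what a pipeline like this looks like, here's a minimal sketch with scikit-learn. The feature set and data here are made up for illustration (my actual script's features follow Naomi's earlier work and differ in detail):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def extract_features(window):
    """Turn one IMU recording (n_samples x 6: accel xyz + gyro xyz)
    into a fixed-length feature vector of per-channel statistics."""
    return np.concatenate([
        window.mean(axis=0),  # average of each channel
        window.std(axis=0),   # spread of each channel
        window.min(axis=0),
        window.max(axis=0),
    ])

# Toy stand-in data: 40 recordings of 2 fake gestures
rng = np.random.default_rng(0)
X = np.array([extract_features(rng.normal(loc=i % 2, size=(50, 6)))
              for i in range(40)])
y = np.array([i % 2 for i in range(40)])

# Same two model families, each scored with 5-fold cross-validation
for model in (SVC(), DecisionTreeClassifier()):
    scores = cross_val_score(model, X, y, cv=5)
    print(type(model).__name__, scores.mean())
```

The nice thing about this setup is that swapping in a different classifier is a one-line change.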
Progress has been made! My partner and I used the Wii remote to make a dataset of our proposed gestures — we each recorded our set of motions about ten times. I then fed the dataset into my HMM script and generated confusion matrices to see how the gestures and classifier performed. The results revealed some potential issues: the gestures may be too similar, the dataset too small, or the classifier not accurate enough. Nonetheless, this is good feedback to work from and will keep me busy in my last few weeks!
My partner and the other REU students working in my lab for the summer all leave this weekend, so I’ll be continuing the project solo for my last few weeks.
This week was a bit shorter because we spent Friday going to Catalina Island 🙂 I was able to tag along with the SURE and REU students that are working in my lab and other robotics labs at USC for the summer. The trip was a lot of fun – we got to tour USC’s research center at Two Harbors and get a taste of the sustainability work they are doing. After that, we took a refreshing dip in the ocean and kayaked around, taking in the clear blue water. To top it all off, we saw a pod of dolphins surfing the waves of our boat on the way to Catalina and the way back!
My partner succeeded in being able to generate CSV accelerometer and gyroscope data from a Wii remote. We brainstormed together and came up with potential movements that can be done with the remote.
I’m also starting to look at other classification models that can be used in addition to HMM, such as Dynamic Time Warping. To do that, I’m looking at the Gesture Recognition Toolkit (GRT) on GitHub, but I have been having trouble building it and running the GUI. So, that will be something to work on in the future as well.
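The core idea behind DTW is simple enough to sketch without the toolkit: compute an elastic distance between two sequences (allowing one to stretch or compress in time), then label a new gesture with its nearest training example. This is a bare-bones version with no windowing constraints, unlike the optimized implementation in GRT:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two multivariate sequences
    (arrays of shape (len, n_channels)), via the classic DP recurrence."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            # Best of: step both sequences, or stretch one of them
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, templates):
    """Nearest-template classification: templates is a list of (label, sequence)."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]
```

One appeal over HMMs is that DTW needs no training phase — a handful of template recordings per gesture is enough to start classifying.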
This week started off on a great note – I was notified that my research was accepted to the Grace Hopper Poster Session! That will definitely be something to look forward to for the fall.
We finally settled on a gesture set, so now once my partner finds a good way of collecting IMU data from the Wii remote we will be ready to go and can start collecting data to test out the HMM model accuracy.
This week I finally got to meet my professor, Maja! There are several REU/SURE students in my lab with me. On Monday afternoon we all met with Maja and discussed applying to grad school over lunch. It was super insightful; Maja has lots of wisdom to share. Later in the week, we had a lab meeting where all the visiting undergrads gave a short presentation of their research. I learned about confusion matrices and quickly made one for my HMM classifier to present — it was a nice cumulative experience and great to hear more detail about what everyone else is working on.
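If you haven't run into confusion matrices before: each row is a true label, each column a predicted one, so the diagonal counts correct classifications and everything off-diagonal shows which gestures get confused for which. scikit-learn makes one in a single call (the gesture names and predictions below are made up for the example):

```python
from sklearn.metrics import confusion_matrix

# Made-up true vs. predicted gesture labels
y_true = ["wave", "wave", "point", "point", "nod", "nod"]
y_pred = ["wave", "wave", "point", "nod",   "nod", "nod"]

labels = ["wave", "point", "nod"]
cm = confusion_matrix(y_true, y_pred, labels=labels)
print(cm)  # rows = true label, columns = predicted label
```

Here one "point" was misclassified as "nod", which shows up as the single off-diagonal count.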
It’s been difficult to find research papers that detail exactly what gestures are common and/or essential for learning in a classroom. There is, however, a lot of information out there about the importance of gestures for language acquisition. So, I’ve been piecing together information I find and we will try to come up with our list of gestures we want the robot to do soon.
This week I continued to work on my HMM gesture classifier. It’s been a little difficult to work with my script as the package I am using is very slow — it takes about an hour to run my script on all the training data that I have. I am also starting to read up on literature about different gestures that are important for education. Once we decide which gestures are most important for the robot to use, we can hopefully generate data of our own for me to play with.