JOURNAL
The first thing we did was get CITI certified (citiprogram.org). This certification ensures that we are familiar with the basic procedures and ethics of conducting research.
We were given a list of projects to choose from by Dr. Gilbert. We had a few days to think about the projects and make a decision. This week, we also got the chance to meet many of the people who work in the lab, from graduate students pursuing master's degrees and PhDs to many postdocs.
Finally, I decided to work on the "Cable Labs" project led by Dr. Kyla McMullen and the "Distracted Driving" project led by Dr. Wanda Eugene.
I took lessons on Codecademy.com so I could learn how to make a website (it took about a week to finish). This website documents my progress.
The other DREU participants and I finally met Dr. Gilbert in person, along with Dr. McMullen. Dr. Gilbert gave us a very informative talk on grad school.
I think he convinced me to pursue graduate school.
Dr. Eugene gave us material to review to get us caught up with the distracted driving study.
This material includes a 150+ page report on a study by Texas A&M dealing with distracted driving.
For this study, I'll be helping with screening participants, but most importantly, I get to program the LED lights with an Arduino Uno (microcontroller). The drivers will respond to the pseudorandom flashing of the LED lights that we will program. I've never done any programming outside of console applications, so it's very exciting to apply code to "real world" applications like making lights flash!
In regards to the Cable Labs project, I downloaded the SDK (software development kit) for the Kinect 2.0. Now I can start writing code for the Kinect! We'll be writing this code in C#, which I've never used before. I'm now on the gestures study team, and we'll meet every Monday. With the help of online forums, I was able to write a "Hello, World!" console application with the Kinect. I also got the Kinect to recognize my hand being open and closed, thanks to El Bruno's forum! Making great progress so far!
I finally understand what an API is!
I wrote some code to pseudorandomly make a light blink, which I'll apply to the Adafruit LED light strip.
June 2
Meeting w/ Dr. Eugene and Dr. Remy for LED lights.
We have a meeting this week with Dr. Remy and Dr. Eugene to set up the LED lights for the distracted driving study. These lights will flash pseudorandomly. We are currently using the OWI motor controller as a means of controlling the LED lights. It's been cool working with the robot arm, which is also controlled by the OWI motor controller; working with the arm is getting us familiar with the API, so we'll have a good understanding when we start working with the LED light strips. I wrote some quick code to make the lights flash pseudorandomly. A few changes to this code will make it work for the LED lights, but at least I have the concepts down. I still need to read the Texas A&M study (100+ pages).

I also have a meeting with the gesture recognition team. We'll begin programming in C# using the Kinect SDK, writing code that recognizes different gestures performed by a user. We watched and evaluated a bunch of clips for the gesture study and decided on a generalized gesture for each command. We'll conduct a reverse study of the first gesture study.
We're dropping the OWI motor controller as the means of controlling the Adafruit NeoPixel Digital RGB LED light strips. The OWI motor controller can't send the right signals to control the strip how we want: the strip can't be controlled with a simple on/off power supply, because each individual pixel has to be addressed. There is an Adafruit NeoPixel library that we can use with an Arduino Uno microcontroller.
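The pseudorandom flashing can be separated from the hardware, which makes it easier to check before it ever touches the strip. Below is a minimal sketch of just the scheduling logic in plain C++ (Arduino code is essentially C++); the function name, pixel count, and timing range are my own assumptions, not our actual study parameters. On the Uno, each entry would drive the real Adafruit NeoPixel calls (setPixelColor() and show(), followed by a delay).

```cpp
#include <cstdint>
#include <random>
#include <vector>

// One flash event: which pixel to light, and for how many milliseconds.
struct Flash {
    int pixel;
    int onMs;
};

// Build a pseudorandom flash schedule. The scheduling is plain C++, so it
// can be tested off-device; on the Arduino each entry would become a
// setPixelColor()/show()/delay() sequence on the NeoPixel strip.
std::vector<Flash> makeFlashSchedule(int numPixels, int numFlashes,
                                     int minOnMs, int maxOnMs, uint32_t seed) {
    std::mt19937 rng(seed);  // fixed seed -> the same "random" run every time
    std::uniform_int_distribution<int> pickPixel(0, numPixels - 1);
    std::uniform_int_distribution<int> pickOnTime(minOnMs, maxOnMs);

    std::vector<Flash> schedule;
    schedule.reserve(numFlashes);
    for (int i = 0; i < numFlashes; ++i) {
        schedule.push_back({pickPixel(rng), pickOnTime(rng)});
    }
    return schedule;
}
```

Seeding from a fixed value makes each participant's "random" flash sequence reproducible, which is handy when analyzing driver responses later.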
This week I've worked with the Arduino and downloaded its development environment to program the LED strip. The API isn't working too well at the moment.
This week's tests have failed. The program doesn't seem to be running on the lights, and we have a lot of troubleshooting to do. There are many possible problems, and none of us are experienced with the Arduino and LED lights, so this may take a while. We drove with Dr. Eugene to look at a possible place to conduct the distracted driving study. I also worked on coding more gestures. With the help of a forum, I was able to get a simple Hello World app going with the Kinect, and I've been able to program a simple open-and-close gesture recognition program.
Coded more gestures: channel up and channel down (swiping left and right). I'm learning more C# and XAML; they are very cool.
Texting while driving is now illegal in the state of South Carolina, so the distracted driving study will be a little more difficult.
It seems like the Arduino is not going to work!
The Arduino and LED strip are not working as they should. We've wasted a lot of time trying to get this to work. This part of the experiment is very frustrating.
Finally got the Arduino working with the LED strip!
Dr. Eugene says the strip was burned out; we wasted a lot of time troubleshooting a broken strip.
Now that the strip works, we can do everything! This feat is a very big weight off of our shoulders.
We have the program written to pseudorandomly flash the lights.
We now have the Arduino controlling 3 strips, which is what it will do for the actual study.
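Addressing three strips from one Uno mostly comes down to bookkeeping: each strip is its own object on its own pin, and one loop decides which strip a given flash lands on. A hedged sketch of that indexing, in plain C++; the pixels-per-strip value and the helper name are assumptions for illustration, not our actual wiring.

```cpp
#include <utility>

// Map a single global pixel index onto (strip number, pixel on that strip),
// assuming identical strips of `pixelsPerStrip` pixels each. On the Uno,
// each strip would be its own Adafruit NeoPixel object on its own data pin,
// and the first element picks which object to call setPixelColor() on.
std::pair<int, int> locatePixel(int globalIndex, int pixelsPerStrip) {
    return {globalIndex / pixelsPerStrip, globalIndex % pixelsPerStrip};
}
```

With this mapping, the same pseudorandom schedule can treat the three strips as one long run of pixels.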
We did a test run in the lab with the script to get a better feel for the time of the experiment.
Our gesture study 2 is looking good: we hung up flyers, we have interested participants, and I've edited all of the videos for our study.
The Cable Labs meeting was short; a lot of people are out of town this week at conferences and whatnot.
There are only four weeks left!
It's the week of the Fourth of July. Time has gone by really fast.
We've soldered the LED light strips to ensure a good connection. We've continued planning for our gesture study 2, which will take place next week.
This was a short week considering the Fourth was on Friday and people didn't go in on Thursday. I've really been reflecting on my great time at Clemson as I see the finish line approaching. It's been a great experience so far.
We finally started conducting Gesture Study 2. I'm still working on touching up a few programs to recognize some gestures; Chris says my gestures need to be a little more robust. Our presentations are next week, so I've been preparing: I made an outline and am drafting the main points to focus on. I have a whole summer's worth of work to talk about. We've also started analyzing Gesture Study 2's data. Now I can say I helped design and execute a research study.
We conducted the distracted driving study this week. The study was a pilot, and I learned how important pilot studies are when conducting research. A lot went wrong, but we'll be able to fix it so that the actual study runs more smoothly.
We needed a cell phone signal to send text messages, but the signal was pretty bad on the road we were on. The lights didn't work 100%, but we adjusted. I liked reading the script and giving the participants instructions. We are done with our presentations; the more I prepare, the better I get at presenting. The audience didn't really ask questions, which I would have found useful. I'm about to watch two dissertation defenses. A defense is the presentation where a PhD candidate presents their research to a committee that determines whether the candidate becomes a doctor. I feel like Gesture Study 2 could have been done with more detail.
This summer went by very fast. We completely cleaned up the lab since the HCC lab is moving to the University of Florida. I no longer have access to the Kinect since it's been packed up to go to Florida; I'm thinking about buying my own Kinect sensor so I can keep making and testing programs using the Kinect SDK. Our studies are done and it's time to say our good-byes. It's been a great summer filled with challenges and learning. I know a lot more about graduate school, and I will most likely be attending graduate school.