Weekly Journal

Here I will be updating my weekly progress and my general thoughts while at UC Berkeley. Enjoy the read!

 

Week 1: May 20 – May 26

What a week! I drove up from El Centro, CA to UC Berkeley and that was QUITE the drive. It did give me time to sing to myself in the car though, so I guess I can’t complain. I moved into my room in International House on campus and it has a great view! There is no kitchen, but I have some swipes for the cafeteria for the summer so I have that covered.

This week I met with my mentors: Prof. Ruzena Bajcsy and Prof. Erickson R. Nascimento. You can learn more about them on the Meet My Mentors page. They’re very accomplished and I am honored to be working with them over the summer! The week was mostly spent getting all the paperwork and logistics covered. I didn’t have a UC Berkeley card or email (both of which I need to make any progress here). Luckily, that was all done this week, after two days of chasing down people to figure out the process. Other than that, Prof. Erickson gave me two papers to read to get acquainted with the literature behind the project.

 

Week 2: May 27 – June 2

¡¡¡Estas soooon las mañaniiiitas que cantaaaaba el Rey Daviiiiid!!! (That’s “Las Mañanitas,” the traditional Mexican birthday song: “These are the little mornings that King David used to sing.”) It was my 21st birthday on May 27 and it was my father’s birthday the day before! I spent the weekend back in El Centro, CA to celebrate with family, but promptly returned to UC Berkeley to keep on working!

This week I set up the simulator that we will be using to examine the psychoacoustics of an autonomous drive. I got Grand Theft Auto V and set up all the necessary files to run the game with an autonomous vehicle. It was very interesting to find out that I would be using a video game to perform research, but that’s why I’m here: to learn! I started to play around with the existing program made by Prof. Erickson to gain more control of the environment. So far, I have been able to change the radio stations in the game as well as add my own .mp3 files to listen to on the “Self-Radio” station in-game. I usually work in Python and Java, but all of these files are in C++! I’ve worked with C before, but it was a while ago, so this is an adjustment (using Visual Studio, relearning error codes, etc.). Luckily I’m all caught up with the C++ lingo now.

 

Week 3: June 3 – June 9

This week I continued my work updating the environment into something that better represents what we want. I successfully added pedestrians into the environment and can make them speak on command! This is very important, as we will want to use audio and visual data to find out where a noise is coming from in the streets. This week I also started to use Scrum with Prof. Erickson. I had never used this sort of format, so it was interesting to start, but it seems to be a very productive way of organizing tasks. We will soon start using the environment I made to train an autonomous car to lower the acoustic annoyance during a drive using speed changes, window state (open or closed), and radio station changes. Prof. Erickson also told me that he might want my help on another project he is working on that deals with computer vision. More work to do is always a good thing, for science!

 

Week 4: June 10 – June 16

Things just got upgraded! This week, while I was working on the simulation using Deep-GTAV to add wind noise and simulate a window being open or closed, I was told to drop that work for now since we will be working on an actual vehicle! This is more of what I thought I would be doing in the smart vehicles project, so needless to say this was exciting to hear! We came up with an experimental plan, and next week we will meet with the owner of the vehicle to start our trials to capture audio and visual data. We will most likely be doing our experiments on site at the Richmond Field Station, home to PATH, which houses UC Berkeley’s large-scale transportation technology research. We will be using one of the Berkeley DeepDrive vehicles to gather data. Very exciting!

 

Week 5: June 17 – June 23

During this time, my advisor Prof. Erickson is in Salt Lake City, UT for the CVPR conference. While he is away, he sent me to speak with the owners of the vehicles at Berkeley DeepDrive this week. I met with Chen-Yu Chan and Long Xin to discuss the project and whether there were any restrictions on what we could mount in the vehicle or where we could drive it. Everything went well and we are planning our first experimental drive! Apart from that, I have been reading the tutorial on how exactly the vehicle works, since we have to start up the data collection before we turn on the vehicle and we have to use ROS (Robot Operating System), which isn’t exactly an operating system but rather a set of software libraries and tools that help you build robot applications. I have also been tinkering with the simulation environment to see what else we can add to the simulator.
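To make the ROS idea a little more concrete, here is a minimal sketch of the publish/subscribe pattern that everything on the vehicle is built around. This is not our actual collection code; the node and topic names are just placeholders, and I’m writing it in Python (rospy) since that’s the language I’m most comfortable in.

```python
# Minimal ROS subscriber sketch (rospy). The topic name "chatter" and the
# String message type are placeholders, not the vehicle's real topics.
import rospy
from std_msgs.msg import String

def callback(msg):
    # ROS calls this function every time a message is published on the topic.
    rospy.loginfo("heard: %s", msg.data)

def main():
    rospy.init_node("listener", anonymous=True)    # register with the ROS master
    rospy.Subscriber("chatter", String, callback)  # subscribe to a topic by name
    rospy.spin()                                   # keep processing callbacks until shutdown

if __name__ == "__main__":
    main()
```

Each sensor on the car publishes to its own topics, so recording a drive is essentially subscribing to all of them and writing everything to disk, which fits with having to start the data collection before turning on the vehicle.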

 

Week 6: June 24 – June 30

Prof. Erickson has returned and we are now refining our experimental design for our first drive in the car! We also got a high school student to help us with the simulator while we work on the physical car experiment. His name is Frank, and I was given the job of showing him around the simulator and allocating some work for him to do. He seems like a bright kid and, although initially confused, eager to start helping and learning AI techniques. This week we finally took the car for an initial drive to see how the data is captured and what complications might arise. The car itself is a Lincoln MKS (fancy!) and it collects all sorts of data, but we ran into some connectivity issues during our first drive. We are trying to see how we can control for any sort of issue and how we can attach a microphone inside and outside the vehicle without removing any of the cameras already attached to the computer. We collected the data from our first drive in rosbag format, something I’ve never worked with before, but I’m here to learn! The files are quite large: with only about 7 minutes of driving, we collected about 1.6 GB of data.
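For anyone curious, a rosbag is basically a log of every message published on the topics you chose to record, all timestamped and stored in one file. Here is a small sketch of how a bag can be inspected with the Python rosbag API; the file name is just a placeholder, not our actual recording.

```python
# Sketch: peek inside a recorded bag with the rosbag Python API.
# "first_drive.bag" is a placeholder file name.
import rosbag

with rosbag.Bag("first_drive.bag") as bag:
    # List every topic that was recorded, its message type, and message count.
    info = bag.get_type_and_topic_info()
    for topic, details in info.topics.items():
        print(topic, details.msg_type, details.message_count)

    # Messages can also be iterated directly; here we just count them all.
    total = sum(1 for _ in bag.read_messages())
    print("total messages:", total)
```

With several cameras publishing many large image messages per second, it’s easy to see how a short drive balloons into gigabytes.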

 

Week 7: July 1 – July 7

This week I continued to help Frank with the simulator as he is new to the framework. I showed him the ropes and have been advising him as he continues to help us add features to the simulator. During this week I learned the ins and outs of ROS and how to use it. As it turns out, it only works on Ubuntu, so I learned about virtual machines and how to use them in order to work with the data that we had. I found out that we were having some interruptions with the LIDAR signal, which was NOT good, and that we were only capturing data from 3 of the 8 cameras. We obviously would rather collect all the data we possibly can. Next week will be more experimental driving, but we will be collecting all the camera data along with all the possible vehicle data such as acceleration, throttle, braking, turn signals, etc. Read on to see how we got along!
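Finding those dropouts was mostly a matter of looking at the timestamps inside the bag. Here’s a rough sketch of the kind of check I mean; the bag name, topic name, and expected rate are placeholders for illustration. If the gap between consecutive LIDAR messages is much larger than the expected publishing period, the signal was interrupted, and a camera topic with zero messages simply never made it into the recording.

```python
# Sketch: look for gaps in a sensor topic's timestamps inside a rosbag.
# The bag name, topic name, and expected rate are illustrative placeholders.
import rosbag

EXPECTED_PERIOD = 0.1   # e.g. a 10 Hz sensor should publish roughly every 0.1 s
GAP_FACTOR = 3.0        # flag anything more than 3x the expected period

with rosbag.Bag("drive.bag") as bag:
    last_time = None
    for _, _, t in bag.read_messages(topics=["/lidar_points"]):
        now = t.to_sec()
        if last_time is not None and now - last_time > GAP_FACTOR * EXPECTED_PERIOD:
            print("gap of %.2f s ending at t=%.2f" % (now - last_time, now))
        last_time = now
```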

 

Week 8: July 8 – July 14

What a week of driving it has been. At first, our goal was to take the car out on Monday, but we were not provided with the necessary key to enter and start the car, so we had to wait until the next day to start collecting. By Tuesday we were out driving the vehicle, with some very shoddy results as we were still getting used to the process of starting and using the vehicle. By Wednesday we had a much more structured way of going about it, and we were able to drive within the Richmond Field Station to collect some more data while controlling as many variables as we could.

Then Thursday rolls around and we decide to take the car out into the city to get much more interesting data than an empty park in the RFS. The problem is, during our drive (to and from Berkeley) the GoPro decided to stop collecting the inside audio, WHICH IS THE MOST IMPORTANT PART! That was really annoying, so we took a quick break before we came back and did the whole ~30 minute drive once again. Thankfully the second run-through was better, but by then we were running low on gas, so we couldn’t take the car out of the RFS anymore. Instead, we opted to collect some data on the dirt roads inside the facility, which ran with no issues.

Finally, we are back in the lab on Friday analyzing the annoyance levels of the sounds, comparing the inside/outside data, etc. As expected, we are finding that the outside data is MUCH more annoying, as it is mainly wind noise (even with a dead cat windscreen on the mic). By next week we hope to have analyzed all the data collected this week.
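The real annoyance analysis follows the psychoacoustics literature, but even a crude level comparison makes the inside/outside difference obvious. Here’s a rough sketch of that kind of comparison; the .wav file names are placeholders, and an RMS level in dB is only a stand-in for a proper annoyance measure.

```python
# Sketch: compare overall levels of the inside vs. outside recordings.
# This computes a plain RMS level in dB (arbitrary reference), which is only
# a crude proxy for psychoacoustic annoyance. File names are placeholders.
import numpy as np
from scipy.io import wavfile

def rms_db(path):
    rate, samples = wavfile.read(path)
    samples = samples.astype(np.float64)
    if samples.ndim > 1:                  # mix stereo down to mono
        samples = samples.mean(axis=1)
    rms = np.sqrt(np.mean(samples ** 2))
    return 20 * np.log10(rms + 1e-12)     # arbitrary reference, fine for comparing

inside = rms_db("inside_mic.wav")
outside = rms_db("outside_mic.wav")
print("inside: %.1f dB, outside: %.1f dB" % (inside, outside))
```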

 

Week 9: July 15 – July 21

We started to analyze all of our data and, as we expected, we are seeing that things become more annoying when the windows are down and the car goes faster! Picture yourself in your car with the windows open, driving down a nice calm neighborhood road. Chances are you are going about 25 mph, you can feel the nice wind on your face as you go by, and the noise is not so jarring. Now imagine yourself in that same car with the same conditions in a new environment: a speedy highway. Now you will be going between 65 and 75 mph, and that same calming wind turns into a harsh gust, with the deep sound of the wind and road overpowering the sound inside your car. This week we also used two more days for driving, but this time we took Frank along for the ride and we got some GREAT data with all of us talking (either conversationally or over each other) in English, Spanish, Portuguese, and Mandarin. We might see some different PA levels in different languages, but that’s what we will be analyzing next week!
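The speed-and-window effect shows up clearly once you line the audio level up against the vehicle’s speed log from the bag. Here is a small sketch of that alignment, assuming per-second audio levels and speed readings have already been extracted; the function and variable names are just for illustration.

```python
# Sketch: how strongly does the cabin level track vehicle speed?
# Assumes per-second audio levels and speed readings were already extracted
# from the recordings; names here are illustrative only.
import numpy as np

def speed_level_correlation(audio_level_db, speed_mph):
    """Pearson correlation between per-second cabin level and vehicle speed."""
    audio_level_db = np.asarray(audio_level_db, dtype=float)
    speed_mph = np.asarray(speed_mph, dtype=float)
    n = min(len(audio_level_db), len(speed_mph))   # trim to the shorter log
    return np.corrcoef(audio_level_db[:n], speed_mph[:n])[0, 1]
```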

 

Week 10: July 22 – July 28

This is my last official week with Prof. Bajcsy under the DREU scope, but I will still be here for an additional two weeks helping with the project and data annotation. As of now we have finally gathered all of our data and have started the annotation. We are drawing bounding boxes over the faces of whoever is speaking at any given time in the data set. This is so that we can help train the agent to recognize where a specific sound (and annoyance) is coming from inside the vehicle.
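Concretely, each annotated frame boils down to a box plus a speaker label. Here’s a tiny sketch of what drawing one of those boxes looks like with OpenCV; the file name, coordinates, and label are made-up values for illustration.

```python
# Sketch: draw a speaker's bounding box on a video frame with OpenCV.
# The frame file, coordinates, and label are placeholder values.
import cv2

frame = cv2.imread("frame_000123.png")

x, y, w, h = 410, 120, 96, 110   # top-left corner plus width and height
label = "speaker_1"

cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.putText(frame, label, (x, y - 8), cv2.FONT_HERSHEY_SIMPLEX,
            0.6, (0, 255, 0), 2)

cv2.imwrite("frame_000123_annotated.png", frame)
```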