Week One

This first week we were introduced to the UCSD campus and the San Diego Supercomputer Center. We were given a program to modify and run on the Blue Horizon (the supercomputer). We were also introduced to the projects. There are four interns working here for the summer, and four projects were presented to us. All of these projects involve trying to get the best possible performance out of the supercomputer. The main goal of one project is to simulate computers with different cache sizes and see which architectures would give a select group of applications the best performance. Another project involves looking at the scheduling algorithm of the Blue Horizon and analyzing ways of giving the computer better performance. Another involves putting together a user survey and figuring out a way to encourage people to estimate more accurately how long their applications will take to run.

This week we met Jeanne (our mentor, a professor of Computer Science at UCSD), Allan (the head of the projects here at the San Diego Supercomputer Center), Beth, Nicole, Laura, and Cynthia. Allan also had me meet with Steve Cutchin to talk about a project that had to do with graphics.



Week Two

Steve gave me a book, Advanced Animation and Rendering Techniques: Theory and Practice, to read. I was supposed to go through it and find something that looked interesting to work on as a project. Some of the areas I'm interested in are volume rendering, radiosity, and caustics.

Jeanne (my mentor) also arranged for me to meet with Henrik Wann Jensen, a computer science professor at UCSD, and I talked with him about possible projects as well. One project he proposed involved tone mapping; my part of the project would be to work on the problem that introducing a bright light produces when tone mapping is being used. Another project he proposed was to use my current ray tracer to implement the technique described in Temporally Coherent Interactive Ray Tracing, a paper by William Martin, Erik Reinhard, Peter Shirley, Steven Parker, and William Thompson. The technique is a hack to make ray tracing interactive by rerendering only the parts of the scene that have moved.



Week Three

I added the project description, personal information, and this weekly journal to my webpage. I met with Steve and we decided for sure what my project would be: researching photon mapping. I read Realistic Image Synthesis Using Photon Mapping by Henrik Wann Jensen. I also worked with my current ray tracer and eliminated some problems I had been having with it, including one that was causing artifacts in my implementation of glass. I still want to understand photon mapping a little better before I try to implement it. We interns also got accounts at the library today. Now we can check out books!



Week Four

I worked a lot with my ray tracer. I tried getting shadow rays and path tracing to work. For the shadow rays I had to shoot rays toward the light. It took me a while to figure out how to hit a random point on a triangle light source and make sure each point on the light has the same probability of being hit. I finally developed a foolproof method. I took three random values between zero and one and multiplied them by the dimensions dx, dy, and dz of the triangle, where dx, dy, and dz are the distances between the maximum and minimum x, y, and z values occupied by the triangle. I then added the three values to the minimum x, y, and z values of the triangle and shot a ray at that point. If the ray didn't hit the triangle, I subtracted the values from the triangle's maximum instead. If the triangle still wasn't hit, I calculated a random point on the triangle by taking a random point on the line between two vertices and then a random point between that point and the third vertex. The reason I didn't use only this last method is that it doesn't give each point on the triangle the same probability of being hit: if a point is closer to the third vertex than another, it has a greater probability of being chosen. (A sketch of the standard way to sample a triangle uniformly appears below.) I have some pretty good looking soft shadows now. These shadows will be used in combination with photon mapping in order to get full global illumination of an image. Here is a link to some of the images I've generated.
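For reference, the standard way to pick a point uniformly on a triangle is a square-root warp of two random numbers, which avoids the bias of the edge-then-vertex scheme. Below is a minimal C++ sketch; Vec3, its operators, and uniformRandom() are hypothetical stand-ins rather than code from my ray tracer.

    #include <cmath>
    #include <cstdlib>

    struct Vec3 { double x, y, z; };

    Vec3 operator+(const Vec3& u, const Vec3& v) { return {u.x + v.x, u.y + v.y, u.z + v.z}; }
    Vec3 operator*(double s, const Vec3& v) { return {s * v.x, s * v.y, s * v.z}; }

    double uniformRandom() { return std::rand() / (RAND_MAX + 1.0); }

    // Barycentric weights (1 - sqrt(u1), sqrt(u1) * (1 - u2), sqrt(u1) * u2)
    // sum to one and give every point of the triangle equal density.
    Vec3 samplePointOnTriangle(const Vec3& a, const Vec3& b, const Vec3& c) {
        double s = std::sqrt(uniformRandom());
        double t = uniformRandom();
        return (1.0 - s) * a + (s * (1.0 - t)) * b + (s * t) * c;
    }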



Week Five

I'm very confused now. I thought I had implemented shadow rays correctly, but now I have reason to believe I totally missed a concept somewhere. The upper corners of the walls in my pictures are no longer shaded, and I believe they're supposed to be, which means I am missing something. I have been generating images and trying different things to get my images to look better, but I haven't had any luck. I will talk with Steve on Monday and hopefully he'll be able to clear up some of the confusion.



Week Six

I finally got my shadow rays and glass to work correctly. One problem I was having with the shadow rays was that the display on this computer has only 8 bits for color, which means it supports only 256 colors, while my program was written to produce many more colors than that. Steve showed me an application that was better at choosing which display colors to use than the one I had been using. Although I still only have 256 colors, I am now able to see much better what my images actually look like. I was also reflecting rays in the incorrect direction from inside my glass sphere (a sketch of the fix is below). I think everything is finally working. I looked a bit more at photon mapping; the next step is to get it implemented and working.
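For the glass bug, the fix came down to orienting the normal before reflecting. Here is a minimal C++ sketch of the mirror-reflection formula with that check; Vec3 and its helpers are hypothetical stand-ins rather than my actual code.

    struct Vec3 { double x, y, z; };

    double dot(const Vec3& u, const Vec3& v) { return u.x * v.x + u.y * v.y + u.z * v.z; }
    Vec3 operator-(const Vec3& u, const Vec3& v) { return {u.x - v.x, u.y - v.y, u.z - v.z}; }
    Vec3 operator*(double s, const Vec3& v) { return {s * v.x, s * v.y, s * v.z}; }

    // Reflect direction d about normal n: r = d - 2 (d . n) n.
    // A ray traveling inside the sphere sees the stored surface normal
    // pointing away from it; without the flip, the reflected ray heads
    // off in the wrong direction, which is the kind of bug I had.
    Vec3 reflect(const Vec3& d, Vec3 n) {
        if (dot(d, n) > 0.0)      // ray and normal on the same side
            n = -1.0 * n;         // means we're inside, so flip n
        return d - 2.0 * dot(d, n) * n;
    }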



Week Seven

I implemented photon mapping. It mostly works, but there are still some behaviors I don't quite understand. I also haven't put a full globally illuminated image together yet, because I'm still testing and trying to figure out why my photon maps are dim: my ray traced direct lighting is fairly bright compared to the direct lighting in my photon map. The photon map will only be used for indirect lighting, not direct lighting, but it definitely should be brighter. The caustics in the photon map do look pretty good.
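For reference, here is roughly what the radiance estimate from a photon map looks like, following Jensen's book. One thing I plan to check is the scaling: each stored photon's power should be the light's power divided by the number of photons emitted (not the number stored), and the sum gets divided by the area of the disc the nearest photons cover. This C++ sketch is an illustration under those assumptions, not my actual code; the Photon type and the neighbor query are hypothetical.

    #include <vector>

    const double kPi = 3.14159265358979;

    struct Photon {
        double power[3];   // flux carried by this photon, per color channel
        // position, incoming direction, etc. omitted from this sketch
    };

    // Reflected radiance L at a diffuse surface point with reflectance rho,
    // estimated from the nearest photons found within search radius r.
    void radianceEstimate(const std::vector<Photon>& nearest, double r,
                          const double rho[3], double L[3]) {
        const double discArea = kPi * r * r;           // area the photons cover
        for (int c = 0; c < 3; ++c) L[c] = 0.0;
        for (const Photon& p : nearest)
            for (int c = 0; c < 3; ++c)
                L[c] += (rho[c] / kPi) * p.power[c];   // diffuse BRDF = rho / pi
        for (int c = 0; c < 3; ++c) L[c] /= discArea;
    }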



Week Eight

I attended SIGGRAPH!!!

Below I've given a summary of my activities for the week.

Monday, July 28

Tuesday, July 29

Wednesday, July 30

Thursday, July 31

Conclusion: I really enjoyed this conference. Some of the highlights were the Finding Nemo session, the Water, Water Everywhere sketch, the walking teapot, the Electronic Theater, and the Ray Tracing course. Those were the things I loved the most!!! I was also very happy that there was a Subway around for lunch. I think sandwiches from Subway are very good! It was a very nice conference and I'm very glad that I had the opportunity to attend it.



Week Nine

I met with Henrik on Monday and he set up an account for me in the graphics lab. There are graduate students in the lab I can talk to and ask for help if I need it. It's really great. My photon mapping pretty much works now. I changed my method of sampling, and my pictures look a lot better because of that. I also stumbled across another way of generating the random rays, so now I have two new methods of generating random rays. The images they produce look about the same, but I still need to see which way is more efficient. I think it might also be a good idea to see how others generate their random rays, because I don't think the books I've read give a clear picture of how to do it: they tell how to get random rays in a sphere or a hemisphere, but the idea is to generate rays in the hemisphere around an object's surface normal. (A sketch of one simple way to do this is below.)
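For reference, here is a minimal C++ sketch of one simple way to do it: rejection-sample the unit ball, normalize to get a uniform direction on the sphere, and mirror any direction that falls below the surface. This is just an illustration, not necessarily either of my two methods; Vec3, dot(), and uniformRandom() are hypothetical stand-ins.

    #include <cmath>
    #include <cstdlib>

    struct Vec3 { double x, y, z; };

    double dot(const Vec3& u, const Vec3& v) { return u.x * v.x + u.y * v.y + u.z * v.z; }
    double uniformRandom() { return std::rand() / (RAND_MAX + 1.0); }

    // Uniformly distributed direction in the hemisphere around normal n.
    Vec3 sampleHemisphere(const Vec3& n) {
        Vec3 d;
        double len2;
        do {                                       // sample the cube [-1,1]^3,
            d = { 2.0 * uniformRandom() - 1.0,     // keeping only points inside
                  2.0 * uniformRandom() - 1.0,     // the unit ball, so the
                  2.0 * uniformRandom() - 1.0 };   // direction is unbiased
            len2 = dot(d, d);
        } while (len2 > 1.0 || len2 < 1e-12);
        double inv = 1.0 / std::sqrt(len2);
        d = { d.x * inv, d.y * inv, d.z * inv };
        if (dot(d, n) < 0.0)                       // below the surface:
            d = { -d.x, -d.y, -d.z };              // mirror into the upper half
        return d;
    }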



Week Ten

I generated images for my final report, for path tracing as well as for photon mapping. The images do look different, but I understand why: path tracing treats the light as an object that can contribute to indirect lighting, while the photon mapping I've implemented does not. Photons are shot out from the light, but the light itself is not included as an object in the scene.