In order to fill you in on what research is like, I have kept a journal
that will hopefully give you an idea of what I have been doing...ENJOY!!!
Week 1
So, this concludes my first week as an undergraduate researcher. I met a
lot of new people and they were all very helpful. I spent most of my time
reading articles that are relevant to my project and getting used to working
with the people in the lab. As of right now, it looks as though I may have
a lot more to research before I get started designing my web testing tool.
Week 2
I found some really good articles this week concerning web testing
and also found quite a few descriptions of client and server-side
testing tools that are already available. Right now, I am trying to get a
better understanding of the characteristics of the web, web applications
and the different techniques that can be exploited for testing web
applications.
Week 3
This week I found and read more articles focusing specifically
on those that discussed structural testing of web applications and the
dynamic web. I have found so much information on web testing and there is
still more to find. I organized my information by making a broad list of all
the different aspects of the web (including: the functions of the client/server
and how they interact when dealing with dynamic pages as opposed to static;
characteristics and content of the pages; and the languages used, etc.) and I
noted what others have been/are testing and the techniques that they used.
Week 4
I found an article this week that described a project similar to my
own and searched for some of the papers that were referenced. From there
I was able to get quite a few articles that discussed testing goals and
plans. However, I was unable to locate or retrieve some of the articles. I
e-mailed a few professors/authors to see if they were willing to send me a
copy of them. I guess we'll see what happens. A few of the articles discussed
proposals, but only two of the articles seemed to focus on actual tools that
were being designed to white-box test web applications. They outlined some of
the issues involved with web testing and possible solutions to them. We decided
at our last meeting that our focus, for now, is to get a better idea of what
has and has not been done, and to find a way to automate a test case generator
that can handle different types of pages and whose output would serve as the
inputs for our testing tool.
Week 5
So far, I have been taking notes on all of the articles that are related
to our project (Most of the articles that I have found discuss automating test
case generation, and goals/approaches for white box testing the web). I am
still in the process of finding more information and discovering how big the
"world" of web testing is. It doesn't seem like many people focus their
attention on structurally testing the web. Most of the tools that are
available focus on load/stress testing the server, hyperlink checks, code
validators, functionality tests and regression tests. Most of the articles
that aim at structural testing of web applications only discuss ideas on how
to test an application. This week I am sort of putting together all of the
information that I found from ideas to issues and I plan on working with Lori
and Amie to narrow things down or find out what our next goal is going to be.
This week I also found a web crawler created by Robert Miller as a project at
Carnegie Mellon University. When given a URL, the crawler creates a model of
the web application. I have been trying to use it to figure out what it can
do. This may prove to be useful for finding specific pages (pages with forms,
frames, etc.) for our testing.
Week 6
This week I made a BibTeX file and referenced all the articles that I
found concerning web testing and gave Amie and Lori a copy of all of the
articles that I referenced. I also started writing information and ideas that
I got from the articles and our discussions concerning this project in a LaTeX
file. This week I began to write a C++ program that will hopefully help us
come up with some statistics concerning the structure of a web page. This
program is basically a counting program that takes in a file that has been
downloaded to my home directory and counts the number of loops, conditionals,
and functions within the source code, and determines whether the document is a
form or contains an embedded script.
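As an illustration, a counting program along these lines might look like the following Python sketch (the original was written in C++, and the keyword patterns here are assumptions rather than the actual ones used):

```python
import re

def page_stats(path):
    """Count rough structural features of a downloaded page's source.

    Illustrative sketch only -- the real program was written in C++,
    and these keyword lists are assumptions, not the original's.
    """
    with open(path) as f:
        src = f.read()
    return {
        # crude keyword counts over the raw source text
        "loops": len(re.findall(r"\b(for|while)\b", src)),
        "conditionals": len(re.findall(r"\b(if|elsif|elif)\b", src)),
        "functions": len(re.findall(r"\b(function|sub|def)\b", src)),
        # does the document contain a form or an embedded script?
        "has_form": bool(re.search(r"<form\b", src, re.IGNORECASE)),
        "has_script": bool(re.search(r"<script\b", src, re.IGNORECASE)),
    }
```

A keyword scan like this is deliberately simple; it over- and under-counts in corner cases (strings, comments), which is part of what makes multi-language tooling tricky.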
Week 7
This week, I found three Perl scripts online that would work well with
my counter program. When given a URL, network.pl (which establishes a
connection with the server), www.pl (which retrieves the requested page), and
webget.pl (which prints the source code to the screen) work together to gather the
source code of a specific web page. I changed the webget.pl program so that
instead of printing the source code to the screen, it would open an output
file and write the code to that file. This way I could use these programs to
get the source code and then use my program to open the output file and
examine the code. So, in order to make things easier, I started rewriting in
Perl the program that I had previously written in C++.
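A minimal Python sketch of this fetch-and-save step (the original used the three Perl scripts above; the function name here is hypothetical):

```python
from urllib.request import urlopen

def webget(url, out_path):
    """Fetch a page's source and write it to a file instead of stdout.

    A sketch of what the modified webget.pl does; the original Perl
    chain (network.pl -> www.pl -> webget.pl) handled the connection,
    retrieval, and output steps separately.
    """
    with urlopen(url) as resp:
        source = resp.read().decode("utf-8", errors="replace")
    # write the source to a file so a later program can open and examine it
    with open(out_path, "w") as out:
        out.write(source)
    return out_path
```

Writing to a file rather than the screen is what lets the counter program run as a separate second stage over the saved source.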
Week 8
Most of the Perl program that I was working on last week is now written;
together with the other programs that I found online, we hope to be able to
gather statistics. My next plan is to extend the program that I wrote so that
it recognizes declarations and keeps a list of the variable names that were
declared and the line that they were declared on. There is one problem that
is making this part fairly complex. Some variables are declared implicitly and
are not given data types. This makes it difficult to find all of the
declarations throughout a program. So far, the program is able to do this for
variables declared with var or Dim (as in JavaScript, VBScript, and ASP pages),
but I am still working on other kinds of declarations. I also spent a lot of
time writing bits and pieces of the paper concerning this project in the LaTeX
file that I created the other week. I organized the file by putting some notes
of mine into sections. So far, I have written an Introduction, a Related Work
section, and an Approach section.
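The var/Dim declaration tracking might be sketched like this in Python (an illustration of the idea, not the actual Perl code; the regular expression is an assumption):

```python
import re

# Only handles explicit `var` (JavaScript) and `Dim` (VBScript/ASP)
# declarations; implicitly declared variables with no keyword are the
# open problem described above.
DECL_RE = re.compile(r"^\s*(?:var|Dim)\s+([A-Za-z_]\w*)", re.IGNORECASE)

def find_declarations(src):
    """Return a list of (variable name, line number) pairs."""
    decls = []
    for lineno, line in enumerate(src.splitlines(), start=1):
        m = DECL_RE.match(line)
        if m:
            decls.append((m.group(1), lineno))
    return decls
```

An implicitly declared variable (say, a bare `y = 2`) has no keyword to anchor a pattern on, which is exactly why those declarations are harder to find.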
Week 9
This week I worked on gathering sample programs or scripts written in
different languages and using my program to gather statistics on them. I am
analyzing the programs manually and then using my program to make sure that it
works correctly. One problem that I am facing while writing this program is
figuring out a way for the program to recognize comments in different
languages. For instance, in Perl and Python, # is used to mark a comment; in
CSS (Cascading Style Sheets), however, # introduces an ID selector that tells
the browser to style a specific element in a certain way. I decided to fix
this problem by extending the capabilities of my program to include
recognizing the different languages.
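One way to make the counter language-aware, sketched in Python (the extension-to-marker table here is an assumption for illustration):

```python
import os

# Assumed mapping from file extension to line-comment marker; the point
# is that "#" means a comment in Perl or Python but not in CSS, where it
# begins an ID selector.
LINE_COMMENT = {
    ".pl": "#",
    ".py": "#",
    ".js": "//",
    ".css": None,   # CSS has only /* ... */ block comments
}

def is_line_comment(path, line):
    """Decide whether a line is a comment, given the file's language."""
    marker = LINE_COMMENT.get(os.path.splitext(path)[1])
    return marker is not None and line.lstrip().startswith(marker)
```

Dispatching on the language first, then applying that language's comment rules, avoids miscounting a CSS rule like `#header { ... }` as a comment.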
Week 10
So, this is it. My last week of summer research. I definitely learned a
lot this summer. I was a little unsure in the beginning about starting a new
project, but it turned out to be a lot of fun and a great learning experience.
It just seemed like there was so much information and it was everywhere.
I actually enjoyed reading and dissecting articles. I'm also glad I got a
chance to program a little this summer and that I learned how to program
in Perl. The program that I wrote brought to our attention some complications
involved in creating a tool that handles multiple languages, complications
that might otherwise have gone unnoticed. Jean, another CRA-W undergrad who
worked here for the summer, was nice enough to share with us some of her
programs for an online auction. With these programs, we will be able to see
the files that remain behind the scenes on the server (Java servlets) as well
as the client pages. This week I used Jean's files to get a better
understanding of how her client-side files work with her server-side files and
the information residing in the database. This week, I also worked on my
program and finished up my webpage/final report.
Click here to see my final report