MY LOG CONTINUED


June 19th

I began looking through my books for algorithms. I looked at the classical SOM algorithm and then examined how it could be changed. The architecture remains the same, but the algorithm can differ; mainly, the training process differs between algorithms. It either uses Euclidean distance to measure the difference between an input and each unit's weights, or it uses something else, such as the dot product when working with matrices.
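To make the two similarity measures concrete, here is a small Python/NumPy sketch of picking the winning unit both ways (my own illustration; the sizes and values are made up):

    import numpy as np

    def bmu_euclidean(x, weights):
        """Best-matching unit: smallest Euclidean distance to the input."""
        dists = np.linalg.norm(weights - x, axis=1)
        return np.argmin(dists)

    def bmu_dot_product(x, weights):
        """Best-matching unit: largest dot product with the input.
        With unit-length weight vectors this ranks units the same way
        as Euclidean distance, since ||w - x||^2 = const - 2 w.x."""
        return np.argmax(weights @ x)

    # toy example: 4 units, 3-dimensional inputs
    rng = np.random.default_rng(0)
    weights = rng.random((4, 3))
    x = rng.random(3)
    print(bmu_euclidean(x, weights), bmu_dot_product(x, weights))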


June 20th

I began looking for references. I searched the Internet and found a lot on the classic Kohonen Self-Organizing Map algorithm. I then found some more information on alternative algorithms: the Bayesian algorithm and the EM algorithm. These two seem to be the most widely used, and the EM algorithm also appears inside other SOM algorithms.
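Since the EM algorithm keeps coming up, here is a toy sketch of the generic expectation-maximization loop on a simple mixture problem, just to remind myself how the E-step and M-step alternate (my own Python illustration, not any of the SOM-specific variants):

    import numpy as np

    # Fit a two-component 1-D Gaussian mixture by EM; the data and
    # starting values are made up for illustration.
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

    mu = np.array([-1.0, 1.0])        # initial means
    sigma = np.array([1.0, 1.0])      # initial std devs
    pi = np.array([0.5, 0.5])         # initial mixing weights

    for _ in range(50):
        # E-step: posterior responsibility of each component for each point
        dens = (pi * np.exp(-(data[:, None] - mu) ** 2 / (2 * sigma ** 2))
                / (sigma * np.sqrt(2 * np.pi)))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nk = resp.sum(axis=0)
        mu = (resp * data[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(data)

    print(mu, sigma, pi)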

June 21st

Today I looked through my neural network book to study the architecture and algorithm of the classic SOM. The architecture is a little more advanced than the Adaptive Resonance Theory architecture. However, all of the neural network algorithms start out the same way, by initializing weights and parameters. I also compared my book's SOM algorithm to the versions in other books and on websites.
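As a concrete picture of that shared first step, here is a sketch of initializing a SOM's weights and training parameters (the map size, input dimension, and values are my own illustrative choices, not from the book):

    import numpy as np

    rng = np.random.default_rng(42)
    map_rows, map_cols, input_dim = 10, 10, 3

    # weights: one vector per unit, small random values
    weights = rng.random((map_rows, map_cols, input_dim))

    # typical training parameters
    learning_rate = 0.5                    # initial step size, decays over training
    radius = max(map_rows, map_cols) / 2   # initial neighborhood radius
    n_iterations = 1000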


June 22nd

Today is Saturday, so I did my laundry and waited for my mom to come visit me. We went to Fall River and New Bedford, where we just walked around and then went to a restaurant to eat. I started to look around to see if there is anything interesting going on for the Fourth of July. I love watching fireworks, so I was hoping there was a place where my friends and I could go to see them.



June 23rd

Today is Sunday. I didn't have any local channels for a whole week. (Yes, I know, what a tragedy.) My father taped my favorite soap opera, Passions, for me, so I watched all five hours of it.

June 24th

I continued working on my paper today. I started looking at code just to see how the algorithm's output is shown graphically, and found a lot of applets that simulate the SOM. I also found other algorithms: an SOM algorithm that uses fuzzy logic to train the network, and another called the Growing Hierarchical SOM, a layered SOM that adds units and new layers where the data calls for more detail.
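To see what showing the output graphically amounts to, here is a self-contained sketch of the classic SOM trained on random colors, with the trained weight grid displayed directly as an image, the way many of those applets do (NumPy/Matplotlib; all parameter values are my own guesses, not from any particular applet):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    rows, cols, dim = 20, 20, 3
    weights = rng.random((rows, cols, dim))
    data = rng.random((500, dim))          # random RGB colors as training data

    # grid coordinates of every unit, used by the neighborhood function
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)

    n_iter, lr0, rad0 = 2000, 0.5, max(rows, cols) / 2
    for t in range(n_iter):
        lr = lr0 * np.exp(-t / n_iter)                       # decaying step size
        rad = rad0 * np.exp(-t / (n_iter / np.log(rad0)))    # shrinking radius
        x = data[rng.integers(len(data))]
        # winner: unit with the smallest Euclidean distance to the input
        d = np.linalg.norm(weights - x, axis=-1)
        win = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighborhood around the winner on the grid
        grid_d2 = ((coords - np.array(win)) ** 2).sum(axis=-1)
        h = np.exp(-grid_d2 / (2 * rad ** 2))
        weights += lr * h[..., None] * (x - weights)

    plt.imshow(weights)   # each unit drawn as its learned color
    plt.title("SOM weights after training")
    plt.show()

After training, nearby units end up with similar colors, which is exactly the topological ordering the applets demonstrate.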

June 25th

I found a few more algorithms today, related in architecture. Soft topographic mapping SOMs, as they are called, are used to build maps of high-dimensional data. There are three distinct algorithms that use soft SOMs. One, which is the backbone for the other two, is known as Soft Topographic Vector Quantization. It uses an EM algorithm together with deterministic annealing, a technique used to train the network. The other two are known as Kernel-Based Soft Topographic Mapping and Soft Topographic Mapping for Proximity Data. The kernel-based version uses a kernel function in the initialization process and then goes on to use an EM algorithm. The proximity-data version focuses on data that is given not as vectors but as pairwise proximity matrices.
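Here is a heavily simplified toy rendering of the deterministic-annealing idea behind Soft Topographic Vector Quantization: every point is softly assigned to every unit, neighboring units share responsibility, and the temperature is lowered so the assignments gradually harden. This is my own sketch of the idea, not the published algorithm; the sizes and the annealing schedule are made up.

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.random((200, 2))            # toy 2-D data
    n_units = 10
    units = rng.random((n_units, 2))       # units arranged on a 1-D chain

    # neighborhood coupling between units along the chain
    idx = np.arange(n_units)
    h = np.exp(-((idx[:, None] - idx) ** 2) / 2.0)
    h /= h.sum(axis=1, keepdims=True)

    beta = 0.1                             # inverse temperature
    for step in range(100):
        # E-step: soft assignment of each point to each unit; a unit's
        # effective cost mixes in its neighbors' squared distances
        d2 = ((data[:, None, :] - units) ** 2).sum(axis=-1)  # (points, units)
        cost = d2 @ h.T
        p = np.exp(-beta * (cost - cost.min(axis=1, keepdims=True)))
        p /= p.sum(axis=1, keepdims=True)
        # M-step: each unit moves to its neighborhood-smeared weighted mean
        w = p @ h
        units = (w.T @ data) / w.sum(axis=0)[:, None]
        beta *= 1.05                       # anneal: sharpen the assignments

    print(units)

At low beta every unit is pulled toward the overall mean; as beta grows, the assignments sharpen and the solution approaches ordinary (hard) vector quantization with a topographic ordering.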

June 26th

Today I started to put the entire paper together. I tried to make the paper flow from algorithm to algorithm. It works out well because each new algorithm that has emerged stems from a previous algorithm.

June 27th

Today I finished writing up the algorithms for each of the SOMs. I then started to look at what each algorithm is suitable for. Some of the algorithms are similar in application. For instance, the Bayesian and Growing Hierarchical algorithms are both used in classifying genes and tissues; however, the Growing Hierarchical algorithm has proven to be more accurate and efficient. Other algorithms are used for organizing articles in document collections. I began thinking that maybe these two algorithms should be used together; perhaps that would allow for an efficient, fast-converging algorithm.

                                       