June 11 - June 15

Amy came back this week, and we met with Greg, a machine learning expert. Amy explained our problem to him and asked for his opinion on using a linear dynamical system (LDS). He said that it does make sense, but an LDS is not necessarily a good model, since it makes untested assumptions. The only way to know is to fit the model and see if it performs well.

We spent the earlier part of the week trying to think of new ideas. In the process, we came across the supplier-side and customer-side models of Deep Maize, the University of Michigan's TAC SCM agent. Aysun and I found and read a paper about how Deep Maize makes its predictions; at the heart of them appears to be a k-nearest neighbor (k-NN) algorithm. After I explained our current supplier-side model to Amy, she said that it is already a nearest-neighbor algorithm. I agree, but I think it is only 1-NN and does not exploit all the possible sources of information (there are only a few features), so implementing Deep Maize's model might improve our predictions. However, Amy seemed to want to try something else for our agent, which makes sense if we want to publish a paper about our work.
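For context, the difference between 1-NN and k-NN is that the latter averages the outcomes of the k most similar past situations instead of copying the single closest one. Here is a minimal sketch of the idea; the feature choices and the data are my own illustration, not Deep Maize's actual model or ours:

```python
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, price) pairs; query: a feature vector.
    Returns the average price over the k nearest training points."""
    by_distance = sorted(train, key=lambda fp: math.dist(fp[0], query))
    nearest = by_distance[:k]
    return sum(price for _, price in nearest) / len(nearest)

# Hypothetical past offers: features might be (day, quantity, lead time).
train = [((1, 10, 5), 100.0), ((2, 12, 5), 105.0),
         ((8, 10, 6), 90.0), ((9, 11, 7), 88.0)]
print(knn_predict(train, (2, 11, 5), k=2))  # → 102.5, the mean of the two closest
```

With k=1 this degenerates to copying the single nearest past price, which is roughly what our current supplier-side model does.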

We were going to delve deeper into LDSs, but Amy came up with a different idea for predicting computer prices. She wanted us to implement a perceptron that learns computer prices by looking at whether past offers issued by the SCM agent were won or lost. We came up with some features for the perceptron, such as the corresponding RFQ's properties and current market conditions. Aysun did not remember how perceptrons worked, while I had learned about them recently; a brief reminder from Amy's lecture notes, along with my trusty notes from Intro to AI, refreshed my memory. So she read up on perceptrons while I worked on implementing another idea Amy and I had for predicting component prices.
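The scheme is standard mistake-driven perceptron learning on won/lost labels. A minimal sketch, with a made-up single feature standing in for our real ones (the RFQ's properties and market conditions):

```python
def perceptron_train(samples, epochs=20, lr=0.1):
    """samples: list of (features, label) pairs with label in {+1, -1}.
    Returns a weight vector whose last entry is the bias."""
    w = [0.0] * (len(samples[0][0]) + 1)
    for _ in range(epochs):
        for x, y in samples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            pred = 1 if activation >= 0 else -1
            if pred != y:  # update weights only on a mistake
                for i, xi in enumerate(x):
                    w[i] += lr * y * xi
                w[-1] += lr * y
    return w

# Hypothetical data: one feature (normalized offer price);
# cheap offers were won (+1), expensive ones were lost (-1).
samples = [((0.2,), 1), ((0.3,), 1), ((0.8,), -1), ((0.9,), -1)]
w = perceptron_train(samples)
```

A predicted win probability as a function of price is then read off from which side of the learned boundary a candidate offer falls on.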

Our idea was to save data from past games played against the same set of competitors (in the prediction challenge, the SCM agent for whom we make predictions plays a number of games against each of several different sets of competitors). I implemented this and predicted the price as a weighted average of the past games' data and the current data (from today's offers). It did not perform much better than the original method (see last week's entry), and it avoided doing worse only when the past data was given very little weight. The predictions seemed to improve slightly as more games were played (although we only had 5 games on which to test), so perhaps data from past games gives more of an advantage in later games against the same competitors.
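The blending itself is just a convex combination. A sketch with an illustrative weight (the key empirical finding was that only a small past-game weight avoided hurting the predictions):

```python
def blended_price(past_game_prices, todays_price, alpha=0.1):
    """alpha weights the historical average from past games against the
    same competitors; (1 - alpha) weights today's estimate."""
    past_avg = sum(past_game_prices) / len(past_game_prices)
    return alpha * past_avg + (1 - alpha) * todays_price

# e.g. 0.1 * mean(10, 12) + 0.9 * 9.0
print(blended_price([10.0, 12.0], 9.0))
```

Setting alpha = 0 recovers the original method; our tests suggested alpha should stay close to 0, perhaps growing slowly as more games against the same opponents accumulate.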

Finally, we implemented the perceptron. I thought I might use a perceptron for predicting component prices too (Amy thinks there is not enough data to learn anything, but I think the data is of higher quality in this case), so I wrote a general perceptron that Aysun and I could both use. I also implemented some mathematical vector functions to make things easier. Aysun wrote the actual computer price prediction code. After she debugged that for a while, there were still some problems, so I read her code and started debugging with her. She had to leave early on Friday evening, but I stayed in the lab until 10:30 p.m., driven by my addiction to elegant, well-designed, and functional code.

I fixed a bug that had been introduced when Amy transformed the problem to fit the perceptron in a way that turned out to be unnecessary, and I refactored the code so that it was cleaner and more object-oriented. However, the weights were being adjusted too much, and I could not think of a way around this. Making the initial learning rate small and decaying it was not enough: we were caught in a positive feedback loop in which larger weight adjustments produced larger errors, which in turn produced even larger adjustments. The weights eventually overflowed, resulting in lovely sequences of "NaN"s and "Infinity"s.
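A toy illustration of that feedback loop (not our actual predictor): with a delta-rule-style update on a single weight, a step size that is too large relative to the feature scale makes each correction overshoot, so the error grows geometrically until the weight overflows.

```python
def fit_one_weight(x, y, lr, steps=600):
    """Gradient steps on the squared error (w*x - y)**2 for one sample."""
    w = 0.0
    for _ in range(steps):
        error = w * x - y
        w -= lr * error * x  # overshoots whenever lr * x*x > 2
    return w

print(fit_one_weight(x=10.0, y=5.0, lr=0.05))   # diverges; overflows into inf/nan
print(fit_one_weight(x=10.0, y=5.0, lr=0.005))  # converges to y / x = 0.5
```

This also hints at why decaying the rate alone did not save us: once the errors start growing faster than the rate shrinks, the loop still runs away; rescaling the features (or clipping the updates) changes the geometry rather than just the step size.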

While the philosophy of computer number representation is fascinating, more mundane needs overcame me in the end and I left the lab in search of lightbulbs for my desk lamp. I discovered a vibrant summer nightlife at Brown in the process.
