This final week was mainly about tying up loose ends in the project. We documented all our work, updated the README, heavily commented the code, and fixed or added minor details. There was also a poster session held Wednesday where we and the other undergraduate researchers presented posters on our summer research. Our project won the "Best Project Award". We were also busy this week writing our final technical paper. It has been a great summer, and I am happy with what we accomplished!
This week I continued working on a text classifier for the Builder responses. The classification RNN model is now finished: it can aid the Builder by predicting the type of message the Builder should respond with, and it serves as a metric for checking the precision and recall of the type of utterance the Builder should have generated.
I started work on a text classifier for the Builder. After analyzing the trials, I decided the Builder responses can each be assigned to one of 8 mutually exclusive response types (i.e., dialogue-acts): Greeting, Verification Question, Clarification Question, Suggestion, Extrapolation, Display of Understanding, Build Materials Update, and Chit-Chat/Other. I have been hand-labeling our trials with these classes (a tedious process). I then wrote and ran a text-classification script on the data, which shows great promise that the classes can be predicted with high accuracy. Having shown this, we decided to proceed with creating an RNN (recurrent neural network) that uses deep learning to classify the dialogue-acts and can be incorporated into the Builder AI.
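To give a rough idea of what an RNN dialogue-act classifier like this looks like, here is a minimal PyTorch sketch: embed the utterance tokens, run them through a GRU, and classify the final hidden state into the 8 classes. All the names, sizes, and the GRU choice are my own assumptions for illustration, not the project's actual model.

```python
import torch
import torch.nn as nn

# The 8 mutually exclusive dialogue-act classes described above.
DIALOGUE_ACTS = [
    "Greeting", "Verification Question", "Clarification Question",
    "Suggestion", "Extrapolation", "Display of Understanding",
    "Build Materials Update", "Chit-Chat/Other",
]

class DialogueActClassifier(nn.Module):
    """Embed tokens, run a GRU over the utterance, classify the last hidden state."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64,
                 num_classes=len(DIALOGUE_ACTS)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        _, hidden = self.rnn(embedded)     # hidden: (1, batch, hidden_dim)
        return self.out(hidden.squeeze(0)) # (batch, num_classes) logits

model = DialogueActClassifier(vocab_size=100)
logits = model(torch.randint(0, 100, (4, 10)))  # 4 utterances, 10 tokens each
print(logits.shape)  # torch.Size([4, 8])
```

Training such a model with cross-entropy loss on the hand-labeled trials would then let us report per-class precision and recall.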
This week I added functionality to our data-loader to keep track of block weights when the Builder adds or removes blocks. The weight system tracks the 5 most recently placed blocks. This will be useful in the Builder-Action-Prediction model we will create later: the Builder can make better predictions about where to build next by knowing which blocks were placed most recently. I also added data-loader functionality for generating CNN-readable grid worlds, so the CNN can "see" the current state of the Minecraft world (where blocks are placed). This will also be incorporated into the Builder-Action-Prediction model.
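A small sketch of both ideas, with all names, the weighting scheme, and the grid layout being my own assumptions: a tracker that keeps the 5 most recent block placements with recency weights, and a helper that turns a block list into an occupancy grid a CNN can consume.

```python
from collections import deque
import numpy as np

class BlockWeightTracker:
    """Keep recency weights for the 5 most recently placed blocks (a sketch)."""

    def __init__(self, max_recent=5):
        # deque with maxlen silently drops the oldest entry when full.
        self.recent = deque(maxlen=max_recent)

    def place(self, coord):
        self.recent.append(coord)

    def remove(self, coord):
        # If the Builder removes a tracked block, drop it from the history.
        if coord in self.recent:
            self.recent.remove(coord)

    def weights(self):
        # Oldest tracked block gets weight 1, most recent gets the highest.
        return {coord: i + 1 for i, coord in enumerate(self.recent)}

def to_grid(blocks, world_shape=(11, 11, 9)):
    """One-hot occupancy grid the CNN can 'see' (axis order assumed)."""
    grid = np.zeros(world_shape, dtype=np.float32)
    for x, y, z in blocks:
        grid[x, y, z] = 1.0
    return grid

tracker = BlockWeightTracker()
for c in [(0, 0, 0), (1, 0, 0), (2, 0, 0)]:
    tracker.place(c)
print(tracker.weights())  # {(0, 0, 0): 1, (1, 0, 0): 2, (2, 0, 0): 3}
```

The weight dictionary could then be stacked as an extra input channel alongside the occupancy grid when feeding the prediction model.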
Week 6 was a little slow. After the important milestone of creating a successful cnn last week, we have transitioned to training the rnn (recurrent neural network) to handle the Builder utterances. We are just having trouble getting a footing on this new aspect of the project. Hopefully next week won't be so slow. Fourth of July was nice here; I saw the fireworks show at Memorial Stadium :)
Our cnn has been very successful. This week we implemented loss monitoring so we know when to halt training and avoid over-fitting. We also set up a validation set that we evaluate on before running on our test set. We have been able to load the trained model back in and have it make predictions on the data collected from the experiments.
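The halt-on-validation-loss idea is essentially early stopping. A minimal PyTorch sketch of that loop, with all function names, the patience value, and the in-memory checkpointing being my own assumptions rather than the project's actual code:

```python
import torch

def train_with_early_stopping(model, optimizer, loss_fn,
                              train_batches, val_batches,
                              max_epochs=100, patience=5):
    """Train until validation loss stops improving for `patience` epochs."""
    best_val = float("inf")
    best_state = None
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        model.train()
        for inputs, targets in train_batches:
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), targets)
            loss.backward()
            optimizer.step()
        # Evaluate on the held-out validation set, not the test set.
        model.eval()
        with torch.no_grad():
            val_loss = sum(loss_fn(model(x), y).item() for x, y in val_batches)
        if val_loss < best_val:
            best_val = val_loss
            # Snapshot the best weights so we can restore them later.
            best_state = {k: v.clone() for k, v in model.state_dict().items()}
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # validation loss has stopped improving: likely over-fitting
    if best_state is not None:
        model.load_state_dict(best_state)  # restore best checkpoint
    return best_val
```

Saving the best state dict (here in memory; `torch.save` would persist it to disk) is also what makes it possible to load the model back in later for prediction, as described above.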
We have finally made some really great strides with our neural network. Its accuracy is quite high, so it can identify our simple shapes well. We are also starting to look at the natural language processing aspects of the project, and we have begun generating and formatting the data collected from the human trials conducted earlier. I'm feeling pretty good seeing that we are having some success.
Week 3 we continued our efforts from week 2. We created more shapes and data for the cnn (convolutional neural network) to use, and we began training the cnn. It is kind of a slow process; however, I am learning a lot about neural networks and best practices for training them.
Week 2 I got to dive deeper into the project's code. We created scripts that generate labeled shapes in a grid representation; this data is needed to train our cnn. Each shape exists in an 11x11x9 world and is represented as an array so our cnn can interpret the points. I also created a script that visualizes these shapes.
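As an illustration of this kind of shape-generation script, here is a sketch that places a hollow square on the ground layer of an 11x11x9 grid. The axis order, centering, and the specific shape are my own assumptions; the project's actual generators may differ.

```python
import numpy as np

def make_square(size=4, world_shape=(11, 11, 9)):
    """Generate a hollow square of blocks on the ground layer of the grid world.

    Axes assumed to be (x, z, height); occupied cells are 1, empty cells 0.
    """
    grid = np.zeros(world_shape, dtype=np.int8)
    start = (world_shape[0] - size) // 2  # roughly center the shape
    for i in range(size):
        grid[start + i, start, 0] = 1                 # near side
        grid[start + i, start + size - 1, 0] = 1      # far side
        grid[start, start + i, 0] = 1                 # left side
        grid[start + size - 1, start + i, 0] = 1      # right side
    return grid

square = make_square()
print(int(square.sum()))  # 12 blocks: the perimeter of a hollow 4x4 square
```

A paired label (e.g., the string "square") alongside each generated grid is what turns these into training examples for the cnn.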
Week 1 was overwhelming. I met the team I would be working with: a rising senior named Charlotte who attends Vassar College, and a rising junior from the University of Minnesota, Morris. We learned about the project we were joining and spent most of the week familiarizing ourselves with neural networks, Python, and PyTorch. I am hopeful about how the project will progress.
My research project was focused on AI (artificial intelligence) and NLP (natural language processing). The goal of the project is to create Builder and Architect AIs that are able to communicate in order to create simple to complex block structures in Minecraft. The project entails creating a convolutional neural network (cnn) for the Builder and Architect to interpret the structures, and training a recurrent neural network (rnn) to generate Builder/Architect utterances so they can exchange instructions. The end goal is to integrate these networks together and then into Minecraft.