week 10 šŸŽ‰


This week was dedicated to fixing a few more bugs and wrapping up the whole research experience. We had several team meetings to discuss the current state of the project. We are still improving the results returned, and I am still adding a few small features and tweaks to the front-end. While we didn't get to conduct the user study yet, the engine should be ready to go next week, which is really wonderful. In addition, we talked about the future of the project, and how implementing the search engine successfully allows us to explore a lot of great questions and conduct multiple studies in the future. I'm extremely excited to continue working on this project during the school year!

week 9


The front-end is nearing completion! This week was mainly dedicated to fixing a couple of (annoying) bugs and fine-tuning other features. One issue was that infinite scroll would seemingly work only 90% of the time. Because the error didn't occur every single time, it took me a while to figure out what was wrong. I finally realized that after a user scrolled to the bottom of a page (completing infinite scroll), one of the states would basically say, 'infinite scroll is finished, no more items to display.' When loading the results for a subsequent query, I forgot to reset this state. Thus, infinite scroll wouldn't get triggered unless the page was refreshed. Another issue was with the dropdown/toggle that could change the search mode. If a user clicked on the toggle, then tried to click off of it (perhaps they changed their mind), the onChange function would still fire, reloading the page. I later fixed it so that clicking off the toggle would not fire any action, making it much more user friendly.
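To make the fix concrete, here is a minimal sketch of the infinite-scroll state bug (the state and function names are hypothetical, not our actual code). The key is resetting the "no more items" flag whenever a new query starts:

```jsx
import React from "react";

// Placeholder for the real API call (assumed helper, not the project's).
const fetchResults = (query, page) =>
  fetch(`/search?q=${encodeURIComponent(query)}&page=${page}`).then((r) => r.json());

class SearchResults extends React.Component {
  state = { results: [], hasMore: true, page: 0 };

  // Called whenever the user submits a new query.
  handleNewSearch = (query) => {
    // Resetting hasMore here was the fix: without it, a finished scroll
    // from the previous query blocked infinite scroll for the next one.
    this.setState({ results: [], hasMore: true, page: 0 }, () =>
      this.loadMore(query)
    );
  };

  loadMore = (query) => {
    fetchResults(query, this.state.page).then((items) => {
      this.setState((prev) => ({
        results: [...prev.results, ...items],
        page: prev.page + 1,
        hasMore: items.length > 0, // flip off once the server runs dry
      }));
    });
  };

  render() {
    return null; // rendering omitted in this sketch
  }
}

export default SearchResults;
```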

Alice headed home on Saturday, and we all got together for brunch at Destihl beforehand. It was very bittersweet :')

week 8


This week, I returned to working on front-end development. We decided to implement a couple of new features, including toggles for adjusting which results our engine returns and how we rank them. For search, we have two options (keyword-only and smart search), as well as for ranking (z-score and elastic). While implementing these toggles, I ran into a host of problems, the main one being that when I adjusted one toggle to 'true', it would automatically reset the other toggle to its default value rather than saving the state. After looking at my code, I realized that the big issue was that I was adjusting state locally throughout my code, aka in each individual component. As I added more and more features (and thus states), they became difficult to manage, and some components were not remembering the state changes in other components. While I pinpointed the area in which the bug was happening, I couldn't figure out exactly what it was. After reading a great article on lifting state, I decided to completely refactor my code so that I was only changing state in the App component (with functions like handleSearchToggle, handleRankToggle, etc.). Thus, the App component could act as the single source of truth. Although it took a bit of time, it made everything else a lot easier, such as implementing the functionality of the filter/style buttons on each product card. Towards the end of the week, I worked on implementing infinite scroll while Michael worked on autocomplete for the search bars.
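Roughly, the refactor looks like this (the child component below is illustrative, not our actual code; the handler names come straight from the App component):

```jsx
import React, { Component } from "react";

// Hypothetical child: it owns no state of its own, it just reports
// changes upward through the callback prop.
const ModeToggle = ({ label, options, value, onToggle }) => (
  <label>
    {label}{" "}
    <select value={value} onChange={(e) => onToggle(e.target.value)}>
      {options.map((o) => (
        <option key={o}>{o}</option>
      ))}
    </select>
  </label>
);

class App extends Component {
  // Single source of truth: both toggles live here, so flipping one
  // can no longer silently reset the other.
  state = { searchMode: "keyword-only", rankMode: "z-score" };

  handleSearchToggle = (searchMode) => this.setState({ searchMode });
  handleRankToggle = (rankMode) => this.setState({ rankMode });

  render() {
    return (
      <div>
        <ModeToggle
          label="Search"
          options={["keyword-only", "smart"]}
          value={this.state.searchMode}
          onToggle={this.handleSearchToggle}
        />
        <ModeToggle
          label="Ranking"
          options={["z-score", "elastic"]}
          value={this.state.rankMode}
          onToggle={this.handleRankToggle}
        />
      </div>
    );
  }
}

export default App;
```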

This week's (last!) REU lunch was about what roles graduate students and individuals with PhDs might hold in industry. It was pretty fascinating; many of the presenters talked about working at companies such as Google and Microsoft, but in a research role.

week 7


Over the weekend, we had two team meetings during which we reviewed code and discussed the progress on each of our parts. The front-end development is complete for the time being, and we decided to shift our focus to the actual fashion recommendation model itself and improving it. One idea that we would like to implement (eventually) is presenting the justifications on the product image, with arrows pointing to each corresponding feature, rather than simply having them listed below. Doing so would help create a stronger visual connection between the justification and product recommendation. It is a bit complicated, however, so we decided to save this particular feature for a later iteration. In the meantime, Alice (my fellow lab member) and I have started doing research on the ResNet and ResNeXt architectures specifically designed for image classification. Our grad student mentor, Yuan, suggested a couple of online courses that would be helpful. I spent the rest of the week working through the courses and quickly realized that I didn't have enough background in machine learning to fully understand everything in them. Thus, I decided to pause the course for a bit and instead Google the basics, such as "what are neural networks?", "what is convolution?", and "how does backpropagation work?". I created a document full of notes about these topics, and familiarized myself with Keras. Alice also recommended a Caltech ML course, which really helped bolster my understanding of the math behind backpropagation (the core identity is sketched at the end of this entry).

While I missed this week's REU lunch due to sickness, we did go to the REU-organized escape room outing on Friday! Alice, Yichen and I teamed up with a couple of other girls from Dr. Amato's lab and tried to conquer the "Wizard's Curse" room. In the end, we managed to escape with only 40 seconds left on the clock. It was a fantastic little break from work, and we'd love to escape from the "Artificial Intelligence" room with the rest of our lab another day!
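For reference, here is the chain-rule identity at the heart of backpropagation, in the standard notation most courses use (a sketch for context, not taken from my notes):

```latex
% Backpropagation in one line: the error gradient for a weight w_{ij}
% factors through the chain rule (sigma is the activation function,
% eta the learning rate).
\[
\frac{\partial E}{\partial w_{ij}}
  = \frac{\partial E}{\partial o_j}
    \cdot \frac{\partial o_j}{\partial \mathrm{net}_j}
    \cdot \frac{\partial \mathrm{net}_j}{\partial w_{ij}},
\qquad
\mathrm{net}_j = \sum_i w_{ij}\, o_i, \quad o_j = \sigma(\mathrm{net}_j)
\]
\[
\text{update: } w_{ij} \leftarrow w_{ij} - \eta\, \frac{\partial E}{\partial w_{ij}}
\]
```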

week 6


This week, I've been in the thick of working on the front-end development in React. I decided to switch from Material UI to Semantic UI, since it is more customizable and has simpler syntax. The coolest part of this week was definitely successfully making a call to the server and seeing all the appropriate information pop up in our results, product cards and more. I'm also learning about React Router, and I hope to implement that soon. The next challenge to tackle will be adding additional search filters on the click of a button. For example, when a user clicks "women", I need to filter the existing products by their targeted gender. Sometimes development does feel a bit slow, as I'm the only one on front-end and my other lab members are not quite familiar with React either. But again, learning how to work through these problems is a good experience and I'm enjoying it. Ranjitha (my mentor) came back from California this week, so we've had a lot of team meetings and are planning to have in-depth code reviews this weekend. Ranjitha also presented this week's REU lunch, and she covered how to develop your personal persona. It was a fantastic lunch (both in terms of food and content!), and it prompted me to think about my online presence, as well as how I introduce myself in emails.
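As a rough illustration of the gender filter (the field names here are assumptions, not our real data model):

```jsx
// Keep the full product list untouched and derive the visible list, so
// filters can be stacked or removed without losing data.
const filterByGender = (products, gender) =>
  products.filter(
    (p) => p.targetGender === gender || p.targetGender === "unisex"
  );

// e.g. in the button's click handler (hypothetical state shape):
// this.setState({ visible: filterByGender(this.state.allProducts, "women") });
```

Keeping the unfiltered list around in state (and filtering before render) also avoids re-fetching from the server whenever a filter is toggled off.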

week 5


This week was largely devoted to working on the UI in React. I decided to use a styling library called Material UI to help me with the creation. Through the process, I learned the importance of making reusable components (especially when you have tons of product cards), passing props between components, callback functions, and async/await. It is definitely still a challenge, but it is a fun one nonetheless. Seeing your designs transform into an actual, usable web application is pretty cool. Thursday was the Fourth of July, so we got a bit of a break! Towards the end of the week, I started learning about Axios and making AJAX requests to servers, so hopefully all the functionality will start coming together next week.
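For anyone curious, the basic Axios pattern I've been practicing looks like this (the endpoint and response shape are placeholders, not our actual server):

```jsx
import axios from "axios";

// async/await keeps the request readable: await pauses until the
// promise resolves, and any error funnels into the catch block.
async function fetchProducts(query) {
  try {
    const res = await axios.get("/api/search", { params: { q: query } });
    return res.data.products; // assumed response shape
  } catch (err) {
    console.error("Search request failed:", err);
    return [];
  }
}
```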

This week's luncheon featured Tianyin Xu and Sasa Misailovic, who talked about presentation and verbal communication skills. One thing they emphasized is presenting simple yet striking slides. The more words on a slide, the more they can dilute your presentation. On the other hand, images, graphs and short sentences can really bring the focus back to the presenters themselves and what they are talking about.

week 4


This week, I worked primarily on the UI design, learning Sketch in the process. While I've done graphics before in Illustrator, I quickly realized that wireframing and designing an entire interface is much different: there are so many more considerations. For example, one of my first iterations looked basically like a typical e-commerce site. While it looked okay, after talking to my team, I quickly realized that it didn't achieve the goals we had in mind. We wanted to present users with an improved search in terms of experience, not just design. In particular, for the study, we wanted to see whether displaying justifications next to product images would make a difference, versus just an image and no justification. But how do we achieve that without overcrowding the product page and making users read blocks of text? What visual cues can we provide so that users know they can search this, or click that? How do we create an effective visual hierarchy, encouraging users to focus on the justification rather than the picture, price and brand?

These are all questions that need to be addressed. To better do so, we decided to focus on the product cards only, rather than all of the pages. After many more iterations, I've come up with some new designs. These will likely continue to develop over the next few weeks. One mistake I definitely made was jumping straight to a prototype instead of basic wireframing. Since people are constantly coming up with new ideas, I have to constantly edit my designs, and all of the details (color, font, weight, etc.) make adjustments a lot more time-consuming than they would be with a bare-bones, black-and-white wireframe. Good things to keep in mind for the future!

In addition to working extensively on the UI, I've been learning JavaScript, Node.js and how web applications work. My original assignment was actually to learn React, but after jumping in last week, I quickly realized that I needed at least some background in JavaScript and web development basics to be an effective learner. This week I backpedaled a little bit, but building a good foundation will be much more beneficial for my learning process. At this week's REU lunch, Professor Tandy Warnow talked about writing and publishing research papers. Some of the main points she went over: dealing with rejection, plagiarism, reading critically, what to do if you find mistakes in your paper, and how to avoid those mistakes in the first place.

week 3


This week, I finished up and submitted my Python web crawler. Towards the end, I was stuck on two main problems: (a) dealing with infinite scroll, and (b) retrieving different color swatches (by clicking on buttons) when the URL didn't change. Luckily, my grad student mentors were able to help me figure out the issues. While there are still a few bugs (too many edge cases!), the crawler is basically complete; the main goal is to get most (not necessarily all) products from each site.
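My crawler itself is in Python, but both tricks translate to any browser-automation tool. Purely as an illustration (a different tool than what we used, with a made-up URL and selector), here is the same idea sketched in JavaScript with Puppeteer:

```javascript
const puppeteer = require("puppeteer");

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto("https://example.com/products");

  // Infinite scroll: keep jumping to the bottom until the page height
  // stops growing, i.e. no new products are being lazy-loaded.
  let prevHeight = 0;
  while (true) {
    const height = await page.evaluate(() => document.body.scrollHeight);
    if (height === prevHeight) break;
    prevHeight = height;
    await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
    await new Promise((r) => setTimeout(r, 1500)); // let new content load
  }

  // Color swatches: the URL never changes, so click each swatch button
  // and re-read the updated DOM instead of fetching a new page.
  for (const swatch of await page.$$(".swatch-button")) {
    await swatch.click();
    await new Promise((r) => setTimeout(r, 500));
    // ...scrape the now-visible color variant here...
  }

  await browser.close();
})();
```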

Next up, I started learning React. It's honestly a bit challenging, since I don't have a lot of background in web development and JavaScript. I've been following online tutorials as well as reading the documentation for React and ES6. Hopefully more practice will help!

On Sunday, we went to a team dinner at Chinatown Buffet. It was a fun night and I got to meet some of the other group members (working on different projects) that I hadn't seen previously.


week 2


Professor Kumar and I have decided on two weekly meetings: one on Monday afternoons, and one on Thursday evenings. The Monday meetings are dedicated to discussing the actual fashion research project, reviewing the work we did over the week, and setting new goals. The Thursday meetings are for lab-wide updates.

This Monday's meeting was particularly productive. We reviewed the mock survey Yichen and I had designed. In the mock survey, we planned to ask about things such as the user's clothing preferences (brand preferences, price range, colors), and to ask them to describe a fashion problem they'd like to solve. At the end, we'd give them the option to provide more personal information, such as hair color and skin tone, which would allow us to produce a more personalized result. This question had two purposes: to try to create more meaningful product recommendations (such as skin tone and color combinations), but also to gauge whether users were willing to give up that information in the first place.

After discussing our survey, Professor Kumar and I homed in on one aspect. Having Yichen and me generate the explanations by hand, acting as the temporary stylists, might not only be time-consuming but also lack consistency. Could we automate the process? Professor Kumar already had a database of ~2000 keywords commonly used to describe clothing, such as the color, silhouette and material. We figured that the best solution would be to initially map these keywords, or physical attributes of clothing, to more organic explanations by hand. For example, the keyword 'high-waisted' could be mapped to 'high-waisted bottoms are great at elongating your figure'. We could then use these phrases to construct a complete explanation. While mapping 2000 words seems like a lot, it is ultimately much more productive than trying to come up with an explanation by hand for every unique scenario. We also talked about the possibility of creating a browser plug-in, which would generate these explanations as you browsed clothing sites and indicated interest in a certain product.
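In code, the mapping idea is essentially a lookup table plus a template. A toy sketch (apart from the 'high-waisted' example above, the entries and names here are made up):

```javascript
// Each physical attribute keyword maps to a hand-written phrase.
const explanations = {
  "high-waisted": "high-waisted bottoms are great at elongating your figure",
  "a-line": "a-line silhouettes flatter most body types",
  "linen": "linen is lightweight and breathable for warm weather",
};

// Assemble a justification from whichever of a product's keywords
// have already been mapped.
function buildJustification(productKeywords) {
  const phrases = productKeywords
    .map((kw) => explanations[kw])
    .filter(Boolean); // skip keywords without a mapping yet
  return phrases.length
    ? `We recommend this because ${phrases.join(", and ")}.`
    : "";
}

// buildJustification(["high-waisted", "linen"]) ->
// "We recommend this because high-waisted bottoms are great at elongating
//  your figure, and linen is lightweight and breathable for warm weather."
```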

During Thursday's tutorial, we learned the basics of Amazon Web Services, more commonly known as AWS. We covered five basic skills: creating instances; stopping and deleting instances; using ssh to connect to an AWS instance from our local machines; using the scp command to transfer large files between local machines and an AWS instance; and using the screen command to run processes in the background on an AWS instance.

Besides the meetings, I spent time this week setting up my DREU website, as well as working on my first web crawler in Python. I decided to crawl Uniqlo's site. It's been a little stressful, as I didn't have any exposure to programming in Python before Sunday. I do wish I had more of a foundation in Python before beginning my crawler, but I can make that one of my goals this summer. As for the DREU site, it was great to familiarize myself with the Bootstrap library. I'll probably continue to update the site's design throughout the summer!

week 1


This week was a lot of fun! On day one, I got to meet up with Professor Kumar, as well as my other lab members, Yichen and Liza. We were able to grab lunch at Sakanaya and discuss which projects we were most interested in during one-on-one sessions. I decided to work on Prof. Kumar's ongoing fashion project with Yichen, where we are aiming to design the future of personal fashion via machine learning.

During the first week, we read several papers: one of Professor Kumar's own publications, as well as a paper on generating justifications for users in self-driving cars. We also familiarized ourselves with online programs such as MailChimp and Typeform, which will be useful when conducting studies later on. Most importantly, we discussed what factors consumers care about when selecting clothing, such as color, price, versatility and appropriateness for a certain event. I created a mock HCI storyboard, and came up with various scenarios of user queries and accompanying product recommendations. We plan to conduct a user study to determine which of these factors are most important to users. The results will ultimately allow us to make the best recommendations to users, and map the observable attributes of a product (such as material, price and color) to more organic factors like the fit for an 'occasion', the user's 'body type', or 'skin tone'. To be authorized to conduct the surveys, we worked on completing the CITI Research Ethics and Compliance Training course.

We also had our first technical tutorial on Saturday, where graduate student Yuan introduced us to web crawlers and walked us through a bit of his code. I've never had exposure to data scraping before, but it should be super interesting! Our job is to program our own crawlers for a specific clothing site. We briefly discussed which sites might be better than others for web crawling; despite there being a general template, we will have to deal with edge cases for every unique site. The web crawler consists of functions that can return information about a product, a list of all products, prices, URLs and more.

The first program luncheon was also held on Wednesday. A lot of people from the REU program turned up, and we had empanadas while learning about what research is like, how it differs from industry, and how to get started!