This week was dedicated to fixing a few more bugs and wrapping up the whole research experience. We were able to have several team meetings to discuss the current state of the project. We are still improving the results returned, and I am still adding a few more small features/tweaks to the front-end. While we didn't get to conduct the user study yet, the engine should be ready to go next week, which is really wonderful. In addition, we talked about the future of the project, and how successfully implementing the search engine allows us to explore a lot of great questions and conduct multiple studies in the future. I'm extremely excited to continue working on this project during the school year!
The front-end is nearing completion! This week was mainly dedicated to fixing a couple of (annoying) bugs and fine-tuning other features. One issue was that infinite scroll would seemingly work only 90% of the time. Because the error didn't occur every single time, it took me a while to figure out what was wrong. I finally realized that after a user would scroll to the bottom of a page (completing infinite scroll), one of the states would basically say, "infinite scroll is finished, no more items to display." When loading the results for a subsequent query, I forgot to reset this state. Thus, infinite scroll wouldn't get triggered unless the page was refreshed. Another issue was with the dropdown/toggle that could change the search mode. If a user clicked on the toggle, then tried to click off of it (perhaps they changed their mind), the onChange function would still fire, reloading the page. I later fixed it so that clicking off the toggle would not fire any action, making it much more user-friendly.
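Both fixes were small once I knew where to look. Here's a minimal sketch of the two ideas (names like hasMore and fetchResults are made up for illustration, not our actual code):

```tsx
import React, { useState } from "react";

// Hypothetical API helper; the real engine lives on our server.
async function fetchResults(query: string, offset: number): Promise<string[]> {
  const res = await fetch(`/search?q=${encodeURIComponent(query)}&offset=${offset}`);
  return res.json();
}

function SearchResults() {
  const [query, setQuery] = useState("");
  const [items, setItems] = useState<string[]>([]);
  // The culprit: once this goes false, infinite scroll stops firing.
  const [hasMore, setHasMore] = useState(true);

  // Runs when a NEW query is submitted.
  async function runSearch(q: string) {
    setQuery(q);
    setItems(await fetchResults(q, 0));
    setHasMore(true); // the fix: re-arm infinite scroll for every new query
  }

  // Runs when the user scrolls to the bottom of the page.
  async function loadMore() {
    if (!hasMore) return;
    const next = await fetchResults(query, items.length);
    if (next.length === 0) setHasMore(false); // genuinely out of results
    else setItems((prev) => [...prev, ...next]);
  }

  return <ul>{items.map((it) => <li key={it}>{it}</li>)}</ul>; // scroll listener omitted
}

// The toggle fix, in spirit: only act when the value actually changed,
// so clicking off the control no longer reloads the page.
function ModeToggle({ mode, onModeChange }: { mode: string; onModeChange: (m: string) => void }) {
  return (
    <select
      value={mode}
      onChange={(e) => {
        if (e.target.value !== mode) onModeChange(e.target.value); // ignore no-op events
      }}
    >
      <option value="keyword">Keyword-only</option>
      <option value="smart">Smart search</option>
    </select>
  );
}
```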
Alice headed home on Saturday, and we all got together for brunch at Destihl beforehand. It was very bittersweet :')
This week, I returned to working on front-end development. We decided to implement a couple of new features, including toggles for adjusting what results our engine returns and how we rank them. For search, we have two options (keyword-only and smart search), and likewise for ranking (z-score and elastic). While implementing these toggles, I ran into a host of problems, the main one being that when I adjusted one toggle to "true", it would automatically reset the other toggle to its default value rather than saving the state. After looking at my code, I realized that the big issue was that I was adjusting state locally throughout my code, i.e., in each individual component. As I added more and more features (and thus states), they became difficult to manage, and some components were not remembering the state changes in other components. While I pinpointed the area in which the bug was happening, I couldn't figure out exactly what it was. After reading a great article on lifting state, I decided to completely refactor my code so that I was only changing state in the App component (with functions like handleSearchToggle, handleRankToggle, etc.). Thus, the App component could act as the single source of truth, as sketched below. Although it took a bit of time, it made everything else a lot easier, such as implementing the functionality of the filter/style buttons on each product card. Towards the end of the week, I worked on implementing infinite scroll while Michael worked on autocomplete for the search bars.
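To give a flavor of the refactor, here's a stripped-down sketch; the real components are more involved, but the shape is the same:

```tsx
import React, { useState } from "react";

// The App component owns all shared state; children only report events up.
function App() {
  const [searchMode, setSearchMode] = useState<"keyword" | "smart">("smart");
  const [rankMode, setRankMode] = useState<"z-score" | "elastic">("elastic");

  // Handlers live here, so toggling one setting can never clobber the other.
  const handleSearchToggle = (mode: "keyword" | "smart") => setSearchMode(mode);
  const handleRankToggle = (mode: "z-score" | "elastic") => setRankMode(mode);

  return (
    <div>
      <SearchToggle value={searchMode} onToggle={handleSearchToggle} />
      <RankToggle value={rankMode} onToggle={handleRankToggle} />
    </div>
  );
}

// Stateless child: renders the prop it is given and calls back up on change.
function SearchToggle({ value, onToggle }: {
  value: "keyword" | "smart";
  onToggle: (m: "keyword" | "smart") => void;
}) {
  return (
    <button onClick={() => onToggle(value === "smart" ? "keyword" : "smart")}>
      Search: {value}
    </button>
  );
}

function RankToggle({ value, onToggle }: {
  value: "z-score" | "elastic";
  onToggle: (m: "z-score" | "elastic") => void;
}) {
  return (
    <button onClick={() => onToggle(value === "elastic" ? "z-score" : "elastic")}>
      Rank: {value}
    </button>
  );
}
```

Since the children hold no state of their own, nothing they render can drift out of sync with the App component.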
This week's (last!) REU lunch was about what roles graduate students and PhD holders might hold in industry. It was pretty fascinating; many of the presenters talked about working at companies such as Google and Microsoft, but in a research role.
Over the weekend, we had two team meetings during which we reviewed code and discussed the progress on each of our parts. The front-end development is complete for the time being, and we decided to shift our focus to the actual fashion recommendation model itself and improving it. One idea that we would like to implement (eventually) is presenting the justifications on the product image itself, with arrows pointing to each corresponding feature, rather than simply having them listed below. Doing so would help create a stronger visual connection between the justification and product recommendation. It is a bit complicated, however, so we decided to save this particular feature for a later iteration.

In the meantime, Alice (my fellow lab member) and I have started doing research on the ResNet and ResNeXt architectures specifically designed for image classification. Our grad student mentor, Yuan, suggested a couple of online courses that would be helpful. I spent the rest of the week working through the courses and quickly realized that I didn't have enough background in machine learning to fully understand everything in them. Thus, I decided to pause the courses for a bit and instead Google the basics, such as "what are neural networks?", "what is convolution?", and "how does backpropagation work?". I created a document full of notes about these topics, and familiarized myself with Keras. Alice also recommended a Caltech ML course, which really helped bolster my understanding of the math behind backpropagation.

While I missed this week's REU lunch due to a sickness, we did go to the REU-organized escape room outing on Friday! Alice, Yichen and I teamed up with a couple of other girls from Dr. Amato's lab and tried to conquer the "Wizard's Curse" room. In the end, we managed to escape with only 40 seconds left on the clock. It was a fantastic little break from work, and we'd love to escape from the "Artificial Intelligence" room with the rest of our lab another day!
This week, I've been in the thick of front-end development in React. I decided to switch to using Semantic UI rather than Material UI, since it is more customizable and has simpler syntax. The coolest part of this week was definitely successfully making a call to the server and seeing all the appropriate information pop up in our results, product cards and more. I'm also learning about React Router, and I hope to implement that soon. The next challenge to tackle will be adding additional search filters on the click of a button; for example, when a user clicks "women", I need to filter the existing products by their targeted gender (a rough sketch of the idea is below). Sometimes development does feel a bit slow, as I'm the only one on front-end and my other lab members are not quite familiar with React either. But again, learning how to work through these problems is a good experience and I'm enjoying it. Ranjitha (my mentor) came back from California this week, so we've had a lot of team meetings and are planning to have in-depth code reviews this weekend. Ranjitha also presented at this week's REU lunch, covering how to develop your personal persona. It was a fantastic lunch (both in terms of food and content!), and it prompted me to think about my online presence, as well as how I introduce myself in emails.
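The filtering itself should be simple once the products are already in state. Roughly what I have in mind (the Product shape is invented for illustration):

```tsx
// Hypothetical shape of a product returned by our server.
interface Product {
  name: string;
  price: number;
  gender: "women" | "men" | "unisex";
}

// Clicking "women" narrows the already-fetched results; no new server call needed.
function filterByGender(products: Product[], gender: Product["gender"]): Product[] {
  return products.filter((p) => p.gender === gender || p.gender === "unisex");
}
```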
This week was largely devoted to working on the UI in React. I decided to use a styling library called Material UI to help with the build. Through the process, I learned the importance of making reusable components (especially when you have tons of product cards), passing props between components, callback functions, and async/await. It is definitely still a challenge, but it is a fun one nonetheless. Seeing your designs transform into an actual, usable web application is pretty cool. Thursday was the Fourth of July, so we got a bit of a break! Towards the end of the week I started learning about Axios and making AJAX requests to servers, so hopefully all the functionality will start coming together next week.
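The pattern I've been practicing looks roughly like this (the URL and response shape are placeholders):

```ts
import axios from "axios";

// Minimal sketch of the kind of request I was practicing: ask our server
// for products matching a query, then hand the parsed JSON back.
async function loadProducts(query: string): Promise<unknown[]> {
  const response = await axios.get("http://localhost:3000/search", {
    params: { q: query }, // axios builds the query string for us
  });
  return response.data; // axios parses the JSON body automatically
}

loadProducts("summer dresses").then((products) => console.log(products.length));
```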
This week's luncheon featured Tianyin Xu and Sasa Misailovic, who talked about presentation and verbal communication skills. One thing they emphasized is presenting simple yet striking slides. The more words on a slide, the more they dilute your message. On the other hand, images, graphs and short sentences can bring the focus back to the presenters themselves and what they are saying.
This week, I worked primarily on the UI design, learning Sketch in the process. While I've done graphics before in Illustrator, I quickly realized that wireframing and designing an entire interface is very different; there are so many more considerations. For example, one of my first iterations looked basically like a typical e-commerce site. While it looked okay, after talking to my team, I quickly realized that it didn't achieve the goals we had in mind. We wanted to present users with a search that was improved in terms of experience, not just design. In particular, for the study, we wanted to see whether displaying justifications next to product images would make a difference, versus just an image and no justification. But how do we achieve that without overcrowding the product page and making users read blocks of text? What visual cues can we provide so that users know they can search this, or click that? How do we create an effective visual hierarchy, encouraging users to focus on the justification rather than the picture, price and brand?
These are all questions that need to be addressed. To better do so, we decided to focus on the product cards only, rather than all of the pages. After many more iterations, I've come up with some new designs, which will likely continue to develop over the next few weeks. One mistake I definitely made was jumping straight to a prototype instead of basic wireframing. Since people are constantly coming up with new ideas, I have to constantly edit my designs, and all of the details (color, font, weight, etc.) make it a lot more time-consuming to adjust than a bare-bones, black-and-white wireframe would be. Good things to keep in mind for the future!
In addition to working extensively on the UI, I've been learning JavaScript, Node.js and how web applications work. My original assignment was actually to learn React, but after jumping in last week, I quickly realized that I needed at least some background in JavaScript and web development basics to be an effective learner. This week I backpedaled a little bit, but building a good foundation will be much more beneficial for my learning process.
At this week's REU lunch, Professor Tandy Warnow talked about writing and publishing research papers. Some of the main points she went over: dealing with rejection, plagiarism, reading critically, what to do if you find mistakes in your paper, and how to avoid those mistakes in the first place.
This week, I finished up and submitted my Python web crawler. Towards the end, I was stuck on two main problems: (a) dealing with infinite scroll, and (b) retrieving the different color swatches (by clicking on buttons) when the URL didn't change. Luckily, my grad student mentors were able to help me figure out the issues. While there are still a few bugs (too many edge cases!), the crawler is basically complete; the main goal is to get most (not necessarily all) products from each site.
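My crawler is written in Python, but the gist of the two tricky parts translates to any browser-automation tool; here's a sketch in TypeScript with Puppeteer (selectors and wait times are invented):

```ts
import puppeteer from "puppeteer";

// Sketch of the two tricky parts: scroll until the page stops growing,
// and click swatch buttons to read colors the URL never reveals.
async function crawlListing(url: string): Promise<string[]> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle2" });

  // Infinite scroll: keep scrolling until the page height stops changing.
  let lastHeight = 0;
  while (true) {
    const height = await page.evaluate(() => document.body.scrollHeight);
    if (height === lastHeight) break; // no new products loaded
    lastHeight = height;
    await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
    await new Promise((r) => setTimeout(r, 1500)); // give new items time to load
  }

  // Color swatches: the URL never changes, so click each button and
  // re-read the DOM instead of fetching a new page.
  const swatches = await page.$$(".color-swatch"); // invented selector
  const colors: string[] = [];
  for (const swatch of swatches) {
    await swatch.click();
    colors.push(await page.$eval(".selected-color-name", (el) => el.textContent ?? ""));
  }

  await browser.close();
  return colors;
}
```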
Next up, I started learning React. It's honestly a bit challenging, since I don't have a lot of background in web development and JavaScript. I've been following online tutorials as well as reading the documentation for React and ES6. Hopefully more practice will help!
On Sunday, we went to a team dinner at Chinatown Buffet. It was a fun night and I got to meet some of the other group members (working on different projects) that I hadn't seen previously.
Professor Kumar and I have decided on two weekly meetings: one on Monday afternoons, and one on Thursday evenings. The Monday meetings are dedicated to discussing the actual fashion research project, reviewing the work we did over the week, and setting new goals. The Thursday meetings are for lab-wide updates.
This Monday's meeting was particularly productive. We reviewed the mock survey Yichen and I had designed. In it, we planned to ask about things such as the user's clothing preferences (brand preferences, price range, colors), and to have them describe a fashion problem they'd like to solve. At the end, we'd give them the option to provide more personal information, such as hair color and skin tone, which would allow us to produce a more personalized result. This question had two purposes: to create more meaningful product recommendations (such as skin tone-color combos), and to gauge whether users were willing to give up that information in the first place.
After discussing our survey, Professor Kumar and I homed in on one aspect. Having Yichen and me generate the explanations by hand, acting as the temporary stylists, might not only be time-consuming but also lack consistency. Could we automate the process? Professor Kumar already had a database of ~2,000 keywords commonly used to describe clothing, such as its color, silhouette and material. We figured that the best solution would be to initially map these keywords (i.e., physical attributes of clothing) to more organic explanations by hand. For example, the keyword "high-waisted" could be mapped to "high-waisted bottoms are great at elongating your figure". We could then use these phrases to construct a complete explanation. While mapping 2,000 words seems like a lot, it is ultimately much more productive than trying to come up with an explanation by hand for every unique scenario.
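In code, the whole idea is just a lookup table plus some glue. A toy sketch (every phrase except the "high-waisted" example is invented):

```ts
// Toy version of the keyword-to-explanation mapping; the real database
// has ~2,000 keywords.
const explanations: Record<string, string> = {
  "high-waisted": "high-waisted bottoms are great at elongating your figure",
  "linen": "linen is lightweight and breathable, perfect for warm weather",
  "a-line": "a-line silhouettes flatter most body types",
};

// Compose a full justification from whichever keywords a product matches.
function justify(productKeywords: string[]): string {
  const phrases = productKeywords
    .map((k) => explanations[k.toLowerCase()])
    .filter((p): p is string => p !== undefined); // drop unmapped keywords
  return phrases.length ? `We recommend this because ${phrases.join(", and ")}.` : "";
}

console.log(justify(["High-Waisted", "Linen"]));
```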
We also talked about the possibility of creating a browser plug-in, which would generate these explanations as you browsed clothing sites and indicated interest in a certain product.
During Thursday's tutorial, we learned the basics of Amazon Web Services, more commonly known as AWS. We covered five basic skills: creating instances, stopping and deleting instances, using ssh to connect to an AWS instance from our local machines, using the scp command to transfer large files between local machines and an AWS instance, and using the screen command to keep processes running in the background on an AWS instance.
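We did everything through the console and terminal, but the instance operations can also be scripted. For example, a sketch using the AWS SDK for JavaScript (placeholder AMI ID and region; this goes beyond what we covered in the tutorial):

```ts
import {
  EC2Client,
  RunInstancesCommand,
  StopInstancesCommand,
  TerminateInstancesCommand,
} from "@aws-sdk/client-ec2";

const ec2 = new EC2Client({ region: "us-east-2" }); // region is a placeholder

async function demo() {
  // Create a single small instance (AMI ID is a placeholder).
  const created = await ec2.send(new RunInstancesCommand({
    ImageId: "ami-xxxxxxxx",
    InstanceType: "t2.micro",
    MinCount: 1,
    MaxCount: 1,
  }));
  const id = created.Instances?.[0]?.InstanceId;
  if (!id) return;

  // Stop the instance (it can be restarted later)...
  await ec2.send(new StopInstancesCommand({ InstanceIds: [id] }));
  // ...or delete it for good.
  await ec2.send(new TerminateInstancesCommand({ InstanceIds: [id] }));
}
```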
Besides the meetings, I spent time this week setting up my DREU website, as well as working on my first web crawler in Python. I decided to crawl Uniqlo's site. It's been a little stressful, as I didn't have any exposure to programming in Python before Sunday. I do wish I had more of a foundation in Python before beginning my crawler, but I can make that one of my goals this summer. As for the DREU site, it was great to familiarize myself with the Bootstrap library. I'll probably continue to update the site's design throughout the summer!
This week was a lot of fun! On day one, I got to meet up with Professor Kumar, as well as my other lab members, Yichen and Liza. We grabbed lunch at Sakanaya and discussed what projects we were most interested in during one-on-one sessions. I decided to work on Prof. Kumar's ongoing fashion project with Yichen, where we are aiming to design the future of personal fashion via machine learning.
During the first week, we read several papers: one of Professor Kumar's own publications, as well as a paper on generating justifications for users in self-driving cars. We also familiarized ourselves with online tools such as MailChimp and Typeform, which will be useful when conducting studies later on. Most importantly, we discussed what factors consumers care about when selecting clothing, such as color, price, versatility and appropriateness for a certain event. I created a mock HCI storyboard, and came up with various scenarios of user queries and accompanying product recommendations. We plan to conduct a user study to determine which of these factors are most important to users. The results will ultimately allow us to make the best recommendations to users, and map the observable attributes of a product (such as material, price and color) to more organic factors like the fit for an "occasion", the user's "body type", or "skin tone". To be authorized to conduct the surveys, we worked on completing the CITI Research Ethics and Compliance Training course.
We also had our first technical tutorial on Saturday, where graduate student Yuan introduced us to web crawlers and walked us through a bit of his code. I've never had exposure to data scraping before, but it should be super interesting! Our job is to program our own crawlers for a specific clothing site. We briefly discussed which sites might be better than others for web crawling; despite there being a general template, we will have to deal with edge cases for every unique site. The web crawler consists of functions that can return information about a product, a list of all products, prices, URLs and more.
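Roughly, the interface each crawler should expose might look like this (my own sketch of the assignment, with invented names):

```ts
// Rough shape of the data each crawler should collect per product.
interface Product {
  name: string;
  price: number;
  url: string;
  colors: string[];
}

// The functions every site-specific crawler needs to implement.
interface Crawler {
  listProductUrls(categoryUrl: string): Promise<string[]>; // all products on a listing page
  getProduct(productUrl: string): Promise<Product>;        // full details for one product
}
```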
The first program luncheon was also held on Wednesday. A lot of people from the REU program turned up, and we had empanadas while learning about what research is like, how it differs from industry, and how to get started!