Hey, how do I look?

- May 4, 2017
By Swati Dey


Technology is changing the way people dress: from checking in the mirror and seeking advice over selfies to consulting AI-enabled digital style assistants.

“I have pinged you a picture. Tell me, how do I look in this dress?” This is a question we often ask our friends, relatives or colleagues. It’s one of the most challenging issues, especially among the youth. How does one figure out the outfit that’s most suitable for the day or evening? It shouldn’t be too loud or too tacky, but it should be apt for the occasion. It should enhance the look, enough to stand out and turn heads. It’s that tricky moment when someone, and not just women, looks at an overflowing cupboard and still feels short of clothes.

Remember those days: rummaging through the heap of clothes in the wardrobe, trying them on one after another, posing in front of the mirror, nagging people around for suggestions. All these efforts to finalise the attire for that special evening! This changed a lot with the launch of camera-equipped smartphones. Now we don’t trust just the mirror, our own opinion, or those of family members; we send the photographs to our friends and colleagues too. In fact, we do it even when we buy a new dress or shirt.

This insecurity, or rather this security in numbers, has prompted Amazon, the global e-commerce retailer, to launch the Echo Look, which hopes to replace the full-length mirror with a smart camera. Amazon has its own artificial intelligence (AI)-powered digital assistant, Alexa, which works on the same principles as Siri (Apple) and Cortana (Microsoft). Using the same technology, the e-tailer first launched the Echo, a smart speaker, and now the Echo Look.

How does it work?

The Echo Look is a selfie-taking device that doesn’t require a person to hold it, tap a button, or put it on a timer. It works on a verbal command: “Alexa, take a picture”, or “Alexa, take a video”. As soon as the device hears the command, it clicks a full-length photo of the person, or records a short video, with its hands-free camera and built-in LED lighting. The image or video is instantly available on the user’s smartphone. Users can see themselves from every angle, and share the photographs with friends, relatives and peers to seek fashion advice.

The $200 device creates a dressing-room effect: with its depth-sensing capability, it blurs the background and highlights the outfit. The device, along with an app, creates a “personal lookbook” of photos in different outfits. Users can “get a second opinion on which outfit looks best with Style Check,” a feature on the app: they send their best two options to the fashion specialists listed in the app and ask for advice. The Echo Look is a learning device; it combines machine-learning algorithms with the advice given by the specialists to recommend better options, based on current fashion trends.

What’s the catch?

The Echo Look reveals what global e-tailers hope to do in the future. The “personal lookbook” of photographs keeps track of the user’s fashion choices: it stores what one has worn, when, and how many times. In this age of social media, selfies and Instagram, many younger people have become like celebrities, and like celebrities, they don’t wish to repeat the clothes they wear. Celebrities do it to avoid criticism and controversy: remember, this actor wore the same shirt a year ago, and the message floats around the Internet along with the picture from that day. Ordinary people, likewise, don’t want their friends to gossip that they have run out of new clothes.

So, the “lookbook” becomes a shopping and wearing memory for Amazon, which can use this knowledge to promote new brands, fashions, and clothes. To go a step further, it can even push the brands that are most lucrative to it. Amazon can accumulate a huge amount of user data via the device. Apart from voice instructions, Alexa and the app have access to the user’s mails, travel information, habits, daily schedules, music and clothing preferences, the look of the bedroom, the wardrobe collection, and what not. Unfortunately, there is no guaranteed encryption of this data. The information will add to the company’s growing database, which it can use later.

Philosophy of good looks

There’s yet another limitation of the Echo Look, one that is true for most apps and websites. Since the device becomes more “intelligent” by taking into account the last few outfits and styles a user has worn, it recommends similar styles and clothing in the near future. This restricts the freedom of choice. For example, if one has worn tight bottoms the last few times, it doesn’t mean the person won’t try baggy jeans next; but the application will keep flashing the tights. Nor will it be free of human influence, since AI echoes human prejudices, as found in various studies, like the one by Arvind Narayanan and others from Princeton University.
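The feedback loop described above can be sketched in a few lines of code. The following is a deliberately naive, hypothetical recommender (not Amazon's actual algorithm, which is not public): it suggests whatever style dominated the user's recent history, so a style worn often lately crowds out everything else.

```python
from collections import Counter

def recommend_style(wear_history, recency_window=5):
    """Suggest the style worn most often in the recent past.

    A toy model of the feedback loop: only the last
    `recency_window` outfits are considered, so recently worn
    styles dominate and rarely worn ones are never suggested.
    """
    recent = wear_history[-recency_window:]
    if not recent:
        return None
    # The most common recent style wins.
    return Counter(recent).most_common(1)[0][0]

history = ["tight jeans", "tight jeans", "baggy jeans",
           "tight jeans", "tight jeans", "tight jeans"]
print(recommend_style(history))  # prints "tight jeans"
```

Even though the user owns baggy jeans, the sketch keeps recommending tight jeans, because its only signal is what was worn before: recommendation systems of this kind reinforce past choices rather than broaden them.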

There is immense social pressure, or rather social-media pressure, to look good these days. But who should tell us how we look: a digital assistant, friends, family, peers, or colleagues? Some combination of them? Or, as philosophers contend, should it be a matter of personal will and choice? In fact, even the fashion industry, which used to peddle certain styles, fibres and clothes, has begun to believe in personalised choices and recommendations. What one wears should not depend on one’s body size or skin colour. Thus, no brand or dress can assure ‘good looks’ unless the feeling comes from within.

However, clothing merchandisers and e-tailers like Amazon don’t, and can’t, agree with such arguments. They need to sell more and, to do that, they need to promote more aggressively. They ‘sell’ good looks. They market a specific trend each season. Instead of asking “Mirror, mirror on the wall, who is the best dressed of all?”, they want us to put this question to them, or to the Echo Look, again and again. So, maybe, and this is just a maybe, users will get hooked on the Echo Look for all their clothing needs. And as the device starts giving recommendations, one may forgo consulting friends and colleagues. On the positive side, those photographs can be instantly uploaded to social media, where everyone you know will have the opportunity to comment.

Trend voice: Amazon recently launched an AI-enabled digital assistant (top-right) that helps one decide what looks good on them // photos: wiki/ Amazon