Google has rolled out a new AI-driven shopping feature that shows you what clothes you're interested in buying might look like on you. It's called "try it on," and it's available right now in the US through Google Search Labs.
To get started, just enable it in Search Labs. Then upload a full-length photo of yourself and start browsing for clothes in the Google Shopping tab.
When you click on a picture of a garment in the search results, you'll see a small "try it on" button in the middle of the enlarged image in the right-hand panel. One click and about ten seconds later, you see yourself wearing the clothes. It's not always a perfect illusion, but at minimum you get a sense of what the outfit would look like on you.
Google says it all runs on a model trained to understand the relationship between your body and clothing, so the AI can realistically drape, stretch, and bunch fabric across different body types.
The feature doesn't work with every garment you see, or even every type of clothing. The retailer has to sign up for the program, and Google says it only works for shirts, pants, dresses, and skirts.
I noticed that suits and swimwear both lacked usable images, but I could put shorts on myself, and suits that looked like ordinary separates worked. The AI also didn't seem to have any trouble with jackets and coats as categories.
Elvis looks
On Google Shopping, for example, I found replicas of the clothes Elvis wore for his 1966 comeback and one of his 1970s jumpsuits. With a few clicks, I could picture myself dressed as the King in different eras.
It even changed my shoes with the all-black suit. I had always wondered whether I could pull either look off. The photos can be shared; you can save them or send them to others from the Google mobile app and find out how much of an Elvis your friends think you are.
Super summer
The details the AI changes to make the images work are impressive. I used it to try on a fun summer look and the closest thing to a superhero costume I could find. The original photo shows me in a suit jacket with a bow tie and black dress shoes. But the shoes and socks in both AI-generated images not only match what was in the search result, they're also shaped to fit my posture and size.
Even though I was wearing long sleeves and pants, the AI found a way to show some of my arms and legs. The skin tone matches reality, but the imperfections are noticeable to me. My legs look too thin in both images, as if the AI thinks I skipped leg day, and my legs in the shorts haven't been that hairless since I turned 13.
Imperfections aside, this feels like it will be an important part of the next era of e-commerce. The awkward guessing game of whether a color or cut works for your skin tone and build could become much easier to solve.
I won't say it can replace trying clothes on in real life, especially when it comes to size and comfort, but as a digital version of holding a garment up against yourself in front of a mirror, it's pretty good.
Ending unnecessary returns
Uncanny as some of the resulting images are, I think this will be a popular feature for Google Shopping. I'd expect it to be widely imitated by rivals in AI development and online retail, where it isn't already.
I especially like how the AI lets you see yourself in more unusual or bold looks you might hesitate to try on in a store. For example, the paisley jacket and striped pants on the left, or the high-collared shirt and vest with Victorian pants on the right. I would hesitate to order either just to see, and would almost certainly plan to return one or both before they even arrived.
Returns are a plague on online retailers and waste tons of packaging and other resources. But if Google shows us how we'll look in clothes before we buy them, it could chip away at return rates; no wonder retailers are rushing to sign up for the program.
It could also open the door to more personalized advice from AI. You may soon have an AI personal stylist, ready to give you a virtual fit check and suggest your next look, even if it's not something Elvis would have worn.