
Google has introduced a new AI-powered feature called ‘Try On’ that lets users virtually see how they would look in various clothing items before making a purchase.
The feature is currently available in the US through Google Search Labs and requires users to enable it within the Labs settings.
To use the ‘Try On’ feature, users upload a full-length photo of themselves and then browse for clothing in the Google Shopping tab.
When an outfit is selected, a ‘Try it on’ button appears, and with a single click, the user’s image is generated with the clothing applied, providing a sense of how the outfit might look on them.
The underlying technology is built on an AI model trained to understand the relationship between a person’s body and clothing, enabling it to realistically drape, stretch, and shape materials over different body types.
However, ‘Try On’ is not available for all clothing items or all types of outfits. Participation is based on retailer opt-in and currently supports shirts, pants, dresses, and skirts. Accessories and swimwear may not be supported.
The AI can accurately apply various looks, including complex outfits like Elvis Presley-inspired ensembles, and modify details like shoes and socks to match the selected clothing.
While the results can show minor imperfections, the article suggests the feature is a valuable tool for reducing uncertainty about how clothing will look.
The feature could also decrease online return rates for retailers, since it allows users to make more informed purchasing decisions.
It is anticipated that this technology might evolve into a more personalized style advice tool, offering virtual fit checks and customized outfit suggestions.