See It, Search It, Shop It: How AI is Powering Visual Search

Artificial intelligence has the power to change how we search and how people shop — and it all starts with a single image.

You spot something you love on a passerby. That stranger walking past you is wearing the perfect pair of sneakers. You want them. But you have no idea what brand they are or where to buy them. Even without those essential details, you figure you can go online and search — but you get only a few, mostly irrelevant results, and you aren’t any closer to getting your next favorite pair of shoes.

Enter visual search.

If a picture is worth a thousand words, visual search — the ability to use an image to search for other identical or related visual assets — is worth thousands of spot-on searches — and thousands of minutes saved on dead-end queries.

With visual search, you no longer need to guess the brand, the style, or the retailer. Just snap a picture of those sneakers on the passerby, upload the image, and immediately find the exact same sneakers or ones like them — maybe even shoes you like better.

The power — and simplicity — of visual search

That spot-it/want-it scene is common, and it’s good for business. It could be a shirt on someone walking down the street, an image on Instagram, or a piece of furniture in a magazine — somewhere, your customer saw something they want to buy, and now they’re on a mission to find it.

While it’s a seemingly simple task, in many cases the path from seeing to buying is a circuitous and friction-filled route that leads to a subpar purchase — or no purchase at all. Just one in three Google searches, for example, leads to a click — and these people come to the table with at least a sense of what they’re searching for.

“Visual search is all about focusing your attention toward a target,” says Gina Casagrande, senior Adobe Experience Cloud evangelist, “and helping you find what you’re looking for that much faster. You also get the added benefit of finding things you didn’t even know you were looking for.”

Like text-based search, visual search interprets and understands a user’s input — images, in this case — and delivers the most relevant search results possible. However, instead of forcing people to think like computers, which is how the typical text search works, visual search flips the script.

Now, powered by AI, the machine does the seeing and interpreting, taking its visual cues from what it has learned from people. After automatically applying metadata to the image, an AI-powered visual search system can dig through a catalog and retrieve relevant results based on visual similarities, such as color and composition.
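To make this concrete, here is a minimal, generic sketch of how a similarity-based visual search could work under the hood: a pretrained image model turns each picture into a feature vector, and catalog items are ranked by cosine similarity to the query photo. This is not Adobe Sensei’s actual pipeline or API; the model choice, file paths, and tiny in-memory catalog are illustrative assumptions only.

```python
# A minimal, generic sketch of similarity-based visual search.
# NOT Adobe Sensei's API -- an illustration using a pretrained
# torchvision model and cosine similarity over a tiny in-memory catalog.
import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms

# Pretrained backbone with the classification head removed, so it
# outputs a feature vector ("embedding") for each image.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> np.ndarray:
    """Turn an image file into a unit-length feature vector."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vector = backbone(image).squeeze(0).numpy()
    return vector / np.linalg.norm(vector)

# Hypothetical catalog: product IDs mapped to image files.
catalog = {
    "sneaker-001": "img/sneaker_001.jpg",
    "sneaker-002": "img/sneaker_002.jpg",
}
catalog_vectors = {pid: embed(path) for pid, path in catalog.items()}

def visual_search(query_path: str, top_k: int = 5):
    """Rank catalog items by cosine similarity to the query photo."""
    query = embed(query_path)
    scores = {pid: float(query @ vec) for pid, vec in catalog_vectors.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

print(visual_search("img/street_photo.jpg"))
```

In a real deployment the catalog embeddings would be precomputed and stored in a vector index rather than recalculated on the fly, but the matching idea is the same.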

How visual search is revolutionizing the retail experience

There are countless applications for visual search, from enabling designers to find relevant stock images to identifying specific people in photos. And visual search isn’t a thing of the future; it’s already facilitating better, more frictionless retail experiences, so you can find that maroon tunic sweater with a quick snap and click.

One early adopter of visual search is Synthetic, the cognitive technology division of Organic, an Omnicom subsidiary. Synthetic’s Style Intelligence Agent (SIA) — powered by Adobe Sensei — uses AI to help customers not just find specific clothing items, but also find the right accessories to complete their new look.

To use SIA, customers simply upload an image — a red carpet shot of a gorgeous dress, an ad from a magazine, or even a random picture of a friend’s jacket from their phone’s photos. From there, Adobe Sensei’s Auto Tag service extracts attributes from the image, covering everything from color and style to cut and pattern. At the same time, SIA’s custom machine-learning model correlates those tags with a massive catalog of products.
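As a rough illustration of that “correlate those tags with a catalog” step, the sketch below scores catalog items by how much their tags overlap with the tags pulled from the uploaded image. The catalog, field names, and Jaccard-style scoring are assumptions for illustration, not SIA’s or Adobe Sensei’s actual implementation.

```python
# Illustrative tag-to-catalog matching. The query tags are assumed to
# come from an auto-tagging service; the catalog and scoring are invented.

def tag_match_score(query_tags: set, product_tags: set) -> float:
    """Jaccard overlap between the query's tags and a product's tags."""
    if not query_tags or not product_tags:
        return 0.0
    return len(query_tags & product_tags) / len(query_tags | product_tags)

# Hypothetical product catalog with pre-computed attribute tags.
catalog = [
    {"sku": "DR-1042", "name": "Red wrap dress",
     "tags": {"dress", "red", "wrap", "evening"}},
    {"sku": "JK-2210", "name": "Denim jacket",
     "tags": {"jacket", "denim", "blue", "casual"}},
    {"sku": "DR-1187", "name": "Burgundy slip dress",
     "tags": {"dress", "burgundy", "slip", "evening"}},
]

# Tags extracted from the uploaded photo (hypothetical example).
query_tags = {"dress", "red", "evening"}

ranked = sorted(catalog,
                key=lambda p: tag_match_score(query_tags, p["tags"]),
                reverse=True)
for product in ranked:
    print(product["sku"], round(tag_match_score(query_tags, product["tags"]), 2))
```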

SIA then displays visually similar search results as well as relevant recommendations — items with similar styles, cuts, colors, or patterns, for example. SIA also uses these visual searches to build a rich profile for that customer’s preferences and tastes — a much deeper profile than what could be built from text-based searches alone.

“This is where visual search goes beyond just search and becomes a true shopping consultant,” Gina says, “and a superior, more sophisticated way to search for what you want and what you didn’t know you wanted.”
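A preference profile like the one Gina describes could be as simple as counting which attributes keep showing up across a customer’s visual searches. The sketch below shows that naive version; the attribute names and structure are invented for illustration and are not how SIA actually models taste.

```python
# A toy preference profile built from repeated visual searches.
# Attribute names are hypothetical; a real profile would be far richer.
from collections import Counter

class CustomerProfile:
    def __init__(self):
        self.attribute_counts = Counter()

    def record_search(self, tags):
        """Fold the tags from one visual search into the profile."""
        self.attribute_counts.update(tags)

    def top_preferences(self, n=3):
        """Most frequently seen attributes across all searches."""
        return self.attribute_counts.most_common(n)

profile = CustomerProfile()
profile.record_search(["dress", "red", "evening"])
profile.record_search(["sneakers", "red", "casual"])
profile.record_search(["handbag", "red", "leather"])
print(profile.top_preferences())  # e.g. [('red', 3), ('dress', 1), ('evening', 1)]
```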

Eliminating the friction between seeing and buying

In delivering such a simple, seamless experience, AI-powered visual search removes the friction from traditional search-and-shop experiences. No longer do customers have to visit multiple retailers or sites and strike out. They can now find virtually anything, anywhere, even without knowing exactly where to find it.

With some visual search tools, people can snap a photo to call up a series of similar products across multiple sites and retailers. Google Lens, for example, pulls similar product examples from Google Shopping.

Other retailers use visual search to make the distance between seeing and buying virtually nonexistent — within their own brand experience. Macy’s, for example, offers visual search in its mobile app, which allows customers to snap a photo and find similar products on Macys.com. It’s “taking impulse buying to new heights,” one source says.

Frictionless image search is just the beginning. “The value of visual search technology grows as the customer returns to the site,” Gina says. “On that next visit, it’s a more personalized, powerful targeted search. Just being able to pick up where I left off and get to that product that much faster helps reduce friction, and has been shown to increase conversions and order rate.”

Visenze, which builds shopping experiences using AI, is already seeing these benefits. In one example, the company saw a 50 percent increase in conversion among clients such as Nike and Pinterest that implemented visual search technology.

“In the United States, Amazon and Macy’s have been offering this feature for some time,” says Visenze CEO Oliver Tan. “Consumers are crying out for a simpler search process. If you do not [offer it], they will move on to somebody else.”

Rising to customers’ visual search expectations

Though the benefits of visual search are clear, as with any new technology there’s still a gap in adoption, and a gap between what customers expect and what’s delivered. “Our current iteration of visual search gets us maybe 70 percent of the way there,” Gina says. However, even with visual search still in its initial stages, Gartner predicts that early adopters of the technology will experience a 30 percent increase in e-commerce revenue by 2021.

“Keep in mind, as more data and content become available, the algorithms will get smarter, and the visual search experience will only continue to get better,” she says.

Given this accelerated growth and customers’ growing demand for personalized, frictionless retail experiences, now is the time to start evaluating and integrating visual search technology. To get started, focus on solving customer problems and getting your own visual assets in order — and don’t try to make your visual search workflows all about advertising.

Instead, aim to have solid metadata on products so that searching is easier and more natural. From there, work toward visual search processes that are real time and increasingly intuitive, creating a positive customer experience that keeps people coming back.
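For example, a product record with consistent, descriptive visual attributes gives a visual search or recommendation engine something concrete to match against. The field names below are hypothetical; what matters is capturing visual attributes in a structured, consistent way across the catalog.

```python
# Hypothetical product record with structured visual metadata.
product_record = {
    "sku": "SW-3301",
    "name": "Maroon tunic sweater",
    "category": "knitwear",
    "visual_attributes": {
        "color": ["maroon", "burgundy"],
        "pattern": "solid",
        "silhouette": "tunic",
        "neckline": "crew",
    },
    "images": ["https://example.com/img/sw-3301-front.jpg"],
}
```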
 
