Is visual search the future of search engines?
Identifying plants on a nature walk used to involve thumbing through hundreds of encyclopedia pages and taking your best guess that the illustration on the page matched the foliage in question.
Today, however, you can open an app like Picture This, snap a photo and have an (accurate) answer within seconds.
The idea of visual search isn’t new, but it is gaining new momentum. The 2017 debut of Google Lens and Snapchat’s Scan — a feature that identifies dog breeds, plants and products — helped familiarize users with the concept. More recently, Apple debuted Visual Look Up, a feature within the iPhone’s or iPad’s native photo app that can identify elements inside a photo.
The trend has mobile users increasingly turning to visual search to connect with the world around them.
This development has obvious applications in retail, especially in relation to social commerce. Love a shirt you saw on Instagram, but have no idea where to find it? No problem: just upload a screenshot of it, put it in your cart and click “buy.”
That potential demands action, especially in the online shopping arena. This is what brands should know about getting ahead of visual search in eCommerce.
History of search
Long before the era of Google and Wikipedia, finding information about anything typically meant either oral storytelling passed down through generations or reading scrolls, books and periodicals.
Natural History by Pliny the Elder is considered the first encyclopedia. Across its 37 volumes and roughly a million words, the first-century CE Roman scholar attempted to document all known facts about the world. His work, which is said to have included as much opinion and imagination as fact, remained a vital source of scientific knowledge well into the Middle Ages.
Since then, humans have sought information through libraries, newspapers and radio broadcasts. Then, in 1990, came Archie, the first internet search engine.
Images have accompanied encyclopedic text for generations, but it wasn't until recently that you could conduct a search using an image alone. That capability comes from the dynamic combination of user-generated content and artificial intelligence (AI), particularly the painstaking process of tagging and annotating images. Meanwhile, other subsets of AI, such as deep learning and computer vision, are helping to automate and advance the power of visual search.
What is visual search?
AI has facilitated the development of search methods using criteria beyond the written word. It’s what allows us to ask Amazon’s Alexa to tell us about the weather, and what enables an app like Shazam to identify a clip of a song or film after just a few seconds of listening. In that vein, visual search allows you to search for an image using another image — for instance, by uploading a photo into Google’s reverse image search, or by using Pinterest’s pioneering visual search tool, launched in 2015. Often, people use photos containing products or images of unknown origin so that they can find something similar — say, a pattern on a hotel carpet or the cut of a Hollywood star’s dress.
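Under the hood, one common family of techniques reduces each image to a compact numerical fingerprint and ranks candidates by how close their fingerprints are. The toy sketch below uses a simple average hash and Hamming distance on tiny grayscale grids; production systems such as Google Lens or Pinterest Lens rely on learned deep-feature embeddings instead, so every name and value here is purely illustrative.

```python
# Toy sketch of reverse image search: fingerprint each image with a
# simple average hash, then compare fingerprints by Hamming distance.
# Real visual search engines use deep-learning embeddings, not this.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; smaller means more visually similar."""
    return sum(a != b for a, b in zip(h1, h2))

# Three 4x4 grayscale "images": a query, a near-duplicate and a blank.
query = [[10, 200, 10, 200],
         [10, 200, 10, 200],
         [200, 10, 200, 10],
         [200, 10, 200, 10]]
near  = [[12, 198, 11, 202],   # same pattern, slight pixel noise
         [9, 201, 10, 199],
         [201, 12, 199, 9],
         [198, 11, 202, 12]]
other = [[200] * 4 for _ in range(4)]  # uniformly bright, no pattern

d_near = hamming_distance(average_hash(query), average_hash(near))
d_other = hamming_distance(average_hash(query), average_hash(other))
print(d_near, d_other)  # → 0 8
```

The near-duplicate scores a much smaller distance than the unrelated image, which is exactly the ranking signal a visual search engine needs, just at a vastly larger scale and with far richer features.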
The ease with which these tools can be operated makes these newer forms of search appear deceptively simple. In reality, refining the accuracy of visual search engines remains an immense and ongoing challenge as more and more images enter the realm of the internet every single day. That’s because preparing images to be searchable involves highly detailed labeling and annotation, which often must be done — or at least checked — by a human.
For instance, take a pair of jeans. How could you make it easy for an interested shopper to find that exact pair? You would need to label the photo with enough information to yield an accurate search, including, but not limited to:
- The colors of the denim, label and stitching
- The rise of the waist
- The tightness and shape of the legs
- The type of opening (Buttons? Zipper?)
- Pocket placement
- Size descriptors
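To make that concrete, here is a minimal sketch of how those jeans attributes might be captured as structured annotations attached to a product image. Every field name and value is illustrative, not any platform's actual schema, and a real pipeline would also validate entries against a controlled vocabulary.

```python
# Hypothetical structured annotation for one product image.
jeans_annotation = {
    "image_id": "sku-1234-front",  # illustrative identifier
    "category": "jeans",
    "colors": {
        "denim": "mid-wash blue",
        "label": "brown leather",
        "stitching": "gold",
    },
    "rise": "high",
    "leg": {"fit": "slim", "shape": "tapered"},
    "closure": "button fly",       # vs. "zipper"
    "pockets": ["front slash", "back patch", "coin"],
    "sizes": ["26", "28", "30", "32"],
}

def flatten_tags(annotation):
    """Collect every string value into a flat tag list for indexing."""
    tags = []
    def walk(value):
        if isinstance(value, dict):
            for v in value.values():
                walk(v)
        elif isinstance(value, list):
            for v in value:
                walk(v)
        elif isinstance(value, str):
            tags.append(value)
    walk(annotation)
    return tags

tags = flatten_tags(jeans_annotation)
print(tags)
```

A search pipeline could feed a flat tag list like this into its keyword index, so a visual match on the image can be refined or cross-checked against textual attributes.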
Brands also need to consider how images are shown online. Does a background pattern or color on a typical product shot skew a computer’s ability to “see” and search for an image? And then there’s the language your target markets speak: Will they understand the terms you use in the tagging and annotation process?
These kinds of details will be increasingly important moving forward with the development of Google’s new multisearch tool, which allows users to add textual parameters to a visual search. As the tech giant explains in a blog post, “With multisearch, you can ask a question about an object in front of you or refine your search by color, brand or a visual attribute.”
[Image: a visual search example from Target]
Benefits and opportunities of visual search
Brands have a lot to gain from leaning into visual search for eCommerce, from increased discoverability and reaching new pockets of potential customers, to improving the overall customer experience. This is especially true as image-dominant social commerce platforms, such as Instagram, roll out new visual search tools to customers.
“Visuals are more important to customers these days than keywords or filters,” said Michal Pachnik, eCommerce director of European footwear brand CCC Group, in an interview on Google's Think with Google blog. “We found that consumers who use visual search are more likely to add products to their basket and buy them than those using a traditional keyword search.”
Pachnik also said that CCC Group's conversion rate grew to four times that of traditional keyword search after the company started using Google's visual search engine.
Pinterest has seen similar success. As detailed on the Modern Retail news site, Pinterest saw three times more visual searches using Pinterest Lens in 2020 than the year before. “People on Pinterest are 35% more likely to take a week to make purchasing decisions, and spend two times more per month than people on other platforms,” Dan Lurie, Pinterest’s head of Pinner Product, told Modern Retail.
For as long as search engines have existed, they have been fundamental to the shopping experience, and visual search is the next step in that evolution. Given the success of companies like CCC Group and Pinterest, the power of an image-driven approach to shopping is crystal clear.
Speak to a search evaluation expert to learn how to integrate visual search into your business.