Outdated search functions are costing retail trillions – this is how you fix them…


This article is brought to you by Retail Technology Review.

By Karl Hampson, Chief Technology Officer, Data & AI, Kin + Carta. 

As the brand that’s synonymous with search, it’s no surprise that Google is the go-to platform for the vast majority of us when we want to find something online.

However long or short, simple or complex our search terms are, there’s a feeling that Google just ‘gets us’ and can intuitively produce exactly what we’re looking for.

Google has set the benchmark, and we now expect that quality of search in every corner of the internet. But unfortunately, online retailers aren’t living up to those expectations. According to exclusive data from Kin + Carta and Google Cloud, 85% of the UK’s largest retailers see “search relevancy” as their biggest challenge regarding product discovery on their ecommerce sites.

And these difficulties in delivering relevant results lead to search abandonment – a multi-million pound headache for most ecommerce retailers. In fact, Google research has shown that it costs retailers more than $2 trillion per annum globally.

What’s the simplest way to fix this? By harnessing the full potential of AI. Here’s how…

Interpreting ‘tail’ queries

When people enter long ‘tail’ search queries of three or more terms, they often phrase them in natural language: ‘date night dress in small’, for example. Longer queries such as this are a growing trend as customers become more accustomed to interacting with AI chatbots conversationally – so it’s essential that retailers’ search functions can handle them effectively.

Generally speaking, the more terms that are included in a search query, the more specific it is. This in itself is an indication that the consumer is looking for a particular item, and that they will be less tolerant of results that don’t meet their expectations. 

The problem with traditional search functions is that longer queries tend to be more distant from the language in the product catalogue, which makes it more difficult to deliver accurate results. But introducing large language models (LLMs) can help to bridge this gap between user intent and product information.
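As an illustration, this gap-bridging can be pictured as vector-similarity search: the query and every product description are mapped into the same embedding space, so ‘date night dress in small’ can land near a dress the catalogue describes quite differently. The sketch below is a toy: `embed()` hashes character bigrams so the example is self-contained, whereas a production system would call a real LLM embedding model, and the catalogue entries are invented.

```python
import math

# Toy stand-in for an LLM embedding model. In production, embed() would call
# a hosted text-embedding endpoint; here we hash character bigrams into a
# fixed-size vector so the sketch runs on its own.
def embed(text: str, dims: int = 256) -> list[float]:
    vec = [0.0] * dims
    for tok in text.lower().split():
        for i in range(len(tok) - 1):
            vec[hash(tok[i:i + 2]) % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are unit-normalised, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

# Invented catalogue entries for the example.
catalogue = [
    "black midi evening dress, size S",
    "men's high rise skinny jeans",
    "red velvet cocktail dress, small",
]
catalogue_vecs = [(p, embed(p)) for p in catalogue]

def search(query: str, top_k: int = 2) -> list[str]:
    # Rank the catalogue by similarity to the query embedding.
    qv = embed(query)
    ranked = sorted(catalogue_vecs, key=lambda pv: cosine(qv, pv[1]), reverse=True)
    return [p for p, _ in ranked[:top_k]]
```

Even with this crude embedding, a natural-language query like ‘date night dress in small’ ranks the dresses above the jeans, because ranking works on overall similarity rather than exact keyword matches.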

Finding inspiration 

Consumers are also increasingly turning to search functions for inspiration, using broad terms such as ‘Christmas gifts for dad’. This presents almost the opposite problem to long, specific queries: the lack of specificity makes matching the query to relevant products in the inventory extremely challenging.

Again, generative AI can be used to retailers’ advantage here. Because they’ve been trained on such vast data sets, LLMs can accurately interpret the intent behind these search queries and deliver the best results. 

Handling typos

The occasional typo or spelling mistake is part and parcel of online interaction, particularly when we’re in a rush. But while a friend might be able to look past a typo in WhatsApp and still respond to your message appropriately, traditional search engines can find this more difficult. 

For example, a search for ‘riped high rise skinny jeans’ instead of ‘ripped’ can lead to a ‘result not found’ response and user frustration. But with LLMs in place, retailers can ensure the intent behind these misspelt queries is understood and that relevant results are delivered. 
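For comparison, the classic non-LLM remedy is an edit-distance spell check against the catalogue vocabulary – the kind of hard-coded fallback an LLM makes unnecessary. A minimal sketch, with an invented vocabulary:

```python
def edit_distance(a: str, b: str) -> int:
    # Standard Levenshtein distance via dynamic programming.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,          # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Invented vocabulary; in practice this would be built from the catalogue.
VOCAB = {"ripped", "high", "rise", "skinny", "jeans", "dress", "small"}

def correct(word: str, max_dist: int = 2) -> str:
    # Snap a misspelt word to its nearest vocabulary term, if close enough.
    if word in VOCAB:
        return word
    best = min(VOCAB, key=lambda w: edit_distance(word, w))
    return best if edit_distance(word, best) <= max_dist else word

def normalise_query(query: str) -> str:
    return " ".join(correct(w) for w in query.lower().split())

normalise_query("riped high rise skinny jeans")
# → "ripped high rise skinny jeans"
```

This catches simple slips like ‘riped’, but it only fixes spelling – unlike an LLM, it has no grasp of the intent behind the corrected query.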

Delivering ‘wow’ moments

The impact of a failed search function is huge. Google’s research shows that:

  • 3 out of 4 customers will go somewhere else
  • 77% of customers don’t come back
  • 52% abandon the cart completely if they can’t find one item

…and generative AI is the perfect solution for troubleshooting the vast majority of these problems. But the benefits extend beyond fixing errors and making search more intuitive. 

LLM tools such as Google Cloud Retail Search can improve conversion metrics further by personalising results for every user to provide a more intelligent experience – a clear differentiator vs legacy search. Using this technology, retailers are seeing up to a 20% increase in revenue per visitor against their existing search platforms. 

It also means more intelligent ranking of search and browse results. This will establish not only what a customer is most likely to buy, but also what brings you, the retailer, the most revenue. And in time, it can become hyper-personalised to the individual customer rather than just a cohort – reducing cart abandonment as well as increasing the likelihood of repeat custom.
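One way to picture that ranking blend is a score that weighs the shopper’s likelihood to buy against the retailer’s revenue, gated by relevance so off-topic products can’t be boosted in. Every field name, signal and weight below is hypothetical – a sketch of the idea, not any real ranking API:

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    relevance: float    # how well the product matches the query, 0..1
    p_buy: float        # predicted purchase probability for this shopper, 0..1
    margin_norm: float  # retailer revenue per sale, normalised to 0..1

def rank(products: list[Product], revenue_weight: float = 0.3) -> list[Product]:
    # Blend purchase likelihood with retailer revenue; multiplying by
    # relevance means an irrelevant item can't be promoted on margin alone.
    def score(p: Product) -> float:
        return p.relevance * ((1 - revenue_weight) * p.p_buy
                              + revenue_weight * p.margin_norm)
    return sorted(products, key=score, reverse=True)
```

With this shape of score, a high-margin dress can edge out a near-identical budget one, while a high-margin but irrelevant product still sinks to the bottom.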

With both of these elements of functionality in play, merchandising becomes more about directing search towards what you want to sell and less about trying to band-aid your search engine with hard-coded rules.

With generative AI implemented, retailers are able to start delivering these ‘wow’ moments very quickly, immediately removing pain points from their existing search to deliver an experience that just works. 

LLMs are rapidly becoming a critical component of ecommerce search systems – and one that requires little to no human intervention once implemented.

Retailers that fail to keep pace with this change are potentially leaving millions of pounds of revenue on the table, so the time to act is now.
