Janaka Fernando
Mar 5, 2021

When NOT to use Episerver Search & Navigation (Find)

I was recently having a conversation with a Technical Architect about product recommendations and website performance. Some inefficient code was retrieving the product display based on the results from the Optimizely Product Recommendations tracking response. The first thought was to use Episerver Search & Navigation (formerly Find) for this feature. On the face of it, Search provides a very fast and effective means of retrieving catalog items, and in some situations that would be the right call. Here, however, it is a bad use case for Episerver Search.

This got me thinking about why some scenarios are not suited for Episerver Search & Navigation.

Unable to implement a Caching strategy

Anything that could result in hundreds or thousands of customers triggering Search queries around the same time can quickly max out the queries per second threshold on the Search service.  This is the type of bug that could pass QA and not be detected until the real-world load is on the application.

Recommendations are by design unique to each customer and can change between subsequent requests. This makes them difficult, if not outright impossible, to cache.

There is another efficient API available

In the case of the code needed for product recommendations, the response from Episerver provides a Content Reference for each recommendation. The Episerver API offers an easy way to retrieve all of the items by calling IContentLoader.GetItems() directly. Even if the Content Reference were not supplied, most systems would provide the unique refCode/SKU, which can easily be converted to a Content Reference.
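As a rough sketch of what that might look like (the tracking-response property names and injected services here are illustrative, not the exact Episerver API surface):

```csharp
// Load recommended products directly, with no Search query.
// Assumes the tracking response exposes ContentReference values;
// "SmartRecs" and "ContentLink" are illustrative names.
var recommendedLinks = trackingResponse.SmartRecs
    .Select(rec => rec.ContentLink);

var products = _contentLoader
    .GetItems(recommendedLinks, CultureInfo.CurrentUICulture)
    .OfType<ProductContent>();

// If only the SKU/refCode is available, ReferenceConverter
// can resolve it to a ContentReference first:
var link = _referenceConverter.GetContentLink("SKU-12345");
```

GetItems() goes through the normal content cache, so repeated requests for the same products stay cheap even though the recommendation sets themselves differ per customer.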

The same rule applies when a developer simply needs to retrieve content hierarchically, such as pages from the site tree, even when filters need to be applied. This only becomes ineffective when many hundreds or thousands of pages sit under a single node, but I would say that is a fault of bad information architecture!
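For example, a filtered listing over a site-tree node can be done entirely with the content API (the page type and property names below are hypothetical):

```csharp
// Hierarchical retrieval with IContentLoader instead of Search.
// Children are served from the content cache; in-memory filtering
// is fine for nodes with a modest number of children.
var articles = _contentLoader
    .GetChildren<ArticlePage>(startPage.NewsContainerLink)
    .Where(p => p.StartPublish <= DateTime.UtcNow)
    .OrderByDescending(p => p.StartPublish)
    .Take(10);
```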

Paging through large datasets

This last one might not seem as obvious, because you want to use Episerver Search for searching across very large amounts of data right? For the most part yes. When a customer searches by a keyword and goes through a couple of pages of results, that’s normal behaviour.

What I’m thinking of here is code that runs in custom jobs using Episerver Search, for example a Google Shopping feed scheduled job. The problem is that such a job normally traverses every product in the catalog, and the Skip and Take methods are internally inefficient for this purpose. If your batch size is 1,000 products and you are on step 50, that Skip(50000) still has to work through all of the preceding documents. This puts quite an intensive load on the Search cluster and can affect overall performance.

If it's possible to avoid Search and work directly against the Commerce API, I would advocate that approach first, provided it's performant enough for your needs.

The recommended approach with Search instead is to use a GreaterThan() filter on a sortable field alongside Take(), so each batch picks up where the last one left off. For example,

searchQuery.Filter(x => x.PublishDate.GreaterThan(lastProcessedPublishDate));
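Putting that filter into a full batch loop might look roughly like this (a sketch only; the client, type names, and batch size are illustrative, and ties on PublishDate would need extra care in real code):

```csharp
// Keyset-style paging for a feed job: instead of Skip(n),
// advance a PublishDate cursor so Find never re-scans
// previously processed documents.
const int batchSize = 1000;
var lastPublishDate = DateTime.MinValue;

while (true)
{
    var batch = _searchClient.Search<ProductContent>()
        .Filter(x => x.PublishDate.GreaterThan(lastPublishDate))
        .OrderBy(x => x.PublishDate)
        .Take(batchSize)
        .GetContentResult();

    if (!batch.Any())
        break;

    ProcessBatch(batch); // e.g. write the items to the feed

    // Move the cursor forward for the next iteration.
    lastPublishDate = batch.Max(x => x.PublishDate);
}
```

Each query is now a cheap filtered Take() from the top of the sorted index, so the cost per batch stays flat no matter how deep into the catalog the job gets.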


I hope you found this informative and helpful.  Episerver Search is a very powerful and flexible product, but with great power comes… well you know the rest.
