
dada
Jun 4, 2025

Avoid Deep Indexing of ContentAreas Unless Necessary

In a typical Episerver CMS implementation, it’s common to encounter deeply nested structures — ContentAreas containing Blocks, which themselves contain more ContentAreas and Blocks, potentially going multiple levels deep.
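To illustrate how that nesting arises, here is a minimal sketch of two content types where a block exposes its own ContentArea. The type names, property names, and GUIDs (TeaserBlock, NestedContent, StandardPage, MainContentArea) are hypothetical and only meant to show the recursive structure:

using EPiServer.Core;
using EPiServer.DataAnnotations;

// Hypothetical block type: a teaser that exposes its own ContentArea,
// so editors can drop further blocks inside it, and those blocks can
// contain ContentAreas of their own, and so on.
[ContentType(GUID = "9A1B2C3D-0000-0000-0000-000000000001")]
public class TeaserBlock : BlockData
{
    public virtual string Heading { get; set; }

    // Nesting point: level 2, 3, 4 ... of the structure described above.
    public virtual ContentArea NestedContent { get; set; }
}

// Hypothetical page type holding the top-level ContentArea (level 1).
[ContentType(GUID = "9A1B2C3D-0000-0000-0000-000000000002")]
public class StandardPage : PageData
{
    public virtual ContentArea MainContentArea { get; set; }
}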

The Problem with Deep Nesting in Indexing

By default, when indexing ContentAreas, no sensible maximum nesting depth is enforced; only the default JSON serialization maximum depth of 25 applies. This can lead to:

  • Increased strain on the search service

  • Longer indexing times

  • Larger mapping sizes

  • Degraded query performance

If you don’t specifically need to filter or retrieve deeply nested content, it’s a good idea to limit how deep the indexing goes.

How to Limit ContentArea Depth

This can be controlled using the MaxDepthContentAreaConverter. You can configure the maximum depth by adding the following code to your initialization module:

SearchClient.Instance.Conventions.ForInstancesOf<ContentArea>().ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));

This example limits the indexing depth to 1 level, but you can adjust the value based on your needs.
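To show the convention in context, the sketch below places that line inside a hypothetical initialization module. The module name is made up, the namespace of MaxDepthContentAreaConverter may differ between Search & Navigation versions, and you may prefer to depend on the Find indexing module rather than the CMS initialization module:

using EPiServer.Core;
using EPiServer.Find;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;
// MaxDepthContentAreaConverter ships with the Find CMS integration;
// adjust the using directive if the namespace differs in your version.

[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class FindIndexingConventionsModule : IInitializableModule
{
    public void Initialize(InitializationEngine context)
    {
        // Serialize ContentAreas only one level deep when indexing;
        // nested ContentAreas below that level are not written to the index.
        SearchClient.Instance.Conventions
            .ForInstancesOf<ContentArea>()
            .ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));
    }

    public void Uninitialize(InitializationEngine context) { }
}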

This information is also available in our documentation:
https://world.optimizely.com/documentation/Items/Developers-Guide/EPiServer-Find/11/Integration/episerver-77-5-cms/Indexing-content-in-a-content-area/

Important Note on SearchText

Keep in mind that this setting only affects the serialization depth of ContentAreas for indexing purposes. It does not impact the maximum depth used by SearchText, which collects, concatenates, and indexes the textual content of ContentAreas and their nested items. This means the text in your ContentArea will still be searchable.
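For example, a plain free-text query should still match text that editors placed in deeply nested blocks, assuming a standard Search & Navigation setup; the query string below is just an illustration:

using EPiServer.Core;
using EPiServer.Find;
using EPiServer.Find.Cms;

// Free-text search matches against SearchText, which still includes text
// gathered from nested ContentArea items even when the converter above
// limits how deep ContentAreas themselves are serialized.
var results = SearchClient.Instance
    .Search<PageData>()
    .For("annual report")   // hypothetical search phrase
    .GetContentResult();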

 


Comments

Manoj Kumawat | Jun 10, 2025 06:38 AM

It's not a feature I enjoy modifying, but I have to when I need something outside the content areas covered by Search and Navigation. It's good you pointed out that depth might be the cause of the slow queries.

dada | Aug 13, 2025 07:20 AM

Thanks for your feedback Manoj. This will eventually be changed in the product so that the default serialization depth of content areas is more sensible.
