
dada | Jun 4, 2025

Avoid Deep Indexing of ContentAreas Unless Necessary

In a typical Episerver CMS implementation, it’s common to encounter deeply nested structures — ContentAreas containing Blocks, which themselves contain more ContentAreas and Blocks, potentially going multiple levels deep.
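To make the nesting concrete, here is a minimal sketch of a block type that exposes its own ContentArea (the TeaserBlock name, GUID, and properties are hypothetical and purely illustrative):

using System.ComponentModel.DataAnnotations;
using EPiServer.Core;
using EPiServer.DataAnnotations;

// Hypothetical block type showing how nesting arises: the block exposes its
// own ContentArea, and any block dropped into it can in turn contain another
// ContentArea, so the tree can grow several levels deep.
[ContentType(DisplayName = "Teaser Block", GUID = "f3b1a2c4-0000-4000-8000-000000000001")]
public class TeaserBlock : BlockData
{
    [Display(Name = "Heading")]
    public virtual string Heading { get; set; }

    // Editors can place further blocks here, each of which may have
    // its own ContentArea.
    [Display(Name = "Related content")]
    public virtual ContentArea RelatedContent { get; set; }
}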

The Problem with Deep Nesting in Indexing

By default, when indexing ContentAreas, no sensible maximum nesting depth is enforced; the only limit is the default JSON serialization maximum depth of 25. This can lead to:

  • Increased strain on the search service

  • Longer indexing times

  • Larger mapping sizes

  • Degraded query performance

If you don’t specifically need to filter or retrieve deeply nested content, it’s a good idea to limit how deep the indexing goes.

How to Limit ContentArea Depth

This can be controlled using the MaxDepthContentAreaConverter. You can configure the maximum depth by adding the following code to your initialization module:

SearchClient.Instance.Conventions.ForInstancesOf<ContentArea>().ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));

This example limits the indexing depth to 1 level, but you can adjust the value based on your needs.
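As a point of reference, here is a minimal sketch of what that registration could look like inside an initialization module. The module name and the ModuleDependency are assumptions, and the exact namespaces may vary slightly between Find / Search & Navigation versions; adapt it to whatever initialization module your solution already has:

using EPiServer.Core;
using EPiServer.Find.Cms;
using EPiServer.Find.Framework;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;

// Sketch of an initialization module that registers the converter.
[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class FindIndexingConventionsInitialization : IInitializableModule
{
    public void Initialize(InitializationEngine context)
    {
        // Serialize ContentAreas at most one level deep when indexing.
        SearchClient.Instance.Conventions
            .ForInstancesOf<ContentArea>()
            .ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));
    }

    public void Uninitialize(InitializationEngine context)
    {
        // No teardown needed for this convention.
    }
}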

This information is also available in our documentation:
https://world.optimizely.com/documentation/Items/Developers-Guide/EPiServer-Find/11/Integration/episerver-77-5-cms/Indexing-content-in-a-content-area/

Important Note on SearchText

Keep in mind that this setting only affects the serialization depth of ContentAreas for indexing purposes. It does not impact the maximum depth used by SearchText, which is responsible for collecting, concatenating, and indexing the textual content of ContentAreas and their nested items. This means the content in your ContentArea will still be searchable.
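For example, assuming the converter above is registered with a depth of 1, a free-text query along these lines (a sketch using the standard unified search API) should still find a phrase that only occurs inside a deeply nested block, because SearchText gathers that text independently of the ContentArea serialization depth:

using EPiServer.Find;
using EPiServer.Find.Framework;

// Free-text search matches against SearchText, which still includes
// text collected from nested ContentAreas and their blocks.
var hits = SearchClient.Instance
    .UnifiedSearchFor("phrase that only appears in a nested block")
    .GetResult();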

 


Comments

Manoj Kumawat | Jun 10, 2025 06:38 AM

It's not a feature I enjoy modifying, but I have to when I need something outside the content areas covered by Search and Navigation. It's good you pointed out that depth might be the cause of the slow queries.

dada | Aug 13, 2025 07:20 AM

Thanks for your feedback Manoj. This will eventually be changed in the product so that the default serialization depth of content areas is more sensible.
