
dada
Jun 4, 2025

Avoid Deep Indexing of ContentAreas Unless Necessary

In a typical Episerver CMS implementation, it’s common to encounter deeply nested structures — ContentAreas containing Blocks, which themselves contain more ContentAreas and Blocks, potentially going multiple levels deep.
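
To picture where this nesting comes from, here is a minimal sketch with hypothetical block types (TeaserBlock and SectionBlock are invented names, not part of any real project): a block that exposes its own ContentArea can be dropped into another block's ContentArea, and so on.

using EPiServer.Core;
using EPiServer.DataAnnotations;

// A simple leaf block holding plain text.
[ContentType(GUID = "11111111-1111-1111-1111-111111111111")]
public class TeaserBlock : BlockData
{
    public virtual string Heading { get; set; }
}

// A block that exposes its own ContentArea. Dropping SectionBlocks into
// other SectionBlocks creates the deep nesting the indexer has to serialize.
[ContentType(GUID = "22222222-2222-2222-2222-222222222222")]
public class SectionBlock : BlockData
{
    public virtual ContentArea Items { get; set; }
}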

The Problem with Deep Nesting in Indexing

By default, when indexing ContentAreas, no sensible maximum nesting depth is enforced; the only limit is the default JSON serialization maximum depth of 25. This can lead to:

  • Increased strain on the search service

  • Longer indexing times

  • Larger mapping sizes

  • Degraded query performance

If you don’t specifically need to filter or retrieve deeply nested content, it’s a good idea to limit how deep the indexing goes.

How to Limit ContentArea Depth

This can be controlled using the MaxDepthContentAreaConverter. You can configure the maximum depth by adding the following code to your initialization module:

SearchClient.Instance.Conventions
    .ForInstancesOf<ContentArea>()
    .ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));

This example limits the indexing depth to 1 level, but you can adjust the value based on your needs.
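
If you are wiring this up from scratch, a minimal sketch of such an initialization module could look like the following. The module name FindIndexingInitialization is just an example, and the exact namespaces may vary slightly between Find / Search & Navigation versions.

using EPiServer.Core;
using EPiServer.Find.Cms;
using EPiServer.Find.Framework;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;

[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class FindIndexingInitialization : IInitializableModule
{
    public void Initialize(InitializationEngine context)
    {
        // Serialize ContentAreas only one level deep when indexing.
        SearchClient.Instance.Conventions
            .ForInstancesOf<ContentArea>()
            .ModifyContract(x => x.Converter = new MaxDepthContentAreaConverter(1));
    }

    public void Uninitialize(InitializationEngine context)
    {
    }
}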

This information is also available in our documentation:
https://world.optimizely.com/documentation/Items/Developers-Guide/EPiServer-Find/11/Integration/episerver-77-5-cms/Indexing-content-in-a-content-area/

Important Note on SearchText

Keep in mind that this setting only affects the serialization depth of ContentAreas for indexing purposes. It does not affect the maximum depth used by SearchText, which collects, concatenates, and indexes the textual content of ContentAreas and their nested items. This means the content in your ContentAreas will still be searchable as free text.
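
As a quick sketch (ArticlePage is a hypothetical page type), a free-text query will still match text that SearchText has gathered from blocks nested inside a ContentArea, even with the converter depth set to 1:

// Free-text search still matches text collected by SearchText from
// nested blocks, regardless of the ContentArea serialization depth.
var results = SearchClient.Instance
    .Search<ArticlePage>()
    .For("annual report")
    .GetContentResult();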

 


Comments

Manoj Kumawat, Jun 10, 2025 06:38 AM

It's not a feature I enjoy modifying, but I have to when I need something outside the content areas covered by Search and Navigation. It's good you pointed out that depth might be the cause of the slow queries.

dada, Aug 13, 2025 07:20 AM

Thanks for your feedback Manoj. This will eventually be changed in the product so that the default serialization depth of content areas is more sensible.
