Hi, it should be OK to build such a solution, but you need to limit the way editors retrieve content. Also make sure your code works with large amounts of content. I would suggest that you have a script generate this amount of content before you release the website, and test with it.
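For example, here is a minimal sketch of such a content generator. It assumes a hypothetical REST endpoint for page creation (`/api/pages`) and a bearer token; the actual API and payload shape will depend on your CMS:

```python
# Bulk-generate dummy pages on a staging site before launch.
# Assumptions (not from this thread): the CMS exposes a REST
# endpoint for page creation and accepts a bearer token.
import requests

API_URL = "https://staging.example.com/api/pages"   # hypothetical endpoint
API_TOKEN = "replace-with-a-real-token"             # hypothetical auth

def create_test_pages(site_count=500, pages_per_site=30):
    """Create site_count * pages_per_site dummy pages (15,000 by default)."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {API_TOKEN}"
    for site in range(site_count):
        for page in range(pages_per_site):
            payload = {
                "title": f"Microsite {site} / Page {page}",
                "body": "Lorem ipsum " * 200,  # roughly realistic body size
                "parent": f"microsite-{site}",
            }
            response = session.post(API_URL, json=payload, timeout=30)
            response.raise_for_status()

if __name__ == "__main__":
    create_test_pages()
```

Running this against a staging copy of the site lets you measure page-load times at realistic scale before any real editors touch it.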
Of course, a page that shows a lot of content from other parts of the website might become slow, so looking at ways to cache your data is important.
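As a rough illustration of that caching idea, here is a minimal time-to-live cache sketch in Python. `fetch_shared_content` is a hypothetical stand-in for the expensive CMS lookup (e.g. a Fetch data call); a real site would more likely use the CMS's own cache or something like Redis:

```python
# Cache shared "Master site" content with a time-to-live so each
# microsite page does not refetch it on every request.
import time

_cache = {}        # key -> (expiry_timestamp, value)
CACHE_TTL = 300    # seconds; tune to how often shared content changes

def fetch_shared_content(key):
    """Placeholder for the expensive CMS lookup."""
    return f"content for {key}"

def get_shared_content(key):
    """Return cached content if still fresh, otherwise refetch and cache it."""
    now = time.time()
    entry = _cache.get(key)
    if entry is not None and entry[0] > now:
        return entry[1]
    value = fetch_shared_content(key)
    _cache[key] = (now + CACHE_TTL, value)
    return value
```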
Thanks, Eric. We will certainly look at testing the solution as suggested, although I now think we won't be using Fetch data quite so much.
15,000 pages refers to the number of site pages that can be seen by visitors to the site, not the actual number of pages/Composer blocks that will be created to manage the content.
We are building over 500 microsites, with about 30 pages each (roughly 500 × 30 = 15,000 pages), that will all share around 80% of their content with a "Master site" that will sit at the start (home page) of the tree structure.
We will be sharing the content by creating a library of assets. In this library there will be some:
Approximately 15 pages of the "Master site" will be the same on every site, so we expect to use Fetch data at least 7,500 times (500 sites × 15 shared pages).
Has anyone else built anything similar, or used similar methods to build a complex site and come across any performance issues?
Thanks
Karen