Does anyone know how to "slow down" the reindex job?

We are having a problem with the re-indexing job; it keeps returning this error:

EPiServer.Find.ServiceException: The remote server returned an error: (429) Too Many Requests.

We get this error when we try to remove custom objects from the index that should not be indexed anymore. There seem to be a lot of them, and we use this code to remove them:

SearchClient.Instance.Delete(id);

And that code seems to generate a new request for every object.
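To illustrate, the removal is basically a loop like this (idsToRemove is just a placeholder name for our collection), so every object results in its own request to the Find service:

// Rough sketch of the current approach: one request per object
foreach (var id in idsToRemove)
{
    SearchClient.Instance.Delete(id);
}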

So: does anyone know a way to slow down the job?

If that is not possible, I will need to write a special job that does this as a bulk operation.

#114220
Dec 08, 2014 8:25

I'm guessing you're doing the delete in a loop, since you are passing in IDs.

If you're removing all objects of the type EducationInfoToIndex you could do a .Delete<EducationInfoToIndex>(x => x.id.GreaterThan(0)); that way you would only send one request to remove them all.

Otherwise I guess you could just add a wait in the loop. I think the limit is 50 requests/sec, if I remember correctly.
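Roughly like this (the type and id field are from your example, and the request limit is only what I remember, so treat the numbers as a guess):

// Option 1: remove all objects of the type with a single delete-by-filter request
SearchClient.Instance.Delete<EducationInfoToIndex>(x => x.id.GreaterThan(0));

// Option 2: keep the per-object loop but throttle it to stay under the request limit
foreach (var id in idsToRemove)
{
    SearchClient.Instance.Delete(id);
    System.Threading.Thread.Sleep(25); // at most ~40 requests/sec
}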

#114221
Edited, Dec 08, 2014 8:33

I am doing it inside the reindex job with:

ContentIndexer.Instance.Conventions.ForInstancesOf<PageTypeName>().ShouldIndex(FunctionName);

And inside FunctionName I check whether the object has expired or the editor has added a NOINDEX tag to it.
So it is inside a loop, that is true.

It is a little complicated: it is a Page, but I do not index the page itself, since it needs to index a lot of other stuff as well.
I am now looking into whether there is any way to include custom data in the page during the indexing process, so I can use the built-in stuff.
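For reference, FunctionName looks roughly like this (the property names are specific to our page type, so consider it a sketch):

private static bool FunctionName(PageTypeName page)
{
    // Do not index pages that have passed their stop publish date
    if (page.StopPublish != null && page.StopPublish < DateTime.Now)
        return false;

    // Do not index pages where the editor has set the NOINDEX flag
    if (page.NoIndex)
        return false;

    return true;
}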

#114222
Edited, Dec 08, 2014 8:38

// Do not index pages with HideFromSearch set or in the archive
ContentIndexer.Instance.Conventions.ForInstancesOf<BasePageData>().ShouldIndex(x =>
{
    var shouldIndex = !x.HideFromSearch;
    if (x.StartPage.ArchivePage != null && x.Ancestors().Contains(x.StartPage.ArchivePage.ID.ToString()))
        shouldIndex = false;

    if (!shouldIndex)
    {
        try
        {
            ContentIndexer.Instance.Delete(x);
        }
        catch
        {
            // ignore
        }
    }

    return shouldIndex;
});

Not sure about your setup, but I'm running this code and I'm not getting the too many requests errors. Have you set the batch size for the indexer?

ContentIndexer.Instance.ContentBatchSize = 25;

(Could be that your Azure server is too quick? ;) )
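For what it's worth, I set the batch size and the convention from an initialization module, roughly like this (the module name and the module dependency are just an example of how I wire it up):

using EPiServer.Find.Cms;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;

[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class FindIndexingInitialization : IInitializableModule
{
    public void Initialize(InitializationEngine context)
    {
        // Send content to the Find service in smaller batches
        ContentIndexer.Instance.ContentBatchSize = 25;

        // Register the ShouldIndex convention (full version in the snippet above)
        ContentIndexer.Instance.Conventions.ForInstancesOf<BasePageData>()
            .ShouldIndex(x => !x.HideFromSearch);
    }

    public void Uninitialize(InitializationEngine context) { }

    public void Preload(string[] parameters) { }
}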

#114225
Dec 08, 2014 9:06

I solved this now by rewriting my object so it is an IContent, and then I was able to use EPiServer's built-in bulk functions.

Thanks Petter for the help pointing me in the right direction!

#114367
Edited, Dec 11, 2014 7:58