We have a problem with the re-indexing job, which throws this error:
EPiServer.Find.ServiceException: The remote server returned an error: (429) Too Many Requests.
We get this error when we try to remove custom objects from the index that should not be indexed anymore. There seem to be a lot of them, and we use this code to remove them:
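The removal is, roughly, a per-id delete loop; a minimal sketch, assuming SearchClient.Instance, an EducationInfoToIndex type, and an idsToRemove collection (all names here are assumptions, not the exact code):

```csharp
// Sketch: deleting one document per call.
// Each Delete call is a separate HTTP request to the Find REST API,
// which is what can eventually trigger 429 Too Many Requests.
var client = SearchClient.Instance;
foreach (var id in idsToRemove)
{
    client.Delete<EducationInfoToIndex>(id);
}
```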
And that code seems to generate a new request for every object.
So: does anyone know a way to slow down the job?
If that is not possible, I will need to write a special job that does this as a bulk operation.
I'm guessing you're doing the delete in a loop, since you are sending in ids.
If you're removing all objects of the type EducationInfoToIndex, you could do a .Delete<EducationInfoToIndex>(x => x.id.GreaterThan(0)); that way you would only issue one request to remove them all.
Otherwise I guess you could just add a wait in the loop. I think the limit is 50 requests/sec, if I remember correctly.
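A crude way to stay under that limit is to sleep between deletes; a sketch, assuming a per-id delete loop with hypothetical EducationInfoToIndex and idsToRemove names:

```csharp
// Sketch: throttle the delete loop to stay well under ~50 requests/sec.
var client = SearchClient.Instance;
foreach (var id in idsToRemove)
{
    client.Delete<EducationInfoToIndex>(id);
    System.Threading.Thread.Sleep(25); // caps the loop at ~40 requests/sec
}
```

It is blunt, but it avoids restructuring the job while you look for a bulk approach.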
I am doing it inside the reindex job by:
And inside FunctionName I check whether the object has expired or whether the editor has added a NOINDEX tag on it. So it is inside a loop, that is true. It is a little complicated, since it is a page, but I do not index the main page itself; the job needs to index a lot of other stuff as well. I am now looking into whether there is any way to include custom data on the page during the indexing process, so that I can use the built-in functionality.
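Find does let you add computed values to an indexed type through its conventions. A sketch, where EducationPage, HasExpired, HasNoIndexTag and ShouldBeSearchable are all hypothetical names standing in for the expiry/NOINDEX check:

```csharp
// Sketch: adding custom data to a page at indexing time via conventions.
public static class PageIndexingExtensions
{
    // Hypothetical extension method representing the expiry/NOINDEX check.
    public static bool ShouldBeSearchable(this EducationPage page)
    {
        return !page.HasExpired && !page.HasNoIndexTag;
    }
}

// Registered once at startup, e.g. in an initialization module:
SearchClient.Instance.Conventions
    .ForInstancesOf<EducationPage>()
    .IncludeField(x => x.ShouldBeSearchable());
```

The computed value is then serialized into the index document, so the check runs at indexing time instead of in your own loop.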
// Do not index pages with HideFromSearch set or in the archive
var shouldIndex = !x.HideFromSearch;
if(x.StartPage.ArchivePage != null && x.Ancestors().Contains(x.StartPage.ArchivePage.ID.ToString()))
shouldIndex = false;
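For context, a check like this typically sits in a ShouldIndex convention registered at startup; a sketch, assuming a SitePageData base type and the properties used above (the type model is an assumption):

```csharp
// Sketch: the HideFromSearch/archive check wired into the indexing conventions,
// so the indexer itself skips these pages.
ContentIndexer.Instance.Conventions
    .ForInstancesOf<SitePageData>()
    .ShouldIndex(x =>
    {
        var shouldIndex = !x.HideFromSearch;
        if (x.StartPage.ArchivePage != null &&
            x.Ancestors().Contains(x.StartPage.ArchivePage.ID.ToString()))
        {
            shouldIndex = false;
        }
        return shouldIndex;
    });
```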
Not sure about your setup, but I'm running this code and not getting the too-many-requests errors. Have you set the batch size for the indexer?
ContentIndexer.Instance.ContentBatchSize = 25;
(Could it be that your Azure server is too quick? ;) )
I solved this now by rewriting my object so it is an IContent, which lets me use EPiServer's built-in bulk functions.
Thanks, Petter, for pointing me in the right direction!