I'm seeking some advice. I'm working on a Commerce solution (newest version) where we're using the CSV import to import products into the catalog.
Now, however, the client wants us to auto-generate the Seo Information on import, and I'm struggling to find a proper entry point for this.
When doing the CSV import, are there save events I can hook into? Or will I need to create an update function that runs afterwards and can be called manually to (bulk) update the information? And if so, should I run it as a direct SQL script? I think fetching all products just to save them again afterwards might be too cumbersome.
Thanks in advance for feedback. Regards
If you want to auto-generate SeoUri and UriSegment, you can implement your own UniqueSeoGenerator and register it to override the default implementation.
Here's a starting point: http://world.episerver.com/blogs/Son-Do/Dates/2015/3/extend-uniqueseogenerator-to-replace-seo-url-generation-in-episerver-commerce-8-9/
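For reference, the pattern from that post looks roughly like this. This is a sketch only: I'm going from memory of the Commerce 8/9 era the post covers, so the exact base-class method signature and the registration mechanism are assumptions you should verify against your version's assemblies.

```csharp
// Sketch, not a drop-in implementation. UniqueSeoGenerator lives in the
// Commerce assemblies; the GenerateSeoUri signature shown here is assumed
// from the linked blog post and may differ in your version.
public class CustomSeoGenerator : UniqueSeoGenerator
{
    public override string GenerateSeoUri(string name, string languageCode)
    {
        // Apply your own naming convention here, falling back to the default.
        return base.GenerateSeoUri(name, languageCode);
    }
}

// Registration would typically go in an IConfigurableModule, e.g.
// (again, assumed, not verified):
// context.Services.AddTransient<UniqueSeoGenerator, CustomSeoGenerator>();
```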
Hi. Thanks for the reply. I've found (and implemented) that for the URL, but I also need, for example, the Meta Description, so the URL handler alone is unfortunately not enough for me.
Is there a similar function for the other fields in Seo Information?
Yeah, I know (and understand) the risk, but sometimes doing bulk operations in a non-bulk environment means you'll have to consider... "short-cuts" and alternative routes.
And this time the question is whether it's possible to do something with Seo Information (meta description) during/after a CSV import in bulk, or whether I'll have to consider one of the alternative routes to solve my issue.
Where do you get the CSV from?
As a side note, we generally write our own custom imports. Even when we don't run them in bulk, we're free to include all the customer's business logic, which is quite advantageous.
It's client-generated, and the client uploads it to the CSV import in Commerce Manager. I would also have preferred to write my own importer to begin with, but it was deemed (at the time of implementation) that the built-in CSV importer would be the better approach for a number of reasons, which now conflicts with the client's current wishes.
That's why I'm trying to gauge which approach will be the easiest/fastest and solve most of their issues, because a perfect solution does not exist. For example, if there were events I could hook into within the CSV importer when it maps data, or before it writes the data, that would have been a possible route.
But it increasingly looks like I'll try to fix this purely in the display phase and simply not persist the data in the data model, with the disadvantages that might bring from an administration viewpoint.
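If I go the display-phase route, the idea would be a small fallback helper at render time, something like the sketch below. This is purely illustrative: the field names and the 160-character cutoff are my own assumptions, not anything from the Commerce API.

```csharp
// Illustrative fallback: build a meta description at render time when none
// was persisted during import. No Episerver APIs involved.
public static class SeoFallback
{
    public static string MetaDescription(string stored, string productName, string shortDescription)
    {
        // Prefer an editor-supplied value if one exists.
        if (!string.IsNullOrWhiteSpace(stored))
            return stored;

        // Otherwise generate one, truncated to ~160 chars (a common
        // meta-description length limit; adjust to taste).
        var generated = $"{productName} - {shortDescription}";
        return generated.Length <= 160
            ? generated
            : generated.Substring(0, 157) + "...";
    }
}
```

The downside, as noted above, is that editors never see the generated value in Commerce Manager, since nothing is written back to the data model.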
I was writing something, but Erik was faster than me in saying that you should not use direct database manipulation. Yes, we advise against that; it can be very risky.
One option is to update the SEO information in a batch. This can be done (or more precisely, has to be done) using ICatalogSystem.
This is one way to do it (pseudo code, some modifications might be needed):

var options = new CatalogSearchOptions { RecordsToRetrieve = 500 }; // fetch 500 catalog entries per page
var searchParams = new CatalogSearchParameters();
int totalCount = 0;
int recordsCount = 0;

do
{
    options.StartingRecord = recordsCount;
    var catalogEntriesDto = CatalogContext.Current.FindItemsDto(
        searchParams,
        options,
        ref totalCount,
        new CatalogEntryResponseGroup(CatalogEntryResponseGroup.ResponseGroup.Info));

    foreach (var catalogItemSeo in catalogEntriesDto.CatalogItemSeo)
    {
        // Update the SEO fields you need here, e.g. the meta description.
    }

    // Save the modified DTO back, e.g. via ICatalogSystem.SaveCatalogEntry(catalogEntriesDto).

    recordsCount += options.RecordsToRetrieve;
} while (recordsCount < totalCount);
I'll take a look at that approach; it seems like a plausible way to go. Thanks for the info.
Do you know if there's an event fired after a CSV import, so this could run automatically once an import finishes?