I am looking for recommendations on any modules that you may find useful in a wide range of projects. Anything from improving the CMS editing experience to utility functions such as automatic image optimization.
DbLocalizationProvider => https://github.com/valdisiljuconoks/LocalizationProvider/
Gives editors a simpler way to manage translations, instead of having to care about XML files
BVN.404Handler => https://github.com/geta/404handler
404 handler for dead links. We also use it for simple URLs (mostly because I managed to break the simple URLs that Episerver provides :'))
Geta.SEO.Sitemaps => https://github.com/Geta/SEO.Sitemaps
For handling Sitemaps
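For reference, the generated output follows the standard sitemaps.org XML format. A minimal example (the URL and dates below are placeholders, not output from the plugin itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per published page; values here are placeholders -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```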
Gosso.EPiServerAddOn.QuickNavExtension => https://github.com/LucGosso/Gosso.EPiServerAddOn.QuickNavExtension
An Episerver add-on that adds menu items to the QuickNavigationMenu when logged in on the public site: a link to admin, a link to the content type, and a logout link.
POSSIBLE.RobotsTxtHandler => https://github.com/markeverard/POSSIBLE.RobotsTxtHandler
For handling robots.txt
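The end result is a standard robots.txt served from the site root. A typical example of the kind of content you would manage through the handler (the paths are placeholders; /episerver/ is the default UI path you would usually want to exclude):

```text
User-agent: *
Disallow: /episerver/
Disallow: /util/
Sitemap: https://www.example.com/sitemap.xml
```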
And probably some more, but don't remember right now :)
Actually, pretty much none that I use in all projects.
There are some great ones, but none that I feel is a must-have.
Worth noting that all the add-ons above are available on the Episerver Nuget feed: https://nuget.episerver.com/
Also, when it comes to image resizing, the two most popular options are ImageResizer.Plugins.EPiServerBlobReader (the classic, well-used image resizer) and ImageProcessor.Web.Episerver (the newer upstart approach to image resizing), both available on the Episerver NuGet feed.
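Both work by adding resize parameters to the image URL's query string. As a rough illustration (the image path is a placeholder; parameter names come from each library's querystring API, so check the docs for the full set):

```text
/globalassets/hero.jpg?width=600                      -> ImageProcessor.Web.Episerver
/globalassets/hero.jpg?width=600&height=400&mode=crop -> ImageResizer
```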
While both have benefits when initially developing an Episerver solution, they are quite significantly restricted when you want to implement a multi-site solution. If you think there is a possibility of expanding to multiple sites, then you are best off writing your own solution for these two functions.
Thanks for your comments. I am curious to know why you think POSSIBLE.RobotsTxtHandler is not suitable in a multi-site solution? I created the original version, called EpiRobots (for CMS 6), which was later upgraded by a colleague to work with Episerver 7+, and it certainly worked in multi-site scenarios. So I'd like to hear your feedback to see if the tool can be improved!
Also, they are open source, so instead of writing your own solution, improve them if they don't work with, for example, multi-site projects :)
The initial issue with the multi-site setup was the sitemap utility and its inflexibility when trying to configure both multiple maps for a single site and coverage across multiple sites. After investigation, our development team decided that the sitemap plugin was not suited to our needs, and our approach was to develop our own solution that met the needs of our site admins.
When we were then approached by our site administrators regarding the robots.txt plugin, they stated that it did not meet their requirements for a multi-site setup. We opted to extend what we had developed for sitemap management to also cover the robots.txt solution, as this was a fairly quick piece of development to undertake.
As a development team we felt that the original robots.txt solution worked, but ultimately we are accountable to our customers: the administrators and editors of our site. Their main issue was that the user interface is a single text entry field; they wanted a solution that allowed them to select sections of the site from a tree structure and have those added to the robots.txt file. Then, if a section of the site was renamed or removed, the robots.txt would be updated automatically rather than having to be edited manually.
As we work for a single company, rather than being part of a software house, we are restricted by company policy around contributing to open-source solutions.
Further to my previous note, one issue we did notice with the robots.txt utility was the inability to restrict which users could edit the content of the file for a given site. We wanted to have the following setup in place:
Thanks for the feedback Darren - all useful thoughts and ideas for a potential future version of the robots.txt handler!