DDS can of course handle it. It's slower than normal database access, yes, but with the right techniques (a separate table for the data, proper caching, etc.) it should not be a problem. However, I don't know the implementation details of SEO Mogul, so I can't really say whether it can handle those 10,000 rows or not.
Hi! As Quan noted, DDS can perform well enough if you don't need the data fast and use the right techniques. Assuming SEO Mogul uses a separate DDS table and caches all the entries, you should be fine. However, if it tries to look up each URL from the DDS on every request, it's going to be slow as hell, especially if it uses the default DDS table. Besides the size of the specific collection, the overall DDS row count matters too. In our project we found that the default DDS starts getting slow at around 1,500 rows. We had a collection with about 6,500 rows (40,000 total in the DDS table) and it took more than 20 seconds to get any data out of it, so I reimplemented the logic to use a separate table (not DDS) and now it takes only 16 ms.

So first off, I would check how SEO Mogul is implemented and whether it can be customized if needed. If it doesn't look good, I would rather implement my own solution.
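For what it's worth, caching all the entries is the part that really matters. Below is a minimal sketch of that idea, assuming a custom DDS store; the RedirectRule type, its properties and the RedirectCache class are illustrative names I made up, not anything from SEO Mogul. The point is to load the whole store once and answer per-request lookups from an in-memory dictionary instead of querying the DDS every time.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using EPiServer.Data;
using EPiServer.Data.Dynamic;

// Illustrative redirect entry stored in its own DDS store (not the default big-table store).
[EPiServerDataStore(AutomaticallyCreateStore = true, AutomaticallyRemapStore = true)]
public class RedirectRule : IDynamicData
{
    public Identity Id { get; set; }
    public string OldUrl { get; set; }
    public string NewUrl { get; set; }
}

public static class RedirectCache
{
    // All rules loaded once and kept in memory, keyed by the old URL.
    private static readonly Lazy<ConcurrentDictionary<string, string>> Rules =
        new Lazy<ConcurrentDictionary<string, string>>(LoadAll);

    private static ConcurrentDictionary<string, string> LoadAll()
    {
        var store = typeof(RedirectRule).GetOrCreateStore();

        // One full read up front instead of one DDS query per request.
        // Assumes old URLs are unique; add your own conflict handling if they aren't.
        var map = store.Items<RedirectRule>()
                       .ToList()
                       .ToDictionary(r => r.OldUrl, r => r.NewUrl,
                                     StringComparer.OrdinalIgnoreCase);

        return new ConcurrentDictionary<string, string>(map, StringComparer.OrdinalIgnoreCase);
    }

    // Per-request lookup: dictionary access only, no DDS round trip.
    public static bool TryGetTarget(string oldUrl, out string newUrl) =>
        Rules.Value.TryGetValue(oldUrl, out newUrl);
}
```

The same dictionary-lookup approach works regardless of whether the rows live in the DDS or in your own indexed SQL table; the storage choice mostly affects how long the initial load and any admin queries take.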
I'm guessing you mean SEO Manager from Mogul. I did some heavy performance optimization work on it a few years back, and I haven't seen any performance issues since then, even for very big sites with 100,000+ pages in multiple languages.
Worth noting is that it only calls the DDS if it can't map the request to an existing Episerver URL, so it won't do that on every request, only for requests to old URLs, and those are usually pretty few.
It also has a cache layer on top of the DDS, so for 301s the first call might take a few hundred ms, but the rest will be instant.
Also worth noting is that it supports regex mappings too. If you have 10,000 news articles that you simply want to redirect to a news listing page, that's a single regex redirect in the DDS, something along the lines of the sketch below.
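To illustrate the idea only: the URL structure and the way the rule is registered are assumptions on my part, and the pattern syntax SEO Manager actually expects may differ. This is just a plain .NET regex showing how one rule can replace 10,000 individual redirects.

```csharp
using System.Text.RegularExpressions;

public static class NewsRedirect
{
    // One compiled pattern covers every old article URL,
    // e.g. /news/2015/06/some-article/ -> /news/
    private static readonly Regex OldNewsUrl =
        new Regex(@"^/news/\d{4}/.+$", RegexOptions.IgnoreCase | RegexOptions.Compiled);

    // Returns the listing page for any matching path, otherwise null (no redirect).
    public static string MapOldUrl(string path) =>
        OldNewsUrl.IsMatch(path) ? "/news/" : null;
}
```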
So you should be good. Otherwise let me know and we'll fix it :)
You can always import all your 10,000 URLs via Google Search Console / CSV and try it out.
We are using SEO Mogul, which under the hood uses DDS to store and handle redirect rules. We have some 10,000+ URLs that need to be entered as 301 redirects. Does anybody have a benchmark on how much DDS can handle without a significant performance impact?