<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom"><title type="text">Blog posts by Johan Antila</title><link href="http://world.optimizely.com" /><updated>2024-03-06T20:11:50.0000000Z</updated><id>https://world.optimizely.com/blogs/johan-antila/</id> <generator uri="http://world.optimizely.com" version="2.0">Optimizely World</generator> <entry><title>When Visual Studio Code Metrics fail</title><link href="https://world.optimizely.com/blogs/johan-antila/dates/2024/3/when-visual-studio-code-metrics-fail/" /><id>&lt;p&gt;Visual Studio Code Metrics that won&#39;t show up.&lt;/p&gt;
&lt;p&gt;At Optimizely Expert Services, we are often asked to do code reviews and assessments of customer solutions, and these typically involve using static code analysers and reading code bases. One good tool for finding complex parts of a solution is the Code Metrics tool in Visual Studio, which gives you an indication of hot spots in the code base. For example, it can show you the cyclomatic complexity of the code, generate something it calls a Maintainability Index, and list lines of code along with how much of that is executable code; the latter is especially handy for projects and namespaces that contain your views. Unfortunately, a few times I have run into a bug in the Visual Studio Roslyn engine that stops the Code Metrics from rendering in the Code Metrics window: instead of listing the namespaces and calculated metrics, the window is just blank. &lt;em&gt;In vain, I have struggled&lt;/em&gt; to find the reason behind this and to fix it for the specific solution, but I have come up empty handed. When it happened again this time and I went searching for a solution, one of the pages I came across mentioned an automated process for calculating the metrics, which led me to investigate whether there was a command line tool to do this that might work where the built-in one wouldn&#39;t. It turns out that Microsoft has indeed created such a tool as part of &lt;a href=&quot;https://github.com/dotnet/roslyn-analyzers&quot;&gt;Microsoft.CodeAnalysis.NetAnalyzers&lt;/a&gt;. You can either add a package reference to your project so you can target it using msbuild, or compile a standalone command line tool to do the calculation in a command shell. As the nuget package route didn&#39;t work for me, I opted for the latter and followed &lt;a href=&quot;https://learn.microsoft.com/en-us/visualstudio/code-quality/how-to-generate-code-metrics-data?view=vs-2019&quot;&gt;this guide.&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;With the tool compiled, start a Visual Studio Command Prompt and run it to generate an XML file:&amp;nbsp;&lt;/p&gt;
&lt;pre&gt;metrics.exe /project:myproject.csproj /out:report.xml&lt;/pre&gt;
&lt;p&gt;This generates an XML file with the code metrics in it, but that&#39;s not the Excel output the built-in tool generates. Now what? Well, you have to parse the XML file somehow, so I opted to convert it to JSON and iterate over a deserialized version of the file.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://github.com/johan-antila-episerver/ParseMetricsXml&quot;&gt;I wrote a small tool to do just that&lt;/a&gt;; it takes the XML file and outputs a CSV file that you can import into Excel. You can find it over at &lt;a href=&quot;https://github.com/johan-antila-episerver/ParseMetricsXml&quot;&gt;&lt;strong&gt;GitHub.&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;
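As an aside, the same flattening can be sketched without the JSON round-trip. Below is a minimal, illustrative Python sketch (not the C# tool linked above); the report shape it assumes, elements carrying a Metrics child with Metric name/value pairs, is a simplification, so adjust the element names to whatever your metrics.exe output actually contains.

```python
# Sketch: flatten a code-metrics XML report into CSV rows.
# The element/attribute names below are an assumption about the
# report shape; adjust them to match your actual metrics.exe output.
import csv
import io
import xml.etree.ElementTree as ET

# Illustrative sample only, not real metrics.exe output.
SAMPLE_REPORT = """\
<CodeMetricsReport Version="1.0">
  <Targets>
    <Target Name="myproject.csproj">
      <Assembly Name="MyProject">
        <Metrics>
          <Metric Name="MaintainabilityIndex" Value="87" />
          <Metric Name="CyclomaticComplexity" Value="412" />
        </Metrics>
      </Assembly>
    </Target>
  </Targets>
</CodeMetricsReport>
"""

def metrics_to_csv(xml_text):
    """Walk every element that carries a <Metrics> child and emit one
    CSV row per (scope tag, scope name, metric name, metric value)."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Scope", "Name", "Metric", "Value"])
    for scope in root.iter():
        metrics = scope.find("Metrics")
        if metrics is None:
            continue
        for metric in metrics.findall("Metric"):
            writer.writerow([scope.tag, scope.get("Name", ""),
                             metric.get("Name"), metric.get("Value")])
    return out.getvalue()

print(metrics_to_csv(SAMPLE_REPORT))
```

Redirect the printed output to a .csv file and import it into Excel.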
&lt;p&gt;One final note: to be able to parse the JSON in a compile time fashion, you need to capture the JSON content after it has been converted from XML, copy it, and use Visual Studio -&amp;gt; Edit -&amp;gt; Paste Special -&amp;gt; &lt;strong&gt;Paste JSON as Classes&lt;/strong&gt; to get classes that you can then use in the code.&lt;/p&gt;</id><updated>2024-03-06T20:11:50.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Azure DevOps and failing VSTest task</title><link href="https://world.optimizely.com/blogs/johan-antila/dates/2022/9/azure-devops-and-failing-vstest-task/" /><id>&lt;p&gt;Sometime during the summer, the test task in a customer&#39;s Azure DevOps CI/CD environment stopped working for no apparent reason. As it had been unchanged for more than a year, the problem looked like it was related to the Azure DevOps environment itself. Checking build logs, we could conclude that Microsoft updated the default build runner used in Azure DevOps sometime in August or perhaps late July 2022 (we lack builds from just before, so we don&#39;t know the exact date), and the update included a later version of the VSTest task with different behaviour than the previous one. It turned out that our test task picked up a lot of dlls targeting different frameworks that were not actual tests and should never have been included. This selection of dlls hadn&#39;t changed; the task had always picked up far more test dlls than it should have, all with different targets, but since that was never a problem with the previous runner, we didn&#39;t pay attention to it. Where the old version would ignore this situation, the newer version of the VSTest task fails when it picks up dlls with different framework targets, something it does not support, thus failing the build even though the tests it actually managed to run all passed. The solution is to specify a more restrictive search pattern than the default one that you get if you define the test task in your yaml file like this:&lt;/p&gt;
&lt;pre&gt;- task: VSTest@2&lt;br /&gt;&amp;nbsp; inputs:&lt;br /&gt;&amp;nbsp; &amp;nbsp; platform: &#39;$(buildPlatform)&#39;&lt;br /&gt;&amp;nbsp; &amp;nbsp; configuration: &#39;$(buildConfiguration)&#39;&lt;/pre&gt;
&lt;pre&gt;(If you don&#39;t specify a pattern, the task defaults to this search pattern: **\*test*.dll,!**\*TestAdapter.dll,!**\obj\**)&lt;/pre&gt;
&lt;p&gt;In my build, this would include all these files:&lt;/p&gt;
&lt;pre&gt;vstest.console.exe &quot;D:\a\1\s\Customer.Web\bin\EPiServer.Marketing.Testing.Core.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Customer.Web\bin\EPiServer.Marketing.Testing.Dal.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Customer.Web\bin\EPiServer.Marketing.Testing.Web.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Tests\Tests.Customer.Framework\bin\Release\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.Interface.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\pl\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\pl\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\pt\Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\pt\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\pt\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\ru\Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\ru\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\ru\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\tr\Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\tr\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br 
/&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\tr\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hans\Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hans\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hans\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hant\Microsoft.VisualStudio.TestPlatform.MSTest.TestAdapter.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hant\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\_common\zh-Hant\Microsoft.VisualStudio.TestPlatform.TestFramework.resources.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\netcoreapp1.0\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestAdapter.2.1.2\build\uap10.0\Microsoft.VisualStudio.TestPlatform.MSTestAdapter.PlatformServices.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\net45\Microsoft.VisualStudio.TestPlatform.TestFramework.Extensions.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\net45\Microsoft.VisualStudio.TestPlatform.TestFramework.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\netstandard1.0\Microsoft.VisualStudio.TestPlatform.TestFramework.Extensions.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\netstandard1.0\Microsoft.VisualStudio.TestPlatform.TestFramework.dll&quot;&lt;br 
/&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\uap10.0\Microsoft.VisualStudio.TestPlatform.TestFramework.Extensions.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\packages\MSTest.TestFramework.2.1.2\lib\uap10.0\Microsoft.VisualStudio.TestPlatform.TestFramework.dll&quot;&lt;/pre&gt;
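Looking at the listing above, the reason for the over-broad match is visible: the default pattern matches any dll whose name contains &quot;test&quot;, case-insensitively. A quick illustrative sketch in Python (fnmatch standing in for the task&#39;s own glob matching, applied to the file name part only and ignoring the negated !-patterns) shows how the default and the narrowed pattern treat a few of these file names:

```python
# Illustrative: why the default VSTest search pattern over-matches.
# fnmatch approximates the task's glob matching on the file name part
# only; the real patterns also carry a **\ directory prefix and
# negated !-entries, which this sketch ignores.
from fnmatch import fnmatch

candidates = [
    "EPiServer.Marketing.Testing.Core.dll",                   # not a test assembly
    "Microsoft.VisualStudio.TestPlatform.TestFramework.dll",  # not a test assembly
    "Tests.Customer.Web.dll",                                 # the real test assembly
]

default_pattern = "*test*.dll"          # from **\*test*.dll
narrow_pattern = "tests.customer*.dll"  # from **\tests.customer*.dll

for name in candidates:
    # the task matches case-insensitively, so compare lowercased names
    print(f"{name}: default={fnmatch(name.lower(), default_pattern)}, "
          f"narrow={fnmatch(name.lower(), narrow_pattern)}")
```

All three names match the default pattern, but only Tests.Customer.Web.dll matches the narrowed one.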
&lt;p&gt;After changing the way test assemblies are found like this:&lt;/p&gt;
&lt;pre&gt;- task: VSTest@2&lt;br /&gt;  inputs:&lt;br /&gt;    testSelector: &#39;testAssemblies&#39;&lt;br /&gt;    testAssemblyVer2: |&lt;br /&gt;      **\tests.customer*.dll&lt;br /&gt;    searchFolder: &#39;$(System.DefaultWorkingDirectory)&#39;&lt;/pre&gt;
&lt;p&gt;It now includes only the relevant assemblies:&lt;/p&gt;
&lt;pre&gt;vstest.console.exe &quot;D:\a\1\s\Tests\Tests.Customer.Framework\bin\Release\Tests.Customer.Framework.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Tests\Tests.Customer.Framework\obj\Release\Tests.Customer.Framework.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Tests\Tests.Customer.Web\bin\Release\Tests.Customer.Web.dll&quot;&lt;br /&gt;&quot;D:\a\1\s\Tests\Tests.Customer.Web\obj\Release\Tests.Customer.Web.dll&quot;&lt;br /&gt;&lt;br /&gt;&lt;/pre&gt;
&lt;p&gt;And with this, the tests now pass. Success! In other words, make sure you have a coherent naming scheme for your tests that is easy to pick up with a wildcard pattern, and make sure your test task only picks up those specific dlls. You can see the output of the VSTest task in the pipeline in Azure DevOps to determine which dlls it&#39;s trying to probe for tests.&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;</id><updated>2022-09-16T11:07:31.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Scripted testing of IIS URL Rewrite rules</title><link href="https://world.optimizely.com/blogs/johan-antila/dates/2019/9/scripted-testing-of-iis-url-rewrite-rules/" /><id>&lt;p&gt;On most websites today, you are likely to need to rewrite urls to match specific requirements, such as the typical SEO rules where all urls must end with a trailing slash and be lowercase to prevent duplicate content in the search engines, or a few hard coded redirects for old discontinued sites. While these rules are easy enough to test by themselves, as soon as you create a few more rules and they contain host names, testing them locally becomes a challenge.&lt;br /&gt;While it is possible to test the rules locally in the IIS Administration tool, this is only an atomic test of a single rule, not a test of the entire redirect pipeline. If you introduce one bad rule, you can&#39;t tell whether it breaks the other redirects unless you request the urls with a browser through IIS, and with a lot of urls this becomes increasingly impractical. In the scenario where you aren&#39;t handling the go-live yourself and haven&#39;t been able to test the rules locally, you end up having to write these same rules as a human readable script for whoever is doing the deploy/rollback to test after applying the change. If only there was a way to script this. Well, there is.&lt;br /&gt;Using &lt;strong&gt;&lt;a href=&quot;https://nodejs.org/en/&quot;&gt;nodejs&lt;/a&gt;&lt;/strong&gt; with &lt;strong&gt;&lt;a href=&quot;https://github.com/GoogleChrome/puppeteer&quot;&gt;puppeteer&lt;/a&gt;&lt;/strong&gt;, you can easily create a test script in which you define all the urls you want to test and see if they redirect as expected.&lt;/p&gt;
&lt;p&gt;First you need to &lt;strong&gt;&lt;a href=&quot;https://nodejs.org/en/&quot;&gt;install nodejs&lt;/a&gt;&lt;/strong&gt; if you haven&#39;t already done so. Then, in a console window, &lt;strong&gt;&lt;a href=&quot;https://developers.google.com/web/tools/puppeteer/get-started&quot;&gt;install puppeteer&lt;/a&gt;&lt;/strong&gt;, a module to programmatically control Google&#39;s &lt;strong&gt;&lt;a href=&quot;https://developers.google.com/web/updates/2017/04/headless-chrome&quot;&gt;Headless Chrome&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;&lt;code&gt;npm install puppeteer&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;With puppeteer you can do all sorts of automated testing; we are going to use it in a pretty simple way: let it navigate to a url, see what url it ends up on, and test whether that is the url we expected. The nice thing here is that we&#39;re testing what a browser would actually do after it has completed all the redirects, even ones you&#39;ve written in code, for instance for redirecting users based on geo-ip, and if you have a redirect loop, you will get an error.&lt;/p&gt;
&lt;p&gt;Now you need to make sure you have all the production redirect rules in place in your local dev environment, i.e. in web.config. If you&#39;re not comfortable with having these rules permanently in your development environment, you could set up a config transformation and a custom msbuild target that creates the web.config with local settings and the redirects included. If you&#39;re not familiar with how to do this, this is how I generate config files when I need to see how they would look when deployed to the Episerver DXC, which uses &lt;strong&gt;&lt;a href=&quot;/link/64d168381a354d788e9897250f044b57.aspx&quot;&gt;config transformations&lt;/a&gt;&lt;/strong&gt;:&lt;/p&gt;
&lt;pre&gt;&amp;lt;Target Name=&quot;DXCConfigs&quot;&amp;gt;&lt;br /&gt;  &amp;lt;TransformXml Source=&quot;web.config&quot; Transform=&quot;Web.integration.config&quot; Destination=&quot;web.integration.out.dxc.config&quot; /&amp;gt;&lt;br /&gt;  &amp;lt;TransformXml Source=&quot;web.integration.out.dxc.config&quot; Transform=&quot;web.preproduction.config&quot; Destination=&quot;web.preproduction.out.dxc.config&quot; /&amp;gt;&lt;br /&gt;  &amp;lt;TransformXml Source=&quot;web.preproduction.out.dxc.config&quot; Transform=&quot;web.production.config&quot; Destination=&quot;web.production.out.dxc.config&quot; /&amp;gt;&lt;br /&gt;&amp;lt;/Target&amp;gt;&lt;/pre&gt;
&lt;p&gt;and you run it from the command-line like this:&lt;/p&gt;
&lt;p&gt;&lt;code&gt;msbuild /T:DXCConfigs&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;After this is done, you need to make sure your local IIS listens to all the domains you are going to test. The easiest way to accomplish this is to add a wildcard binding, but if that is not possible for you, add them one by one under Edit Bindings in IIS.&amp;nbsp;Note that if the redirect rules include upgrading the connection to https, you need to add a self signed certificate and add bindings in IIS for this too.&lt;/p&gt;
&lt;p&gt;Now edit the hosts file and add your hosts:&lt;/p&gt;
&lt;p&gt;&lt;code&gt;127.0.0.1 episerver.com&lt;/code&gt;&lt;br /&gt;&lt;code&gt;127.0.0.1 www.episerver.com&lt;/code&gt;&lt;br /&gt;&lt;code&gt;127.0.0.1 ektron.com&lt;/code&gt;&lt;br /&gt;&lt;code&gt;127.0.0.1 www.episerver.se&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;One thing to note: if you are testing redirects with HTTPS, you need Chromium to ignore SSL certificate warnings for your local certs for the script to work. This is done by launching puppeteer with the ignoreHTTPSErrors flag set to true, as can be seen in the script below.&lt;/p&gt;
&lt;p&gt;My script looks like this:&lt;/p&gt;
&lt;pre&gt;/*&lt;br /&gt;Test redirects&lt;br /&gt;*/&lt;br /&gt;const puppeteer = require(&#39;puppeteer&#39;);&lt;br /&gt;&lt;br /&gt;(async () =&amp;gt; {&lt;br /&gt;&lt;br /&gt;  async function testurl(fromurl, tourl) {&lt;br /&gt;    try {&lt;br /&gt;      await page.goto(fromurl, {&lt;br /&gt;        waitUntil: &#39;networkidle2&#39;&lt;br /&gt;      });&lt;br /&gt;      if (page.url() === tourl) {&lt;br /&gt;        console.log(&quot;Success: &quot; + page.url());&lt;br /&gt;        return true;&lt;br /&gt;      } else {&lt;br /&gt;        console.log(&quot;*** Fail. &quot; + fromurl + &quot; Got: &quot; + page.url() + &quot; expected: &quot; + tourl);&lt;br /&gt;        return false;&lt;br /&gt;      }&lt;br /&gt;    } catch (error) {&lt;br /&gt;      console.log(&quot;*** Error &quot; + fromurl + &quot; &quot; + error);&lt;br /&gt;      return false;&lt;br /&gt;    }&lt;br /&gt;  }&lt;br /&gt;&lt;br /&gt;  // set up puppeteer to ignore HTTPS errors so it won&#39;t bomb on your self-signed certificates&lt;br /&gt;  const browser = await puppeteer.launch({&lt;br /&gt;    ignoreHTTPSErrors: true&lt;br /&gt;  });&lt;br /&gt;&lt;br /&gt;  const page = await browser.newPage();&lt;br /&gt;&lt;br /&gt;  await testurl(&#39;http://episerver.se&#39;, &#39;https://www.episerver.se/&#39;);&lt;br /&gt;  await testurl(&#39;http://ektron.com/&#39;, &#39;https://www.episerver.com/&#39;);&lt;br /&gt;  await testurl(&#39;https://www.EPiServer.COM/&#39;, &#39;https://www.episerver.com/&#39;);&lt;br /&gt;  await testurl(&#39;https://www.episerver.com/ascend-conference/ascend-2019&#39;, &#39;https://www.episerver.com/ascend-conference/ascend-2019/&#39;);&lt;br /&gt;  await testurl(&#39;http://www.episerver.no/dxc&#39;, &#39;https://www.episerver.no/produkter/funksjoner/plattform-som-en-tjeneste/&#39;);&lt;br /&gt;&lt;br /&gt;  await browser.close();&lt;br /&gt;})();&lt;/pre&gt;
&lt;p&gt;I only used my script to manually validate that the rules I introduced didn&#39;t break any of the other redirects (it turned out they did), especially the ones I had in the &lt;strong&gt;&lt;a href=&quot;https://github.com/Geta/404handler&quot;&gt;Geta 404 handler&lt;/a&gt;,&lt;/strong&gt;&amp;nbsp;but if you need to do this in a more unittest-like fashion, see &lt;strong&gt;&lt;a href=&quot;https://developers.google.com/web/updates/2017/06/headless-karma-mocha-chai&quot;&gt;this post&lt;/a&gt;&lt;/strong&gt; for inspiration on how to use an assertion framework with Headless Chrome.&lt;/p&gt;
&lt;p&gt;Finally, when you are sure you didn&#39;t break anything with your new rules and they were deployed into production, just comment out the entries in the hosts file and run the script again to test the rules in production if you&#39;d like to.&lt;/p&gt;</id><updated>2019-09-04T13:27:31.0000000Z</updated><summary type="html">Blog post</summary></entry></feed>