Blog posts by Ron Rangaiya
2023-10-15T03:48:03.0000000Z
/blogs/ron-rangaiya/
Optimizely World

Opticon - wrap up
https://rangaiya.hashnode.dev/opticon-wrap-up
2023-10-15T03:48:03.0000000Z

<p>Day 2 kicked off with more insights from industry leaders and experts. Speakers emphasised the importance of empowering teams and reinventing how marketing and product teams work cohesively to create and optimise digital experiences. "Content is at the core of every digital experience".</p>
<p>Recapping some key product roadmap insights below.</p>
<h2 id="heading-orchestrate-roadmap">Orchestrate roadmap</h2>
<p>Exciting planned innovations for the CMS and CMP products.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697326888141/e8296bda-8c24-4790-bd11-d87135a12d30.jpeg" alt class="image--center mx-auto" /></p>
<h3 id="heading-saas-core">SaaS Core</h3>
<p>Introducing composable architecture to the CMS, giving total flexibility to build your solution.</p>
<p>Planned for H1 2024.</p>
<h3 id="heading-visual-builder">Visual Builder</h3>
<p>Experience builder in the CMS to create, design and reuse brand-approved templates without .NET development.</p>
<p>Planned for H1 2024.</p>
<h3 id="heading-omni-channel-authoring-and-delivery">Omni channel authoring and delivery</h3>
<p>"Create content once and publish anywhere" from the CMP. With GenAI capabilities, marketers can create better content faster and easily push it to downstream channels via integrations.</p>
<p>Planned for Q4 2023.</p>
<h3 id="heading-web-experimentation-self-hosting">Web Experimentation self hosting</h3>
<p>The standard JavaScript snippet can now be self-hosted in your CDN, eliminating the need to make a connection to the Optimizely Web Experimentation CDN. This will overcome the known latency issues when running experiments.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697339420539/8fadc0aa-31d6-416f-9603-130626fc3902.jpeg" alt class="image--center mx-auto" /></p>
<p>Whether this will also be possible for websites running on Optimizely DXP will need to be explored with Optimizely support.</p>
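<p>For illustration, self-hosting means serving a copy of the snippet file from your own domain instead of Optimizely's CDN. The project ID and hostnames below are placeholders:</p>
<pre class="language-markup"><code>&lt;!-- standard snippet, loaded from the Optimizely Web Experimentation CDN --&gt;
&lt;script src="https://cdn.optimizely.com/js/12345678.js"&gt;&lt;/script&gt;

&lt;!-- self-hosted copy of the same snippet, served from your own CDN --&gt;
&lt;script src="https://cdn.example.com/optimizely/12345678.js"&gt;&lt;/script&gt;</code></pre>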
<h2 id="heading-optimizely-graph-the-new-search-engine">Optimizely Graph - the new search engine</h2>
<p>Optimizely Graph, the new name for Content Graph, is a multi-tenant SaaS service that lets you search, query and deliver content anywhere.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697327581986/f542401c-bed3-4e46-9857-1eb46885539b.jpeg" alt class="image--center mx-auto" /></p>
<p>Optimizely Graph provides similar functionality to Search and Navigation with additional benefits such as search flexibility, performance and AI-enhanced semantic search. Semantic search improves search accuracy by understanding the searcher's intent and the contextual meaning of the terms, whereas a standard keyword search only matches the keyword terms in the content. As the example below shows, semantic search produces more relevant results.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697327632158/72fb1d08-50e1-409c-8ad6-6c0333b508f4.jpeg" alt class="image--center mx-auto" /></p>
<p>However, Optimizely Graph currently has some limitations, shown below, including no support yet for nested queries (support is planned).</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697332124264/1f9e1834-eb79-4fc2-b3d1-620e019e32a7.jpeg" alt class="image--center mx-auto" /></p>
<p>Search and Navigation is not going away and may remain the preferred first option for PaaS solutions, but there is an opportunity to gradually adopt Graph and run both engines side by side until Optimizely Graph reaches parity on the missing features. There is also developer learning overhead to factor in, i.e. writing GraphQL queries instead of using the .NET client library.</p>
<p>Optimizely Graph is straightforward to get started with by installing NuGet packages. Please refer to <a target="_blank" href="https://docs.developers.optimizely.com/digital-experience-platform/v1.4.0-content-graph/docs/installation-and-configuration">developer documentation</a> for more information.</p>
<p>It is available to all CMS customers on Optimizely DXP.</p>
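<p>To give a flavour of what querying Graph looks like compared to the .NET client library, below is a hypothetical GraphQL request using <em>HttpClient</em>. The endpoint URL, the single-key <em>auth</em> parameter and the <em>ArticlePage</em> content type are assumptions on my part - check the developer documentation for your account's actual values.</p>
<pre class="language-csharp"><code>using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class GraphQueryExample
{
    private static readonly HttpClient Client = new();

    // Posts a GraphQL query to the (assumed) Optimizely Graph endpoint using
    // single-key authentication and returns the raw JSON response.
    public static async Task&lt;string&gt; QueryArticlesAsync(string singleKey)
    {
        var body = @"{ ""query"": ""{ ArticlePage { items { Name RouteSegment } } }"" }";
        using var content = new StringContent(body, Encoding.UTF8, "application/json");

        var response = await Client.PostAsync(
            $"https://cg.optimizely.com/content/v2?auth={singleKey}", content);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}</code></pre>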
<h2 id="heading-clean-water-changes-everything">Clean water changes everything</h2>
<p>Scott Harrison, CEO of charity: water, ended the day by delivering a moving insight into his journey and mission to bring clean water to everyone on earth. It resonated with the room and raised awareness of a problem unfamiliar in developed countries - access to clean water.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697338309840/3be7c1d3-432c-4586-8052-9ca2e2e1b9d4.jpeg" alt class="image--center mx-auto" /></p>
<p>His 100% model is all about giving public donors confidence that 100% of their donations fund clean water. He achieves this by having private donors, i.e. corporations and celebrities, cover all overhead costs.</p>
<p>Technology is a big enabler for charity: water, from experimentation-driven digital experiences to remote monitoring of well sensors to get alerts on malfunctions.</p>
<p>Clean water is a basic resource that makes a huge impact on people's lives; it allows children, families and communities to thrive. It certainly changes everything!</p>
<h2 id="heading-wrapping-up">Wrapping up</h2>
<p>Optimizely delivered another world-class event focusing on seizing the moment and making an impact.</p>
<p>It's time to head back armed with learnings and expert insights to help customers navigate their digital transformation and maturity journeys.</p>
Opticon - Day 1
https://rangaiya.hashnode.dev/opticon-day-1
2023-10-12T04:36:45.0000000Z

<p>Opticon 2023 officially kicked off today with Optimizely unveiling exciting product innovations and announcements.</p>
<p>Composability, simplicity and AI-accelerated digital experiences were the main themes in the Keynote with messages and insights from industry leaders about "embracing change", learning and growing.</p>
<p>We got previews and deep dives into product innovations throughout the day, showcasing Optimizely's vision for the future.</p>
<h2 id="heading-optimizely-one">Optimizely One</h2>
<p>Optimizely One was unveiled as the future for Optimizely's Digital Experience Platform, offering choice, composability and extensibility.</p>
<p>It's an all-in-one marketing operating system, with Content, Commerce and Experimentation, to support the marketing lifecycle.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697065687795/98a2d2f6-eaf0-475a-b54c-a821f7c5d875.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-saas-cms"><strong>SaaS CMS</strong></h2>
<p>Introducing SaaS Core, Optimizely's new headless-first SaaS offering launching under the Orchestrate umbrella. Optimizely's approach is to decouple the existing PaaS platform into SaaS components comprising a headless SaaS core, a visual experience builder and Optimizely Graph. SaaS Core will be version-less, with automatic upgrades and always the latest features.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697073459998/1711dab4-8241-4a01-8df2-fa6e29c2dd8e.jpeg" alt class="image--center mx-auto" /></p>
<p>PaaS and SaaS Core offer true choice and flexibility for headless, hybrid or aggregated architectures. Optimizely Graph underpins both, connecting and exposing all data.</p>
<p>Visual experience builder, currently in its infancy, is all about empowering content editors to build page templates.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697075457160/f5eba348-cb9c-4ddb-ac7a-2b3bc78dd1e2.jpeg" alt class="image--center mx-auto" /></p>
<p>SaaS Core is currently in closed beta. You can <a target="_blank" href="https://www.optimizely.com/saas-core-waitlist/">join the waitlist</a>.</p>
<h2 id="heading-opal">Opal</h2>
<p>Opal encompasses all Optimizely AI capabilities with use cases across insights, recommendations and co-pilot.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697076666654/c3cc625a-8480-49c4-942e-23f342855786.jpeg" alt class="image--center mx-auto" /></p>
<p>Optimizely's partnership with Writer was announced leveraging its enterprise-level generative AI capabilities.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697082114370/240e5897-e16b-4a9b-a974-e2a8d912c8d2.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-composable-commerce">Composable Commerce</h2>
<p>Composability will also extend to the Commerce products with SaaS Core (Configured) and PaaS Core (Customised).</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697073308823/cdbcf493-ca9e-4932-8bc2-96d78d835e00.jpeg" alt class="image--center mx-auto" /></p>
<p>Innovations such as Search, Product Recommendations and Translations, supercharged by Google, will be integrated into SaaS Core Commerce.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697082691467/28ecc68a-3aa9-4456-bc0a-ac226d47852a.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-experimentation-collaboration">Experimentation Collaboration</h2>
<p>Recently launched, this provides purpose-built workflows for visibility and transparency across the entire experimentation process, directly from the CMS.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1697073159304/cef6c2cd-8744-402e-bf3b-ef07eaf5179b.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-in-summary">In Summary</h2>
<p>These product innovations are a testament to Optimizely's commitment to delivering a smart and composable digital experience platform that serves and empowers both marketing practitioners and developers.</p>
Opticon - the journey
https://rangaiya.hashnode.dev/opticon-the-journey
2023-10-09T03:39:32.0000000Z

<p>Part 1 of my Opticon 2023 series</p>
<h2 id="heading-the-journey">The Journey</h2>
<p>My last Optimizely/Episerver conference was in Miami in 2019, during the pre-pandemic era. It doesn't seem that long ago, but a lot has happened since then - Episerver became Optimizely, making strategic acquisitions along the way and strengthening its position as a market-leading DXP.</p>
<p>I was super excited to get the opportunity to attend Opticon again, despite the long journey from Sydney to San Diego. Add in the misfortune of flying on my birthday. I did, however, get a birthday cake and wishes from my wife and daughter before I left. Plus, I'll get to celebrate my birthday in two different time zones.</p>
<h2 id="heading-touchdown">Touchdown</h2>
<p>As expected I didn't get any sleep on the plane. After an exhausting 24 hours which included spending 4 hours at LAX due to flight delays, I finally arrived in sunny San Diego. The jet lag was killing me!</p>
<h2 id="heading-whats-next">What's next</h2>
<p>It's going to be a jam-packed few days, starting with the OMVP summit. This will be my first OMVP summit; I'm nervous and excited at the same time to be rubbing shoulders with the elite of the Optimizely community.</p>
<p>The conference agenda looks exciting with breakout sessions and hands-on workshops; I'm having a hard time choosing which ones to attend. I'm particularly keen to learn more about the CMS roadmap and the experimentation workshops.</p>
<p>Looking forward to networking, getting insights into DXP, CMS, Commerce and Experimentation and catching up with the Optimizely community.</p>
<p>But first up, some much needed rest!</p>
Tips for Deploying to Azure Web App
/blogs/ron-rangaiya/dates/2023/7/tips-for-deploying-to-azure-web-app/
2023-07-13T07:16:15.0000000Z

<p>In this post, I will share my experience in deploying a CMS 12 solution to a self-managed Azure Web App. Although the <a href="https://docs.developers.optimizely.com/content-management-system/docs/deploying-to-azure-webapps">official documentation</a> provides a good starting point, I wanted to share some tips and potential pitfalls.</p>
<p>For this blog, I'll assume you already have a CMS 12 solution ready for deployment.</p>
<h2>1. Easily create Web App + Database</h2>
<p>You can use the "Web App + Database" <a href="https://portal.azure.com/#create/Microsoft.AppServiceWebAppDatabaseV3">resource create blade</a> from the Marketplace, which simplifies the creation of the web app and SQL database resources. This flow automatically creates a secure connection between the web app and the database and ensures best practices for secure database access. The database username and password are automatically generated and the connection string is added to the web app configuration. You can choose to update the database settings later if needed.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688695670526/2a7d8bb9-a9ef-4fe7-b25a-0cd40dd3c4c6.png" alt="" /></p>
<p><span style="background-color: #ecf0f1;">💡 After creation, remember to update the automatically added connection string name on the web app configuration to <em>EPiServerDB</em> and add <em>MultipleActiveResultSets=True</em>.</span></p>
<h2>2. Choose the correct Service Bus tier</h2>
<p>The CMS event system requires Topics so you need to choose the "Standard" pricing tier at a minimum. If you choose the Basic tier (as I did), you'll get the following exception on startup:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688696456563/3eee97bc-8b31-44ba-94d9-f5d1891e27bb.png" alt="" /></p>
<h2>3. Cloud Configuration</h2>
<p>There are two ways of configuring the Azure resources:</p>
<h3><strong>Configure via Code</strong></h3>
<p>For DXP deployments, you can simply add the <em>AddCmsCloudPlatformSupport</em> method; however, for self-hosted Azure deployments you will need to configure using the <em>EPiServer.Azure</em> package directly. See this <a href="/link/ef36f18846ba41e19cb64f77f9ac2c86.aspx">post</a> for more on code configuration options.</p>
<pre class="language-csharp"><code>services.AddAzureBlobProvider(options => options.ContainerName = "mosey-media");
services.AddAzureEventProvider(options => options.TopicName = "mosey-events");</code></pre>
<p><span style="background-color: #ecf0f1;">💡 Add the <em>EPiServerAzureBlobs </em>and <em>EPiServerAzureEvents </em>connection strings on the web app configuration, which will be used by default.</span></p>
<h3><strong>Configure via appsettings file</strong></h3>
<p>In my opinion, code configuration is cleaner and provides more control; however, you can also configure via appsettings.json as per the <a href="https://docs.developers.optimizely.com/content-management-system/docs/deploying-to-azure-webapps#3-update-the-configuration">Optimizely documentation</a>. When doing this, note that the "ConnectionString" attribute for AzureBlobProvider and AzureEventProvider takes the actual connection string and not the connection string name.</p>
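<p>As a sketch, the appsettings.json equivalent of the code configuration from the previous section might look like the following. The section names are an assumption on my part, so verify them against the Optimizely documentation; the connection string values are placeholders:</p>
<pre class="language-json"><code>{
  "EPiServer": {
    "AzureBlobProvider": {
      "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
      "ContainerName": "mosey-media"
    },
    "AzureEventProvider": {
      "ConnectionString": "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...",
      "TopicName": "mosey-events"
    }
  }
}</code></pre>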
<h2>4. Search Configuration</h2>
<p>To configure Search and Navigation, add the following application settings to the web app configuration:</p>
<p><em>episerver__Find__DefaultIndex</em></p>
<p><em>episerver__Find__ServiceUrl</em></p>
<p><span style="background-color: #ecf0f1;">💡 Note the double underscore in the key name to ensure it gets translated into an environment variable.</span></p>
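<p>The double underscores are ASP.NET Core's environment variable separator for nested configuration keys, so the two settings above are equivalent to this appsettings.json section (the index name and service URL below are placeholders):</p>
<pre class="language-json"><code>{
  "episerver": {
    "Find": {
      "DefaultIndex": "yourindexname",
      "ServiceUrl": "https://..."
    }
  }
}</code></pre>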
<h2>5. Create Admin User</h2>
<p>To register an Admin user, add the following line to your Startup.cs. This will register a startup middleware to redirect to the Register page on the first request to the environment.</p>
<pre class="language-csharp"><code>services.AddAdminUserRegistration();</code></pre>
<p><span style="background-color: #ecf0f1;">💡 If you created your site from the Alloy or CMS Empty site template then this line is already added.</span></p>
<h2>6. HTTPS Certificate Error</h2>
<p>If the site fails to load and you see<em> "BackgroundService failed"</em> and/or <em>"Unable to configure HTTPS endpoint" </em>exceptions in your log stream, check your "urls" appsetting and remove any https url.</p>
<pre class="language-markup"><code>"urls": "http://*:8080/"</code></pre>
<p>I was getting this exception when using the default domain <em>&lt;site&gt;.azurewebsites.net</em></p>

Unlock Digital Experiences with Optimizely Data Platform Activations
https://rangaiya.hashnode.dev/unlock-digital-experiences-with-optimizely-data-platform-activations
2023-07-05T07:33:52.0000000Z

<p>As we dive into the second half of 2023, let's review Optimizely's continuous innovations to their customer data and intelligence capabilities, as highlighted in the <a target="_blank" href="https://www.optimizely.com/product-updates/roadmap-q2-2023/customer-data/">Q2 2023 roadmap for Optimizely Data Platform</a>. This includes the launch of key innovations such as the Connect Platform, Real-time Segmentation, and Experience Activations, now available in all regions, empowering brands through unified insights and intelligent decision-making for delivering relevant customer experiences.</p>
<h3 id="heading-optimizely-connect-platform">Optimizely Connect Platform</h3>
<p>The Optimizely Connect Platform (OCP) helps Optimizely work together with third-party platforms. It simplifies data ingestion and channel activation by providing marketers with easy-to-use "one-click install" integration Apps.</p>
<p>OCP's developer-friendly solution makes it easy for partners and developers to quickly build custom integrations on Optimizely's serverless compute platform. Integrations either send data into (ingest) or out of (activation) ODP. Developers can then publish their connectors to the App Directory in ODP, a centralised location for marketers to discover, manage, and use these low-code/no-code integration apps for creating digital experiences.</p>
<p>The list of <a target="_blank" href="https://support.optimizely.com/hc/en-us/articles/9189517028621-Regional-availability-of-integrations-in-ODP">ODP integrations in the App Directory</a> is constantly expanding with recent additions including HubSpot and Salesforce Marketing Cloud (currently in Beta) for marketing activations.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688447464805/7ebf4cae-32fd-413c-b5b1-af4f674b6fcf.png" alt class="image--center mx-auto" /></p>
<h3 id="heading-real-time-segmentation">Real-time Segmentation</h3>
<p>Marketers can now leverage Real-Time Segmentation in Data Platform and the segment builder interface to target users based on their behaviours and events within the past 28 days. Custom-built real-time segments can be created using the RealtimeSegments API (based on GraphQL API) or the real-time segment builder interface in ODP.</p>
<p>Optimizely has recognised the importance of data freshness in marketing, hence the data latency for real-time segments is less than 90 seconds, ensuring an accurate view of customer data. This will empower brands to adapt with speed and scale through the creation and activation of segments for web use cases that require real-time relevance.</p>
<h3 id="heading-activations">Activations</h3>
<p>ODP powers real-time activations through:</p>
<ul>
<li><p><strong>Cross-product integrations</strong> allow the use of real-time segments in other Optimizely products, as audiences in Web Experimentation and Feature Experimentation for on-site experimentation and as Visitor Groups in CMS for real-time web personalisation.</p>
</li>
<li><p><strong>Channel activation</strong> with third-party marketing platforms such as Salesforce Marketing Cloud enables marketers to synchronise the segments in Data Platform for omnichannel marketing campaigns.</p>
</li>
<li><p><strong>Web modal and embed campaigns</strong> have returned as a highly sought-after feature under the "Activation" (previously "Campaigns") tab in ODP. They enable marketers to quickly create and launch engaging no-code modal/embed campaigns, which are excellent for capturing valuable information about customers or displaying offers and promotions. Real-time segment targeting for modals and embeds is earmarked for H2 2023 release.</p>
</li>
</ul>
<div class="embed-wrapper"><a class="embed-card" href="https://www.loom.com/share/f3a3f792e40b4ccea7aeeb0c3119287e?sid=ac42b087-c647-4ba5-979f-1a0c65f1b3e6&hide_owner=true&hide_share=true&hide_title=true&hideEmbedTopBar=true&autoplay=true">https://www.loom.com/share/f3a3f792e40b4ccea7aeeb0c3119287e?sid=ac42b087-c647-4ba5-979f-1a0c65f1b3e6&hide_owner=true&hide_share=true&hide_title=true&hideEmbedTopBar=true&autoplay=true</a></div>
<p>Optimizely continues to make strides in the customer data landscape, and the latest innovations highlight its commitment to empowering brands to unlock the full potential of their customer data.</p>
Enabling Cloud Support for Optimizely CMS 12
https://rangaiya.hashnode.dev/enabling-cloud-support-for-optimizely-cms-12
2023-06-30T07:29:28.0000000Z

<p>In this post, I'll explore the configuration of Azure resources for self-managed (non-DXP) deployments.</p>
<p>For DXP deployments, the recommended way is to use the <em>EPiServer.CloudPlatform.Cms</em> package and simply add the following line</p>
<pre><code class="lang-csharp"><span class="hljs-keyword">if</span> (!_webHostingEnvironment.IsDevelopment())
{
<span class="hljs-comment">//For DXP deployments</span>
services.AddCmsCloudPlatformSupport(_configuration);
}
</code></pre>
<h2 id="heading-what-about-azure-deployments">What about Azure deployments?</h2>
<p>Recently I was setting up a CMS 12 solution to deploy to my Azure instance for R&D. I used the <em>AddCmsCloudPlatformSupport</em> method to see if it worked - to my surprise it did, and I couldn't see any issues during my preliminary tests. However, I wanted to understand what exactly this method does and whether there are any reasons not to use it for non-DXP usage, so I set out to decompile the CloudPlatform assembly, and this is what I found.</p>
<h2 id="heading-addcmscloudplatformsupport-deconstructed">AddCmsCloudPlatformSupport Deconstructed</h2>
<p>List of services currently configured by this method:</p>
<pre><code class="lang-csharp">services.AddApplicationInsights();
services.AddAzureBlobProvider();
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<BlobProvidersOptions>, ValidateAzureBlobProviderOptionsConfigurer>());
services.AddCloudPlatformHealthCheck();
services.AddAzureEventProvider();
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<EventProviderOptions>, ValidateAzureEventProviderOptionsConfigurer>());
services.ConfigureDatabase();
services.ConfigureClientGeolocationOptions();
services.AddDataProtectionToBlobStorage(configuration);
services.ConfigureTelemetryOptions();
services.AddCloudPlatformForwardedFor();
</code></pre>
<p>Digging deeper, I was able to uncover the purpose of each method.</p>
<h3 id="heading-addapplicationinsights">AddApplicationInsights</h3>
<p>Configures Application Insights and registers the client JavaScript SDK for real user monitoring.</p>
<h3 id="heading-addazureblobprovider">AddAzureBlobProvider</h3>
<p>This is an extension method available in <em>EPiServer.Azure</em> to add the AzureBlobProvider. </p>
<pre><code class="lang-csharp"><span class="hljs-function"><span class="hljs-keyword">public</span> <span class="hljs-keyword">static</span> IServiceCollection <span class="hljs-title">AddAzureBlobProvider</span>(<span class="hljs-params">
<span class="hljs-keyword">this</span> IServiceCollection services,
Action<AzureBlobProviderOptions> configureOptions = <span class="hljs-literal">null</span></span>)</span>
{
services.Configure<BlobProvidersOptions>((Action<BlobProvidersOptions>) (options =>
{
options.DefaultProvider = <span class="hljs-string">"azure"</span>;
options.AddProvider<AzureBlobProvider>(<span class="hljs-string">"azure"</span>);
}));
<span class="hljs-keyword">if</span> (configureOptions != <span class="hljs-literal">null</span>)
services.Configure<AzureBlobProviderOptions>(configureOptions);
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<AzureBlobProviderOptions>, AzureBlobProviderOptionsConfigurer>());
<span class="hljs-keyword">return</span> services;
}
</code></pre>
<p>By default this is configured to use the <em>EPiServerAzureBlobs</em> connection string (if specified) in the environment configuration and creates a container named "mysitemedia".</p>
<h3 id="heading-addcloudplatformhealthcheck">AddCloudPlatformHealthCheck</h3>
<p>Registers the CMS health and warmup health checks. Refer to this <a target="_blank" href="/link/e7fd6d67f00e484d94be75b644c597be.aspx">blog</a> for more details and how to add custom health checks.</p>
<h3 id="heading-addazureeventprovider">AddAzureEventProvider</h3>
<p>Similar to <em>AddAzureBlobProvider</em>, this is an extension method available in <em>EPiServer.Azure</em> to add the AzureEventProvider. By default it uses the <em>EPiServerAzureEvents</em> connection string (if specified) in the environment configuration and creates a topic named "mysiteevents".</p>
<h3 id="heading-configuredatabase">ConfigureDatabase</h3>
<p>Sets <em>updateDatabaseSchema</em> to true to apply automatic database updates.</p>
<h3 id="heading-configureclientgeolocationoptions">ConfigureClientGeolocationOptions</h3>
<p>Sets the request header names to use for user personalisation - <em>EPiServer.Personalisation.ClientGeolocationOptions</em></p>
<pre><code class="lang-csharp">options.IPAddressHeader = <span class="hljs-string">"X-Forwarded-For"</span>;
options.LocationHeader = <span class="hljs-string">"CF-IPCountry"</span>;
</code></pre>
<h3 id="heading-adddataprotectiontoblobstorage">AddDataProtectionToBlobStorage</h3>
<p>Extension method in <em>EPiServer.CloudPlatform.Cms</em> to add data protection for BLOB storage using Azure Key Vault.</p>
<h3 id="heading-configuretelemetryoptions">ConfigureTelemetryOptions</h3>
<p>Enables sending user telemetry diagnostics to Optimizely.</p>
<h3 id="heading-addcloudplatformforwardedfor">AddCloudPlatformForwardedFor</h3>
<p>Enables and configures the <a target="_blank" href="https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/proxy-load-balancer?view=aspnetcore-7.0#other-proxy-server-and-load-balancer-scenarios">Forwarded Header Middleware</a> for CDN and load balancer scenarios.</p>
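<p>What <em>AddCloudPlatformForwardedFor</em> wires up internally is an assumption on my part, but a minimal manual equivalent using the standard ASP.NET Core forwarded headers middleware would look something like:</p>
<pre class="language-csharp"><code>using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

services.Configure&lt;ForwardedHeadersOptions&gt;(options =>
{
    // Trust the X-Forwarded-For / X-Forwarded-Proto headers set by the CDN or load balancer
    options.ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto;

    // Clear the defaults when the proxy addresses are not known in advance
    options.KnownNetworks.Clear();
    options.KnownProxies.Clear();
});

// Later, in the request pipeline:
app.UseForwardedHeaders();</code></pre>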
<h2 id="heading-what-i-ended-up-with">What I ended up with</h2>
<p>With this knowledge, I realised not all the configured services were relevant to my scenario, so my configuration now looks like this:</p>
<pre><code class="lang-csharp"> <span class="hljs-comment">//Base configuration for self-hosted Azure deployments</span>
services.AddApplicationInsightsTelemetry();
services.AddServiceProfiler();
services.AddAzureBlobProvider(options => options.ContainerName = <span class="hljs-string">"mosey-media"</span>);
services.AddAzureEventProvider(options => options.TopicName = <span class="hljs-string">"mosey-events"</span>);
services.Configure((Action<DataAccessOptions>)(options => options.UpdateDatabaseSchema = <span class="hljs-literal">true</span>));
</code></pre>
<p>Note, my use case is for R&D purposes, so the above is a starting point; a real-world non-DXP deployment will likely require additional configuration.</p>
<h2 id="heading-to-summarise">To Summarise</h2>
<p><em>AddCmsCloudPlatformSupport</em> is specifically used for the configuration of services for DXP. While it looks like you can also use it for non-DXP deployments, configuring services directly is the way to go to gain more control of your cloud configuration options (and avoid potential breaking changes from future updates to the <em>EPiServer.CloudPlatform.Cms</em> package).</p>
<p>I hope this post helps to provide the context and direction for enabling cloud support for your Azure deployments.</p>
Enabling Cloud Support for CMS 12 - a look under the hood/blogs/ron-rangaiya/dates/2023/6/enabling-cloud-support-for-cms-12---a-look-under-the-hood/2023-06-29T07:50:42.0000000Z<p>In this post I'll explore configuration of Azure resources for self managed (non-DXP) deployments.</p>
<p>For DXP deployments, the recommended way is to use the <em>EPiServer.CloudPlatform.Cms</em> package and simply add the following line</p>
<pre class="language-csharp"><code>if (!_webHostingEnvironment.IsDevelopment())
{
//For DXP deployments
services.AddCmsCloudPlatformSupport(_configuration);
}</code></pre>
<h2>What about Azure deployments?</h2>
<p>Recently I was setting up a CMS 12 solution to deploy to my Azure instance for R&D. I used the <em>AddCmsCloudPlatformSupport </em>method to see if it worked - to my surprise it did and I couldn't see any issues during my preliminary tests. However I wanted to understand what exactly this method does and if there are any reasons not to use it for non-DXP usage, so I set out to decompile the CloudPlatform assembly and this is what I found out.</p>
<h2>AddCmsCloudPlatformSupport Deconstructed</h2>
<p>List of services currently configured by this method:</p>
<pre class="language-csharp"><code>services.AddApplicationInsights();
services.AddAzureBlobProvider();
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<BlobProvidersOptions>, ValidateAzureBlobProviderOptionsConfigurer>());
services.AddCloudPlatformHealthCheck();
services.AddAzureEventProvider();
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<EventProviderOptions>, ValidateAzureEventProviderOptionsConfigurer>());
services.ConfigureDatabase();
services.ConfigureClientGeolocationOptions();
services.AddDataProtectionToBlobStorage(configuration);
services.ConfigureTelemetryOptions();
services.AddCloudPlatformForwardedFor();</code></pre>
<p>Digging deeper into each method, I was able to uncover its purpose.</p>
<h3>AddApplicationInsights</h3>
<p>Configures Application Insights and registers the client JavaScript SDK for real user monitoring.</p>
<h3>AddAzureBlobProvider</h3>
<p>This is an extension method available in <em>EPiServer.Azure </em>to add the AzureBlobProvider. </p>
<pre class="language-csharp"><code>public static IServiceCollection AddAzureBlobProvider(
this IServiceCollection services,
Action<AzureBlobProviderOptions> configureOptions = null)
{
services.Configure<BlobProvidersOptions>((Action<BlobProvidersOptions>) (options =>
{
options.DefaultProvider = "azure";
options.AddProvider<AzureBlobProvider>("azure");
}));
if (configureOptions != null)
services.Configure<AzureBlobProviderOptions>(configureOptions);
services.TryAddEnumerable(ServiceDescriptor.Singleton<IPostConfigureOptions<AzureBlobProviderOptions>, AzureBlobProviderOptionsConfigurer>());
return services;
}</code></pre>
<p>By default this is configured to use the <em>EPiServerAzureBlobs </em>connection string (if specified) in environment configuration and creates a container name "mysitemedia".</p>
<h3>AddCloudPlatformHealthCheck</h3>
<p>Registers the CMS health and warmup health checks. Refer to this <a href="/link/e7fd6d67f00e484d94be75b644c597be.aspx">blog </a>for more details and how to add custom health checks.</p>
<h3>AddAzureEventProvider</h3>
<p>Similar to the <em>AddAzureBlobProvider </em>method, this is an extension method available in <em>EPiServer.Azure </em>to add the AzureEventProvider and by default uses the <em>EPiServerAzureEvents </em>connection string (if specified) in environment configuration and creates a topic named "mysiteevents".</p>
<h3>ConfigureDatabase</h3>
<p>Sets <em>updateDatabaseSchema </em>to true to apply automatic database updates.</p>
<h3>ConfigureClientGeolocationOptions</h3>
<p>Sets the request header names to use for user personalisation - <em>EPiServer.Personalisation.ClientGeolocationOptions</em></p>
<pre class="language-csharp"><code>options.IPAddressHeader = "X-Forwarded-For";
options.LocationHeader = "CF-IPCountry";</code></pre>
<h3>AddDataProtectionToBlobStorage</h3>
<p>Extension method in <em>EPiServer.CloudPlatform.Cms </em>to add <span>data protection for BLOB storage using Azure Key Vault.</span></p>
<h3>ConfigureTelemetryOptions</h3>
<p>Enables sending user telemetry diagnostics to Optimizely.</p>
<h3>AddCloudPlatformForwardedFor</h3>
<p>Enables and configures the <a href="https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/proxy-load-balancer?view=aspnetcore-7.0#other-proxy-server-and-load-balancer-scenarios">Forwarded Header Middleware</a> for CDN and load balancer scenarios.</p>
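<p>For non-DXP deployments, the equivalent can be achieved with the standard ASP.NET Core forwarded headers middleware directly. A minimal sketch (the wrapper class and method are hypothetical; the options and middleware are the standard Microsoft.AspNetCore.HttpOverrides APIs):</p>

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;
using Microsoft.Extensions.DependencyInjection;

public static class ForwardedHeadersSetup
{
    public static void Configure(IServiceCollection services)
    {
        services.Configure<ForwardedHeadersOptions>(options =>
        {
            // Forward the original client IP and scheme from the CDN / load balancer
            options.ForwardedHeaders =
                ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto;

            // Only loopback proxies are trusted by default; clear the lists
            // (or, better, add your proxy's addresses) when behind a known CDN
            options.KnownNetworks.Clear();
            options.KnownProxies.Clear();
        });
    }
}

// Then, early in the middleware pipeline:
// app.UseForwardedHeaders();
```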
<h2>What I ended up with</h2>
<p>With this knowledge, I realised not all configured services were relevant to my scenario, so my configuration now looks like:</p>
<pre class="language-csharp"><code>// Configuration for self-hosted Azure deployments
services.AddApplicationInsightsTelemetry();
services.AddServiceProfiler();
services.AddAzureBlobProvider(options => options.ContainerName = "mosey-media");
services.AddAzureEventProvider(options => options.TopicName = "mosey-events");
services.Configure<DataAccessOptions>(options => options.UpdateDatabaseSchema = true);</code></pre>
<p>Note: my use case is for R&D purposes, so the above is a starting point; a real-world non-DXP deployment will likely require additional configuration.</p>
<h2>To Summarise</h2>
<p><em>AddCmsCloudPlatformSupport</em> is specifically used to configure services for DXP. While it looks like you can also use it for non-DXP deployments, configuring services directly gives you more control over your cloud configuration options (and avoids potential breaking changes from future updates to the <em>EPiServer.CloudPlatform.Cms</em> package).</p>
<p>I hope this post helps to provide the context and direction for enabling cloud support for your Azure deployments. </p>CI/CD YAML Pipelines for Optimizely CMS 12https://rangaiya.hashnode.dev/cicd-yaml-pipelines-for-optimizely-cms-122023-02-24T06:20:08.0000000Z<p>I had previously successfully used multi-stage YAML pipelines for CMS 11 projects. These YAML pipelines are great as they can be easily ported across projects for quick and consistent setup of CI/CD workflows.</p>
<h3 id="heading-build-and-package-for-net-5">Build and Package for .NET 5+</h3>
<p>In the new world of ASP.NET Core and CMS 12, I had to tweak the build and package processes to maintain the CI/CD functionality.</p>
<p>The <a target="_blank" href="https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/dotnet-core-cli-v2?view=azure-pipelines">.NET Core task</a> can be used to easily restore, build, test, package and publish a .NET Core project. The following code snippets outline the key steps:</p>
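<p>The snippets below reference pipeline variables such as $(projects) and $(targetFramework). A hypothetical variables block to pair with them (paths and names are placeholders to adapt):</p>

```yaml
variables:
  projects: '**/MySite.Web.csproj'   # placeholder pattern for the project(s) to build
  buildConfiguration: 'Release'
  targetFramework: 'net5.0'
  appName: 'mysite'                  # used to name the NUPKG code package
  appVersion: '$(Build.BuildNumber)'
  artifactName: 'drop'
```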
<p><strong>Restore dependencies</strong></p>
<pre><code class="lang-yaml"><span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">DotNetCoreCLI@2</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'NuGet Restore'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">command:</span> <span class="hljs-string">'restore'</span>
    <span class="hljs-attr">projects:</span> <span class="hljs-string">'$(projects)'</span>
    <span class="hljs-attr">feedsToUse:</span> <span class="hljs-string">'config'</span>
    <span class="hljs-attr">nugetConfigPath:</span> <span class="hljs-string">'src/NuGet.config'</span>
</code></pre>
<p><strong>Build</strong></p>
<pre><code class="lang-yaml"><span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">DotNetCoreCLI@2</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'Build'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">command:</span> <span class="hljs-string">'build'</span>
    <span class="hljs-attr">projects:</span> <span class="hljs-string">'$(projects)'</span>
    <span class="hljs-attr">arguments:</span> <span class="hljs-string">'--configuration $(buildConfiguration) --framework $(targetFramework)'</span>
</code></pre>
<p><strong>Test</strong></p>
<p>Run any unit tests</p>
<pre><code class="lang-yaml"><span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">DotNetCoreCLI@2</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'Test'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">command:</span> <span class="hljs-string">'test'</span>
    <span class="hljs-attr">projects:</span> <span class="hljs-string">'**/*Tests/*.csproj'</span>
    <span class="hljs-attr">arguments:</span> <span class="hljs-string">'--configuration $(buildConfiguration)'</span>
</code></pre>
<p><strong>Package and Publish</strong></p>
<p>The published output needs to be packaged in the NUPKG file format for deployment to DXP.</p>
<pre><code class="lang-yaml"><span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">DotNetCoreCLI@2</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'Publish output'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">command:</span> <span class="hljs-string">'publish'</span>
    <span class="hljs-attr">publishWebProjects:</span> <span class="hljs-literal">true</span>
    <span class="hljs-attr">arguments:</span> <span class="hljs-string">'--configuration $(BuildConfiguration) --framework $(targetFramework) --output $(Build.ArtifactStagingDirectory)\PackageContent\'</span>
    <span class="hljs-attr">zipAfterPublish:</span> <span class="hljs-literal">false</span>
    <span class="hljs-attr">modifyOutputPath:</span> <span class="hljs-literal">false</span>
<span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">ArchiveFiles@2</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'Zip'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">rootFolderOrFile:</span> <span class="hljs-string">'$(Build.ArtifactStagingDirectory)\PackageContent\'</span>
    <span class="hljs-attr">includeRootFolder:</span> <span class="hljs-literal">false</span>
    <span class="hljs-attr">archiveType:</span> <span class="hljs-string">'zip'</span>
    <span class="hljs-attr">archiveFile:</span> <span class="hljs-string">'$(Build.ArtifactStagingDirectory)/Packages/$(appName).cms.app.$(appVersion).nupkg'</span>
    <span class="hljs-attr">replaceExistingArchive:</span> <span class="hljs-literal">true</span>
<span class="hljs-bullet">-</span> <span class="hljs-attr">task:</span> <span class="hljs-string">PublishPipelineArtifact@1</span>
  <span class="hljs-attr">displayName:</span> <span class="hljs-string">'Publish artifact'</span>
  <span class="hljs-attr">inputs:</span>
    <span class="hljs-attr">targetPath:</span> <span class="hljs-string">'$(Build.ArtifactStagingDirectory)/Packages'</span>
    <span class="hljs-attr">artifact:</span> <span class="hljs-string">'$(artifactName)'</span>
    <span class="hljs-attr">publishLocation:</span> <span class="hljs-string">'pipeline'</span>
</code></pre>
<h3 id="heading-deployment">Deployment</h3>
<p>The actual deployment processes that use the Deployment API to upload and deploy the code package to DXP environments remain the same.</p>
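<p>Under the hood those processes boil down to a handful of EpiCloud calls. A rough sketch of the upload-and-deploy flow (cmdlet and parameter names should be verified against the EpiCloud module documentation; the package name and target environment are placeholders):</p>

```powershell
# Authenticate against the Deployment API using pipeline secret variables
Connect-EpiCloud -ProjectId $env:ProjectId -ClientKey $env:ApiKey -ClientSecret $env:ApiSecret

# Upload the NUPKG code package to the DXP deployment container
$sasUrl = Get-EpiDeploymentPackageLocation
Publish-EpiDeploymentPackage -SasUrl $sasUrl -PackagePath 'mysite.cms.app.1.0.0.nupkg'

# Deploy the uploaded package, then complete the deployment once verified
$deployment = Start-EpiDeployment -DeploymentPackage 'mysite.cms.app.1.0.0.nupkg' -TargetEnvironment 'Preproduction' -Wait
Complete-EpiDeployment -Id $deployment.Id
```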
<h3 id="heading-pipelines">Pipelines</h3>
<p>The following YAML pipeline files can be found in my <a target="_blank" href="https://github.com/rrangaiya/opti-ci-cd/">GitHub repository</a>:</p>
<ul>
<li><p>Integration - builds, tests and deploys to the Integration environment.</p>
</li>
<li><p>Release - builds, tests and enables staged deploys of release candidates to Preproduction and Production environments.</p>
</li>
</ul>
<p>As these pipelines directly use the EpiCloud PowerShell module, you have full control of the scripts and can change them to suit your CI/CD requirements.</p>
<p>Refer to my series on Deployment API and YAML pipelines for more context - <a target="_blank" href="https://world.optimizely.com/blogs/ron-rangaiya/dates/2020/5/azure-devops-release-flow-for-episerver-dxp/">Part 1</a>, <a target="_blank" href="https://world.optimizely.com/blogs/ron-rangaiya/dates/2020/5/cicd-using-episerver-dxp-deployment-api-and-multi-stage-yaml-pipelines---part-2/">Part 2</a>, <a target="_blank" href="https://world.optimizely.com/blogs/ron-rangaiya/dates/2022/6/deployment-api-and-multi-stage-yaml-pipelines---an-update/">Part 3</a>.</p>
<p>Stay tuned for more updates!</p>
Optimizely Data Platform (ODP): an intelligent customer data platformhttps://rangaiya.hashnode.dev/optimizely-data-platform-an-intelligent-customer-data-platform2023-02-20T04:32:32.0000000Z<p><em>[Updated and republished blog from March 2022]</em></p>
<p>A customer data platform (CDP) collects and unifies customer data from a variety of touchpoints into individual unified customer profiles. This data is then activated, and available to connect to the systems and platforms you engage customers with, to deliver personalised customer experiences in real time.</p>
<p>In March 2021 Optimizely acquired Zaius, a CDP, and rebranded it as ODP, further strengthening its digital experience platform (DXP) capabilities. Here's how ODP can enable your business to deliver relevant experiences to your customers.</p>
<h3 id="heading-customer-data-challenges">Customer data challenges</h3>
<p>Data is essential to unlocking digital experiences, but often that data is siloed and disconnected. While many organisations may already have a big data strategy, they face challenges in getting the right insights from the data and then efficiently activating it.</p>
<p>Even if you can bring together all the disparate data silos, it's difficult to get a unified view of customers. Without understanding your customers and their needs, you can't deliver relevant experiences to them.</p>
<h3 id="heading-unify-your-data-and-create-relevant-digital-experiences-for-your-customers">Unify your data and create relevant digital experiences for your customers</h3>
<p>The Optimizely Data Platform (ODP) is a central hub to harmonise, analyse and act on your data. It serves as the connective tissue that unifies data from your entire digital ecosystem, bringing the ABCs of data (assets, behaviours and customers) into a centralised location and providing data science-driven insights through the lens of the customer.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1676611263601/76d2ef5c-8913-4905-86b4-383024cdcd37.png" alt class="image--center mx-auto" /></p>
<p>The customer context can help your business to deliver the right experience at the right time through personalised messages, content, and product recommendations.</p>
<h3 id="heading-harmonise-customer-data">Harmonise customer data</h3>
<p>ODP collects and stitches customer data (offline, online, historical and real-time) to give you a 360-degree view of your customers. There are over 50 pre-built connectors providing one-click integrations that enable you to bring together all your customer data.</p>
<p>If you have a system that doesn't have a pre-built connector, ODP provides tools for custom integration - historical customer data can be uploaded via files, and ongoing data can be synced via REST APIs that track customer interactions (e.g. form submissions, orders, account registrations).</p>
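<p>To give a feel for that REST integration, here is a sketch in Python of building and posting a single event. The endpoint URL, header name and payload fields follow the public Zaius/ODP events API but are assumptions to verify against the ODP documentation:</p>

```python
import json
from urllib import request

# Hypothetical ODP event payload for a completed order; field names
# should be checked against the ODP events API reference.
event = {
    "type": "order",
    "action": "purchase",
    "identifiers": {"email": "jane@example.com"},
    "data": {"order_id": "12345", "total": 99.95},
}

payload = json.dumps(event).encode("utf-8")

def build_request(api_key: str) -> request.Request:
    """Build the POST request; the endpoint and header name are assumptions."""
    return request.Request(
        "https://api.zaius.com/v3/events",
        data=payload,
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Sending it is then a single call (requires a valid API key):
# with request.urlopen(build_request("YOUR-API-KEY")) as resp:
#     print(resp.status)
```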
<h3 id="heading-understand-your-customers">Understand your customers</h3>
<p>Analytics enhanced by out-of-the-box artificial intelligence (AI) and machine learning-driven predictions provide complete visibility through the context of a customer profile that adapts to customer behaviour in real time. This enables marketers to get easily accessible, meaningful insights into end-to-end customer journeys and drive personalization strategies without manual work or guesswork.</p>
<p>Features such as AI-driven reports and real-time segments allow for more dynamic personalisation of web experiences based on customers' behaviours to drive more conversions, revenue and growth.</p>
<p>Data from ODP can be pushed to other BI tools (e.g. Power BI) or shared via Snowflake.</p>
<h3 id="heading-personalise-with-real-time-segments">Personalise with real-time segments</h3>
<p><strong><em>Note</em></strong>: this feature is currently available to Charter Program members only.</p>
<p>ODP offers pre-built real-time segments allowing for segmentation of site visitors and delivery of relevant content based on their engagement with your site. The GraphQL Explorer, an interface available through your ODP account, allows for querying of real-time segment results.</p>
<p>Custom real-time segments can also be created using the real-time segments builder (currently in private beta).</p>
<p>Real-time segments can then be used in other Optimizely products (e.g. CMS and Web Experimentation) via integrations making it easier to do web personalisation and audience targeting in experiments.</p>
<h3 id="heading-data-platform-lite">Data Platform Lite</h3>
<p>Optimizely has recognised the need to enhance its DXP with a unified data layer across its product suite. Data Platform Lite is a free service that is available to all Optimizely cloud customers (Content, Commerce, B2B or Experimentation).</p>
<p>The key limitations of Data Platform Lite compared to ODP are:</p>
<ul>
<li><p>It harmonises data from Optimizely DXP products only and provides access to view-only analytics.</p>
</li>
<li><p>It doesn't offer any data activation features.</p>
</li>
<li><p>While it is free, there is a usage limit of 250K monthly active users (MAUs).</p>
</li>
</ul>
<p>Data Platform Lite provides Optimizely cloud customers with a pathway to ease into a CDP and start centralising and understanding their data. When the user base grows beyond the 250K limit, or when customers want more powerful ODP features (one-click integrations to bring together data from their whole digital ecosystem, segmentation, and AI/ML-driven predictions and visualisations), they can choose to upgrade to ODP.</p>
<h3 id="heading-in-summary"><strong>In summary</strong></h3>
<p>ODP empowers organisations through unified insights and intelligent decision-making to deliver outsized outcomes.</p>
<p>Getting data together is just the first step; it's what you do with it that creates value. Knowing your customers and anticipating their real-time intent can help you create relevance at every interaction, driving customer loyalty, revenue and business growth.</p>
Deployment API and multi-stage YAML pipelines - an update/blogs/ron-rangaiya/dates/2022/6/deployment-api-and-multi-stage-yaml-pipelines---an-update/2022-06-22T07:29:14.0000000Z<h2>Recap</h2>
<p>It has been almost 2 years since my last blog on <a href="/link/9859e5f013194c10b8d8fc7765d11c1f.aspx">reusable multi-stage YAML pipelines</a> to manage Optimizely DXP deployments.</p>
<p>A lot has changed since then, so I wanted to take the opportunity to give them a much-needed update.</p>
<p>To recap, I created these portable YAML pipelines to use across projects for quick and consistent setup of CI/CD workflows. These pipelines directly use the <a href="https://docs.developers.optimizely.com/digital-experience-platform/v1.3.0-DXP-for-CMS11-COM13/docs/deploy-using-powershell">EpiCloud Powershell module</a> so you have full control and access to the scripts if you need to change anything to fit your requirements or quickly leverage any updates to the EpiCloud module.</p>
<h2>What's New?</h2>
<p>The following changes have been made to the YAML pipelines:</p>
<ul>
<li>The `Integration` pipeline has been updated to support direct deploy to the Integration environment using the code package approach.</li>
<li>The old Integration pipeline has been renamed to `Integration-WebDeploy` for deploying using the Azure App Service Deploy method. This is provided for legacy purposes and can be used to deploy to an additional DXP environment that serves as the Development / Integration environment.</li>
<li>The `Release` pipeline has been updated to support Smooth / Zero-downtime deployments.</li>
<li>The `Release` pipeline has been updated to allow for a manual validation step before completing a Production deployment.</li>
<li>Support for variable groups.</li>
<li>Added runtime parameters.</li>
</ul>
<h2>What about .NET 5?</h2>
<p>While the pipelines don't currently support the .NET 5 build process, they can easily be repurposed for a .NET 5 / CMS 12 project. Apart from the build and publish commands, the rest of the process should be the same. Otherwise, stay tuned for .NET 5 YAML pipelines in the next update.</p>
<h2>Summary</h2>
<p>Traditionally, the Azure DevOps Classic UI editor is preferred for build and release pipelines. However, there are some benefits to using YAML pipelines including source-controlled pipeline files and having a unified view of your build and deployment processes. Read more about the <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-get-started?view=azure-devops">differences between YAML and Classic pipelines</a> and decide if YAML suits your DevOps approach.</p>
<p><span>For the latest pipeline files and documentation, please refer to my <a href="https://github.com/rrangaiya/opti-pipelines-yaml">GitHub repository</a>.</span></p>
<p><span>I'd love to get any feedback/comments/questions.</span></p>Optimizely Sydney Meetup - 23rd June/blogs/ron-rangaiya/dates/2022/6/optimizely-sydney-meetup---23rd-june/2022-06-20T03:32:56.0000000Z<p><strong>Calling all Optimizers in Sydney for an in-person meetup!</strong></p>
<p>The in-person developer meetup is back in Sydney after a long hiatus. Thanks to Optimizely for organising the event.</p>
<p>It's a fun-packed agenda with several speakers, including Steven and Thomas from Optimizely, Matthew from Sudo Roux, and myself, exploring a range of interesting topics and demos. Come and enjoy a night of networking with your fellow Optimizers - there will be food and beverages.</p>
<p><strong>When</strong></p>
<p>23rd June 5pm - 8pm</p>
<p><strong>Where</strong></p>
<p>The Arthouse Hotel, <span class="address-block">275 Pitt Street, </span><span>Sydney</span></p>
<p><strong>Session 1</strong></p>
<ul>
<li>Defining a Strategy for an eCommerce implementation (Matthew Sayer, Sudo Roux)</li>
<li>Increase velocity and developer productivity using Feature Flags (Steven Croft, Optimizely)</li>
</ul>
<p><strong>Session 2</strong></p>
<ul>
<li>Introduction to Data Core Services and Platform (Ronil Rangaiya, Empired)</li>
<li>Unlocking the power of real-time segments for personalization and experimentation (Thomas Gattringer, Optimizely)</li>
</ul>
<p>Hope to see all Optimizely enthusiasts there!</p>
<p>This is a free event; however, registration is mandatory.</p>
<p><a href="https://live.optimizely.com/events/details/optimizely-tech-community-presents-sydney-meetup-1/">Register here!</a></p>Best practices for SendGrid SMTP Integration/blogs/ron-rangaiya/dates/2020/10/best-practices-for-sendgrid-smtp-integration/2020-10-06T07:21:34.0000000Z<p>The Episerver DXP service includes a SendGrid account for sending emails. Adding SMTP configuration is straightforward and typically common knowledge; in this post I'll highlight some best practices to secure your SendGrid account.</p>
<p>As part of your DXP setup, Episerver Managed Services will provide your SendGrid account credentials (username and password).</p>
<h3><strong>1. Use API keys for authentication</strong></h3>
<p><strong>Do not</strong> use the supplied account username and password for authenticating against the SendGrid SMTP API. This username and password allow full access to your SendGrid account, so it is a security risk if these credentials are compromised.</p>
<p><span>API keys add an additional layer of security for your account and are the recommended way to securely talk to SendGrid APIs. You can create API keys from the Settings section of the <a href="https://app.sendgrid.com/">SendGrid Portal</a>. If your API key does get compromised, it is easy to delete it, create a new one and update your environment variables.</span></p>
<p>Use the API key for Bearer authentication when calling SendGrid APIs. For Episerver to send out email notifications, you will need to add SMTP settings in web.config. Set the <em>username</em> to "apikey" and use the API key as the <em>password</em> value. While this still authenticates via Basic authentication, it uses an API key, which is the recommended approach.</p>
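<p>As a sketch, the web.config SMTP settings (the sender address and key value are placeholders) would look something like:</p>

```xml
<system.net>
  <mailSettings>
    <smtp deliveryMethod="Network" from="noreply@example.com">
      <!-- userName is the literal string "apikey"; password is the SendGrid API key -->
      <network host="smtp.sendgrid.net" port="587" userName="apikey" password="SG.your-api-key" enableSsl="true" />
    </smtp>
  </mailSettings>
</system.net>
```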
<h3><strong>2. Restricted permissions for API keys</strong></h3>
<p>API keys should be created with the minimum permission level required <span>to provide access to the different functions of your account. To further improve security, you should create separate API keys for use in each DXP environment.</span></p>
<p><span>For example, the below API key only has permission to send emails.</span></p>
<p><img src="/link/0ef039cfa7ad4d198f3edf693014816d.aspx" /></p>
<p><img src="/link/88e82974179845cbac1784a51838cbfe.aspx" /></p>
<h3><strong>3. Secret variables for API keys</strong></h3>
<p>As a good practice, <strong>do not</strong> store API keys in source control. It is sensitive data and shouldn't be accessible to anyone who has access to the code repository. Instead, you should store them in your Azure Pipelines as secret variables or in Azure Key Vault and access them from your Azure Pipeline.</p>
<p>I used a 3rd party extension <a href="https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens">Replace Tokens</a> as a step in my Azure Pipeline to inject the API key into the SMTP settings in my web config.</p>
<p><img src="/link/6b9dadaf42ec4a71b4db3e2720e5762f.aspx" /></p>
<p>Below is my Replace Tokens task (YAML) to update the environment web config files on the fly with the required SendGrid credentials before pushing the code package to DXP using the Deployment API.</p>
<p><img src="/link/4a2f52f4cf284bb78e572dd693a4d178.aspx" /></p>
<p>Azure pipeline variables for SendGrid credentials</p>
<p><img src="/link/4a6247366b694f10a7f860e79cecc5e0.aspx" /></p>
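<p>In outline, a Replace Tokens step like the one shown above (the task version, file pattern and token style are assumptions to adapt to your setup) looks like:</p>

```yaml
- task: replacetokens@3
  displayName: 'Inject SendGrid credentials'
  inputs:
    targetFiles: '**/web.config'
    tokenPrefix: '#{'
    tokenSuffix: '}#'
```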
<p>If you are using the App Service Deploy task to deploy to your DXP environment, refer to this <a href="/link/5548a737b6c4497cbdd51944e324e9c4.aspx">blog</a> on how to do variable substitution using parameters.xml.</p>
<h3><strong>4. Two-factor authentication</strong></h3>
<p>For improved security, enable <a href="https://sendgrid.com/docs/ui/account-and-settings/two-factor-authentication/">two-factor authentication</a> for your account. It looks like SendGrid will be enforcing this soon.</p>
<p><img src="/link/1e0e93a4cd65431bb6616bdc928da759.aspx" /></p>
<p>Note: once you enable two-factor authentication, SendGrid will no longer accept the account username and password for API authentication, further protecting your account from malicious use if the account credentials are compromised.</p>
<h3><strong>5. Sender authentication</strong></h3>
<p>Set up sender authentication to improve your domain's reputation and email deliverability. Request the TXT record from Episerver Managed Services and give it to your DNS provider to configure the Sender Policy Framework (SPF) record.</p>Episerver CMS 11 (2020 Update) certification tips/blogs/ron-rangaiya/dates/2020/6/episerver-cms-11-2020-update-certification-tips/2020-06-11T22:34:28.0000000Z<p>Yesterday I renewed my CMS certification (a customary biennial exercise to maintain your ECD status). Capturing my brain dump here while it is still fresh in my head.</p>
<p>There have been a lot of updates in the CMS 11 product in the past 2 years (time certainly flies) and this is reflected in the certification exam. The 2020 updated version of the exam has questions on the Content Delivery API and the latest DXP service features.</p>
<h2>Preparation</h2>
<p>The exam is not exactly a walk in the park. To earn Episerver Certified Developer (ECD) status, you are expected to have broad knowledge of the product; the exam measures your developer knowledge and skills. The following preparation is generally recommended:</p>
<ul>
<li>Experience in a few CMS project implementations prior to taking the exam</li>
<li>Study the <a href="/link/f3882e248540406fa3bc4a6858a3b244.aspx">Developer guide</a></li>
<li>Review the <a href="https://episerver.sharefile.com/d-sa7000126cf14d448">skills measured</a> in the exam</li>
<li>Take the Developer e-learning courses (<strong>Note</strong>: these are currently free until 30th June 2020; you get 30 days of access after enrolling in a course)
<ul>
<li><a href="https://education.episerver.com/collections/elearning/products/episerver-cms-advanced-development">CMS Advanced Development</a></li>
<li><a href="https://education.episerver.com/collections/elearning/products/developing-for-dxc-service">Developing for DXC Service</a></li>
</ul>
</li>
</ul>
<h2>Exam format</h2>
<p>There were 77 multiple-choice questions with a required pass rate of <strong>60%</strong>.</p>
<p>You get 2 hours. The key tip here is that you can skip questions you are unsure of and come back to them later; however, you cannot go back to questions you have already answered. I tried to spend no more than 1 minute per question and skipped any I wasn't confident about.</p>
<p>Some of the answer options are tricky especially the ones where you need to select two or three options so read the question and all the options carefully.</p>
<h2>Hints and Tips</h2>
<p>Some hints and tips of the knowledge areas to cover (as much as I can remember). Please be mindful that this is <strong>not</strong> an exhaustive list of the areas that need to be studied.</p>
<h3>Product Knowledge</h3>
<ul>
<li>Review breaking changes in last 2 major versions (10 and 11)</li>
<li>Review features and add-ons <strong>not</strong> supported in DXP</li>
<li>Understand how to programmatically work with projects</li>
<li>Familiarise yourself with the Editing and Admin UI</li>
</ul>
<h3>Installation, Operation and Configuration</h3>
<ul>
<li>Requirements for setting up a CMS site</li>
<li>How to install updates</li>
<li>Understand the cloud-based licensing model and multi-site setup</li>
<li>Review the Backup and Failover definitions in the <a href="/link/c6294f540c574fe7a8da480c7a9430f6.aspx">DXP service description</a></li>
<li>Familiarise yourself with the Episerver Visual Studio Extension</li>
<li>Default location for public and protected add-ons</li>
<li>Review the "Permissions for Functions" in the Admin UI e.g. how to enable detailed error messages</li>
</ul>
<h3>Content Model</h3>
<ul>
<li>Familiarise yourself with the Episerver concepts of page types, templates and blocks</li>
<li>Know the supported property types</li>
<li>Familiarise yourself with the various property attributes, including validation attributes</li>
<li>Understand the criteria for refactoring content types and properties, e.g. when renaming content types</li>
<li>How to check user access rights on content</li>
<li>How to set up preview rendering for a block</li>
<li>How to use tags to render a property</li>
<li>How to customise properties using <em>EditorDescriptor</em> and <em>UIHint</em></li>
<li>Review the various ways to configure property settings both in code and Admin UI</li>
<li>How to start a content approval</li>
<li>Partial routing concepts and the different methods to implement</li>
<li>Learn how A/B testing works and how to create custom KPIs</li>
<li>Understanding <a href="/link/6ffa5cb8173a414eac25740deeafdbc8.aspx">Content Delivery API</a> - there are a few questions on this so make sure to review it thoroughly</li>
<li>Episerver Forms configuration including how to configure data encryption</li>
</ul>
<h3>Creating Websites</h3>
<ul>
<li>ASP.NET Identity setup</li>
<li>Scenarios for translating content when working with multiple languages</li>
<li>What happens to the friendly URL when you move pages </li>
<li>How to set up localisation of the UI</li>
<li>Location of language XML files</li>
<li>Where to set up the fallback language</li>
<li>Understand the configuration of the different cache options (Output, Browser and Object)</li>
<li>Familiarise yourself with personalising blocks using visitor groups in content areas</li>
<li>Understand both Episerver Search and Episerver Search &amp; Navigation (formerly Find)</li>
<li>Understand how to filter content for the current visitor </li>
</ul>
<h3>Advanced Concepts</h3>
<ul>
<li>Understand the areas in the CMS UI where editors can add gadgets</li>
<li>Attributes required to create plug-ins</li>
<li>Understand how to configure/disable gadgets for specific roles</li>
<li>Understand the different ways to add/remove components from the CMS UI</li>
<li>How to implement, configure and run Scheduled Jobs</li>
<li>Understand the usage of the <em>InitializationModule</em> and <em>ModuleDependency</em> attribute</li>
<li>Content Provider concepts</li>
<li>Working with the Notification API, i.e. <em>UserNotificationRepository</em></li>
</ul>
<p>Hopefully, the above tips help you prepare for your next certification attempt.</p>
<p>Best of luck!!!</p>CI/CD using Episerver DXP Deployment API and multi-stage YAML pipelines - Part 2/blogs/ron-rangaiya/dates/2020/5/cicd-using-episerver-dxp-deployment-api-and-multi-stage-yaml-pipelines---part-2/2020-05-28T14:27:53.0000000Z<h2>Recap</h2>
<p>In my <a href="/link/8619f059317d40d9954668b7f857c01f.aspx">previous post</a>, I explored the possibility of using Release Flow as a branching strategy for Episerver DXP deployments.</p>
<p>Moving on to the Deployment workflow, there are already a number of excellent blog posts like <a href="/link/21aee4bc5c4d44518c1ed29c846185e2.aspx">this one</a> on how to use the Deployment API with Azure DevOps Classic pipelines.</p>
<p>When I started working with the Deployment API, my goal was to create reusable YAML pipelines that I can use across projects for CI/CD workflows. </p>
<h2>Why YAML</h2>
<p><a href="https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/">Multi-stage pipelines</a> allow for a combined build and release pipeline in one YAML file. The key benefits of using YAML over the Classic pipeline are</p>
<ul>
<li>The YAML file is source-controlled so you can treat it like any other file in the repository, to view history, branch, peer review changes.</li>
<li>It can be easily forked/copied to new projects so you can have standardised pipelines across your projects.</li>
<li>It provides a unified experience and view of your deployment pipeline.</li>
</ul>
<h2>Show me the YAML</h2>
<p>The following YAML pipeline files can be found in the GitHub repository - <a href="https://github.com/rrangaiya/epi-dxp-devops">https://github.com/rrangaiya/epi-dxp-devops</a>. </p>
<ul>
<li><strong>Integration</strong> - a pipeline to build and deploy to the Integration environment.</li>
<li><strong>Release - </strong>a pipeline to build and deploy a release candidate to Preproduction and Production environments.</li>
</ul>
<p>Obviously, these reusable pipelines are only a guide, based on my experience with Episerver DXP projects. They are built around the Release Flow branching strategy but can be tweaked to suit your project requirements.</p>
<h2>Integration pipeline</h2>
<p><img src="/link/cc0b77057ec34af48a09f705c78f68cb.aspx" width="560" height="260" /></p>
<p>This pipeline is set up to run as part of the continuous integration workflow, i.e. when code is merged to the <em>master </em>branch. <span>It creates a web package and deploys to the Integration environment using the <strong>Azure App Service Deploy</strong> task</span>. Deployments done using the Deployment API take around 30 minutes. As this pipeline will be run frequently, a fast deployment time (~1 min) to Integration ensures developers and testers are not waiting to test deployed features.</p>
<ul>
<li>CI stage - builds the solution, runs any unit tests, creates a web package.</li>
<li>Integration stage - deploys to Integration. This requires setting up <span>a <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops">service connection</a> to your DXP Azure subscription on Azure DevOps. The service principal details required for the connection can be requested from Episerver Managed Services.</span></li>
</ul>
<p><img src="/link/71e290f235e445eb90761030c6270cdc.aspx" width="969" height="594" /></p>
<h2>Release pipeline</h2>
<p><img src="/link/0ee8d0c620f1429dbb643ca54f049146.aspx" width="805" height="255" /></p>
<p>This pipeline is triggered on any change to a <em>release/*</em> branch. It follows the "Build once, deploy many" DevOps convention using the Deployment API code package approach.</p>
<ul>
<li>Package stage - builds the solution, runs any unit tests, creates a NuGet package, and uploads it to Episerver DXP for deployment.</li>
<li>Preproduction stage - on approval, deploys the package to the Preproduction environment.</li>
<li>Production stage - on approval, deploys the package to the Production environment.</li>
</ul>
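<p>The skeleton of the stages can be sketched as follows (names are illustrative; the individual deployment tasks are described in the sections below):</p>

```yaml
# Sketch of the release pipeline - build once, deploy many
trigger:
  branches:
    include:
      - release/*

stages:
  - stage: Package
    jobs:
      - job: BuildAndUpload
        steps: []       # build, test, create the NuGet package, upload to DXP

  - stage: Preproduction
    dependsOn: Package
    jobs:
      - deployment: DeployPreproduction
        environment: Preproduction   # approvals configured on the environment gate this stage
        strategy:
          runOnce:
            deploy:
              steps: []              # deploy to slot, validate, reset or complete

  - stage: Production
    dependsOn: Preproduction
    jobs:
      - deployment: DeployProduction
        environment: Production
        strategy:
          runOnce:
            deploy:
              steps: []
```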
<h3>Pipeline variables</h3>
<p>The following variables for the Deployment API credentials need to be added to the pipeline via the UI:</p>
<p><img src="/link/f0bbd881ffc147b5957f6209fbb41f9f.aspx" width="477" height="358" /></p>
<p>The ApiSecret variable should be created as a <strong>secret variable</strong>. When adding the Deployment API credentials in the DXP Portal, you can select multiple environments for a credential. I have set up each credential to match the environments the pipeline deploys to, which ensures a credential is only used for its intended purpose.</p>
<p><img src="/link/755ce78e81df45c39db3332c04f2fff9.aspx" width="472" height="285" /></p>
<h3>Upload Package</h3>
<p>The following task uploads the code package to DXP ready for deployment. Note that any secret variables, e.g. <em>ApiSecret</em>, need to be mapped to an environment variable to make them available for use in the PowerShell script.</p>
<p><img src="/link/5c0a8d8348024a06a73af1febe1ecba3.aspx" width="815" height="484" /></p>
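<p>In outline, the task looks something like the sketch below, assuming the EpiCloud PowerShell module is used to call the Deployment API; the variable and package names are illustrative.</p>

```yaml
- task: PowerShell@2
  displayName: Upload package to DXP
  env:
    API_SECRET: $(ApiSecret)   # secret variables must be mapped explicitly to reach the script
  inputs:
    targetType: inline
    script: |
      Install-Module EpiCloud -Scope CurrentUser -Force
      Connect-EpiCloud -ProjectId '$(ProjectId)' -ClientKey '$(ApiKey)' -ClientSecret $env:API_SECRET
      # get the SAS upload location and push the NuGet code package to it
      $location = Get-EpiDeploymentPackageLocation
      Add-EpiDeploymentPackage -SasUrl $location -Path '$(Pipeline.Workspace)/package/MySite.cms.app.1.0.0.nupkg'
```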
<h3>Preproduction deployment</h3>
<p>The Preproduction deployment stage has the following tasks. </p>
<p><img src="/link/49b9e959934e407395c10c841334d49f.aspx" /></p>
<h5>Deploy to slot</h5>
<p>The first task deploys the uploaded code package to the Preproduction slot. It outputs a variable, <em>deploymentId</em>, that is used by the later tasks.</p>
<p><img src="/link/b417bc1c45494b0287f07fedb868e718.aspx" width="633" height="546" /></p>
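<p>A condensed sketch of the task, again assuming the EpiCloud module (cmdlet and parameter usage is indicative only). The deployment id is surfaced to later tasks with a logging command:</p>

```yaml
- task: PowerShell@2
  displayName: Deploy to slot
  env:
    API_SECRET: $(ApiSecret)
  inputs:
    targetType: inline
    script: |
      Connect-EpiCloud -ProjectId '$(ProjectId)' -ClientKey '$(ApiKey)' -ClientSecret $env:API_SECRET
      $deployment = Start-EpiDeployment -DeploymentPackage '$(PackageName)' -TargetEnvironment 'Preproduction' -Wait
      # make the deployment id available to the Validate/Reset/Complete tasks
      Write-Host "##vso[task.setvariable variable=deploymentId]$($deployment.id)"
```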
<h5>Validate slot</h5>
<p>This task only performs a basic web request to the slot validation link.</p>
<p><img src="/link/bc4c2cbe8b9c491f96419ac2a92008fb.aspx" width="628" height="373" /></p>
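<p>Something along these lines, where <em>SlotUrl</em> is an assumed pipeline variable holding the slot validation link:</p>

```yaml
- task: PowerShell@2
  displayName: Validate slot
  inputs:
    targetType: inline
    script: |
      # basic smoke test - a non-success response fails the task and triggers the Reset task
      $response = Invoke-WebRequest -Uri '$(SlotUrl)' -UseBasicParsing
      if ($response.StatusCode -ne 200) { throw "Slot validation failed: $($response.StatusCode)" }
```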
<h5>Reset</h5>
<p>This task is set up to run only if the previous 'Validate slot' task fails.</p>
<p><img src="/link/4bcca640f47a45ad8409d2bca1e66420.aspx" width="635" height="224" /></p>
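<p>The condition is what makes this a compensating step; a sketch (cmdlet usage indicative only):</p>

```yaml
- task: PowerShell@2
  displayName: Reset deployment
  condition: failed()   # run only when a previous task in the job has failed
  env:
    API_SECRET: $(ApiSecret)
  inputs:
    targetType: inline
    script: |
      Connect-EpiCloud -ProjectId '$(ProjectId)' -ClientKey '$(ApiKey)' -ClientSecret $env:API_SECRET
      Reset-EpiDeployment -Id '$(deploymentId)' -Wait
```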
<h5>Complete deployment</h5>
<p>And finally...</p>
<p><img src="/link/9574ab74095b493897ce75204b1fd75c.aspx" width="620" height="199" /></p>
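<p>This promotes the validated slot, completing the deployment. A sketch, with cmdlet usage indicative only:</p>

```yaml
- task: PowerShell@2
  displayName: Complete deployment
  env:
    API_SECRET: $(ApiSecret)
  inputs:
    targetType: inline
    script: |
      Connect-EpiCloud -ProjectId '$(ProjectId)' -ClientKey '$(ApiKey)' -ClientSecret $env:API_SECRET
      Complete-EpiDeployment -Id '$(deploymentId)' -Wait
```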
<h3>Production deployment</h3>
<p>The Production stage is the same as the Preproduction stage. One thing to call out is that YAML currently doesn't support the <strong>Manual Intervention</strong> task, which would allow pausing an active pipeline to perform manual validation of the slot before completing the deployment stage.</p>
<h2>Hotfixes</h2>
<p>When a hotfix is merged/cherry-picked to the release branch, the Release pipeline will be triggered and the hotfix can be deployed to Preproduction and Production.</p>
<h3>Deploy hotfix straight to Production</h3>
<p>If there is a need to deploy an urgent fix straight to Production, the Release pipeline can be manually run to deploy to selected stages.</p>
<p><img src="/link/357a644b0b6944a9a76a3e4e2ea79a0a.aspx" width="446" height="310" /></p>
<h2>Environments</h2>
<p>Environments in Azure Pipelines represent the environment(s) targeted by the pipeline, and they provide a deployment history for each environment. I have kept the environment names consistent with the DXP environments; however, they can be named to match the environment usage, e.g. Test, UAT and Production.</p>
<p><img src="/link/6df194570d094387b94d8dc7c6a7d7b9.aspx" width="523" height="232" /></p>
<p>The version deployed to an environment comes from the pipeline build number. For release branches, I use the naming convention <em>release/&lt;major.minor&gt;</em>, e.g. <em>release/1.0</em>. This helps to identify the release branch that an environment is on for subsequent deployments, e.g. bug fixes.</p>
<p><img src="/link/30c3fe142510435a9e606ea69bd99b92.aspx" width="656" height="54" /></p>
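<p>The build number can be derived from the branch name at the top of the YAML file; for a <em>release/1.0</em> branch, <code>Build.SourceBranchName</code> resolves to <em>1.0</em>. The exact format below is illustrative:</p>

```yaml
# e.g. release/1.0 -> a build number like 1.0.20200522.1
name: $(Build.SourceBranchName).$(Date:yyyyMMdd)$(Rev:.r)
```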
<h3>Approvals</h3>
<p>Approvals and checks can be configured for each environment. This ensures any pipeline targeting the environment will require manual approval before running the stage.</p>
<p><img src="/link/4b01617e4d1145d1b5400cb75681021f.aspx" width="1005" height="254" /></p>
<h2>Limitations</h2>
<p>The main limitations I have encountered with multi-stage YAML pipelines are:</p>
<ul>
<li>Currently, no support for the Manual Intervention task within a stage. This feature is on the Azure Pipelines roadmap.</li>
<li>Stages cannot be skipped for an automatically triggered pipeline. Currently, to skip a stage, the pipeline has to be manually run.</li>
</ul>
<p>These limitations by no means outweigh the benefits.</p>
<h2>Wrap up</h2>
<p>Hopefully, this post goes some way toward helping you fully integrate your DXP deployments into Azure DevOps using multi-stage YAML pipelines and the Deployment API.</p>
<p>For the latest YAML files and documentation, please refer to the GitHub repository: <a href="https://github.com/rrangaiya/epi-dxp-devops">https://github.com/rrangaiya/epi-dxp-devops</a>.</p>
<h2>References</h2>
<p><a href="/link/cecd04dddeea47f89a14d7f163913728.aspx">https://world.episerver.com/documentation/developer-guides/digital-experience-platform/deploying/episerver-digital-experience-cloud-deployment-api/</a></p>
<p><a href="https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/">https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/</a></p>
<h1>CI/CD using Episerver DXP Deployment API and multi-stage YAML pipelines - Part 1</h1>
<p><em>Published 22 May 2020 - /blogs/ron-rangaiya/dates/2020/5/azure-devops-release-flow-for-episerver-dxp/</em></p>
<p>This post is the first in a series to share my recent experience in setting up CI/CD pipelines using the Deployment API and YAML pipelines. The end goal is a CI/CD process that can be quickly set up for any new project. The focus of this post is the branching and release strategy, which is key to supporting a robust DevOps process.</p>
<h2>Preface</h2>
<p>Before the release of the Deployment API, Episerver DXP only supported a linear deployment model whereby you could only deploy to Integration and then promote the package through the environments using the PaaS Portal. This created challenges for managing concurrent development of consecutive releases or for deploying an urgent hotfix to Production. The Deployment API, using the code package approach, has made it possible to fully integrate and manage CI/CD pipelines in Azure DevOps.</p>
<p>Having used the Azure Pipelines Classic UI to set up CI and Release pipelines, I'm continually looking for DRY ways to speed up the setup. Azure Pipelines now supports multi-stage YAML pipelines, so you can configure the CI and CD tasks in one YAML file. Compared to Classic UI features, YAML still has some catching up to do, but given the benefits of version control and portability between projects, using YAML is an excellent way to standardise CI and CD pipelines across your projects.</p>
<p>For branching strategy, after using GitFlow for a number of years, I wanted to try something simpler without the overhead of branch management and merge hell. I recently discovered the Release Flow model and was amazed by its simplicity. </p>
<h2>What is Release Flow</h2>
<p>Release Flow is Microsoft's DevOps workflow to build and deploy all their products. The interesting bit is that they apply this methodology using Azure DevOps to deploy changes to Azure DevOps.</p>
<p>It uses a trunk-based branching strategy for development and releases. This is a lesser-known branching model that is becoming more prevalent and is practiced by large organisations like Google and Facebook. </p>
<h2>Branching strategy</h2>
<p><img src="/link/8d9f4ee02531491e88f8d01c489e315b.aspx" width="742" height="369" /></p>
<p><em>There is a link at the end of the post if you want to learn more about the concepts of trunk-based development (TBD).</em></p>
<h3>Development</h3>
<p>The key concept of this branching model is that there is only one main branch, <strong>master</strong> (the trunk), providing a single consistent view of the codebase. All development work, whether implementing a feature or fixing bugs, requires developers to create short-lived feature branches off <em>master</em>. Typically these feature branches will only last 1-2 days.</p>
<p>Feature toggles (flags) can be used to avoid long-lived feature branches and to control the release of incomplete or new features. The basic idea is to have a config file with flags, or you can store the flags in the CMS.</p>
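<p>As a minimal sketch of the config-file approach (the key names here are made up for illustration), each incomplete feature gets a flag that the code checks before exposing the feature:</p>

```xml
<!-- web.config - one flag per incomplete or staged feature -->
<appSettings>
  <add key="feature:NewCheckout" value="false" />
  <add key="feature:MemberPortal" value="true" />
</appSettings>
```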
<p>Merging into the <em>master</em> branch is done via pull requests, which facilitate peer reviews and ensure branch policies are satisfied before the pull request can be completed. Merging small changes as often as possible to <em>master</em> is encouraged for a quicker code review process and to get early feedback from other developers.</p>
<p>When a feature branch is merged into <em>master</em>, this triggers the CI/CD YAML pipeline to build, run any unit tests, package, and deploy to the Integration environment. Functional and end-to-end tests are done on the Integration environment; any bugs are subsequently fixed and merged to <em>master</em>.</p>
<p><img src="/link/021271a80e744de2944d7bfae0cc3e5f.aspx" width="511" height="225" /></p>
<h3>Release</h3>
<p>At the end of a sprint, or a few days before a planned monthly release, developers create a new release branch from <strong>master</strong>. It is important to have a consistent naming convention for release branches, e.g. <em>release/&lt;major.minor&gt;</em>. This version number can then be used by the release pipeline to tag the release and identify the version deployed to an environment in Azure DevOps (more on this in the next post).</p>
<p>When the release branch is pushed to the server, this triggers the Release YAML pipeline to build, create a code (NuGet) package, and upload it to Episerver DXP ready for staged deployments to Preproduction and Production (upon approval).</p>
<p>Any bugs discovered in Preproduction should be fixed off <em>master</em> and cherry-picked/merged onto the release branch. This branching model forbids developers from committing to release branches and merging them back into <em>master</em>.</p>
<p>As a tidy up activity, release branches are deleted once the release is no longer in Production.</p>
<p><img src="/link/f4122c316f5f4f85bf6fc079810144a8.aspx" width="663" height="218" /></p>
<h4>Hotfixes</h4>
<p>To fix a production bug, the normal development workflow is followed, i.e. create a branch off <em>master</em> and merge into <em>master</em> via pull request. Once completed, the fix can then be cherry-picked/merged into the release branch. Azure DevOps provides an option on completed pull requests to cherry-pick them to release branch(es) without needing to pull the release branch locally. This will open a new pull request targeting the selected release branch.</p>
<p>Following this process guarantees the fix is in <strong>master</strong> first and won't be regressed in later releases. If you fix in the release branch first, there is a risk of regression if you forget to merge it into <em>master</em> in the chaos of releasing the fix to Production.</p>
<p>If the bug can't be reproduced on <em>master</em>, then you have no choice but to do it the other way around, i.e. fix on the release branch and merge back to <em>master</em>. This should only be used as a fallback option.</p>
<p>The Release YAML pipeline can also be used to deploy the hotfix directly to Production if there is an urgent outage to resolve (more on this in the next post).</p>
<h2><span>Wrap up</span></h2>
<p>The Release Flow model may not be for every development team or project, and developers familiar with GitFlow will find it different, but I found its simplicity works well with deploying to Episerver DXP using the Deployment API. The other plus is less merge pain. I'm keen to see how the model scales after using it on a few projects.</p>
<p>In the <a href="/link/9859e5f013194c10b8d8fc7765d11c1f.aspx">next post</a> I'll be doing a deep dive into the reusable CI/CD YAML pipelines. </p>
<h2><span>References</span></h2>
<p><span><a href="https://docs.microsoft.com/en-us/azure/devops/learn/devops-at-microsoft/release-flow">https://docs.microsoft.com/en-us/azure/devops/learn/devops-at-microsoft/release-flow</a></span></p>
<p><a href="https://trunkbaseddevelopment.com/">https://trunkbaseddevelopment.com/</a></p>
<h1>Lessons from the Episerver Ascend 2019 conference</h1>
<p><em>Published 18 December 2019 - https://rangaiya.hashnode.dev/lessons-from-the-episerver-ascend-2019-conference</em></p>
<p>The Episerver Ascend 2019 conference at Miami Beach was one of the largest gatherings of the Episerver community of industry leaders, partners, customers and developers. With a plethora of sessions across product, development, and marketing tracks, plus hands-on labs, it provided excellent insights into the future direction of the platform and into what other organisations are doing from a process perspective.</p>
<p>Chad Wolf, EVP Chief Customer and Sales Officer, delivered the keynote. The main message was that to accommodate growth you must have products that add value. Episerver has delivered this added value and continues to invest in its ecosystem to drive success for customers and partners. David Bowen, Head of Product Management, followed it up with a key statistic: 41 weekly releases with over 160 features delivered to date in 2019.</p>
<h2 id="heading-customer-centricity">Customer-centricity</h2>
<p>The prevailing message throughout the event was customer-centricity and how Episerver empowers organisations to intuitively build customer-centric digital experiences. Everything, from the platform to strategy to implementation, should focus on delivering value to the customer. We envision a world where digital experiences mirror the convenience and empathy of face-to-face interactions. To understand the intent of the customer, we need to understand human interactions digitally.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1676868793443/743b1909-44c0-4197-8f11-3a132317ca25.jpeg" alt class="image--center mx-auto" /></p>
<p>A customer-centric digital experience platform includes content, commerce, personalisation, analytics, marketing automation and the ability to integrate into other applications. This is basically Episerver's core business and their product roadmap reflected this.</p>
<h3 id="heading-content-has-a-soul">Content has a soul</h3>
<p>Deane Barker, Senior Director of Content Management and a veteran of the industry, gave some interesting and passionate insights on content and its consumption.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1676869106302/2ea8d3f7-0406-4bc2-aed4-7989b182e220.jpeg" alt class="image--center mx-auto" /></p>
<p>Content represents who we are and connects human beings. He described content as having three distinct characteristics: it is created by humans through a subjective editorial process, it is ultimately intended for human consumption, and it is intended to further an organisation's goals. Content is designed to connect and influence.</p>
<p>He pointed out that websites were ultimately containers of content and expressed concern that many websites are being built with no thought for the content that's going into them:</p>
<blockquote>
<p>"We are so concerned about building the swimming pool that we often forget the water"</p>
</blockquote>
<h2 id="heading-episerver-insights-analytics">Episerver <s>Insights</s> Analytics</h2>
<p>Episerver Insights, the user interface for Episerver's customer data platform, is being given a much-needed revamp, starting with a name change to Episerver Analytics. Among the soon-to-be-released features are a new embedded analytics dashboard and the ability to segment on user interactions.</p>
<p>I got the opportunity to connect with Joakim Platbarzdis, Senior Product Manager, who acknowledged the current limitations of Episerver Insights and shared a valuable look into the product roadmap, which will enable business users and marketers to create engaging and personalised experiences using data stored in Episerver's customer data platform.</p>
<p>REST APIs have already been released that allow developers to create rules-based filter definitions and segments from these filters, with this capability soon to be available through the management UI.</p>
<h2 id="heading-episerver-foundation">Episerver Foundation</h2>
<p>Episerver Foundation is a robust accelerator solution to help partners and customers accelerate time to market and reduce costs. The solution will include production-quality templates and features and provide a reference architecture for all Episerver products. Episerver Foundation currently includes CMS, Commerce, Personalisation, Find and Social, with more to follow.</p>
<p>It is modular, allowing developers to incorporate the product-specific projects as needed in their solution.</p>
<h2 id="heading-cms-and-commerce-roadmap">CMS and Commerce Roadmap</h2>
<p>The Editor UI has undergone a refresh to make it more modern, consistent and user-friendly. Editing enhancements such as comments for content in edit mode, inline block edit and Smart Publish have already been released to make the editing process more efficient.</p>
<p>With the release of Commerce 13 earlier this year, Episerver continues to release significant product improvements. This includes improvements to the Promotion engine, Power BI embedded analytics dashboards and reports, as well as a standalone Customer Service Application (beta) as a replacement for the Commerce Manager and intended to meet the needs of a call centre or live chat.</p>
<h2 id="heading-cloud-services">Cloud Services</h2>
<p>Episerver's mantra is an always-on, always-available model, and they continue to invest in their cloud services to make things easier for business users and developers alike.</p>
<p>The PaaS Portal has been enhanced with more self-service capabilities for deployment and troubleshooting, and a Deployment API has been released in beta to facilitate seamless DevOps integration. Updates to the management UI for the SaaS-based products will now be pushed automatically, avoiding the need for developer intervention and deployment.</p>
<p>There are also plans for smoother deployments with zero downtime and for allowing content to be maintained in read-only mode.</p>
<h3 id="heading-wrapping-up">Wrapping up</h3>
<p>I left the conference with renewed confidence in the Episerver platform and excited by its continued growth as a customer-centric digital experience platform.</p>
<h1>Visitor Intelligence and Personalization - enrich and use visitor profile data</h1>
<p><em>Published 21 June 2019 - /blogs/ron-rangaiya/dates/2019/6/enrich-and-use-profile-store-data-for-personalization/</em></p>
<p>Recently I've been working with Profile Store, specifically around enriching the profile with custom behavioural data that can then be used for content personalisation. Here's a quick overview of the implementation.</p>
<h3><strong>Tracking</strong></h3>
<p><strong>Profile Store</strong> is Episerver's customer data platform for storing website visitor profile and behaviour data collected by tracking. There is detailed official documentation on how to <a href="/link/45b5ee8c872545b48ce190f771089c96.aspx">install and configure tracking for Profile Store</a>. The simplest way to start collecting tracking data is by using the [<strong>PageViewTracking</strong>] attribute.</p>
<h3><strong>Enrich Profile data</strong></h3>
<p>First, a look at the structure of the profile data that is captured out of the box. </p>
<p><img src="/link/c28f73ac60e544e4918b51b930d61389.aspx" /></p>
<p>The "Name" and "Info" properties will be populated with user data if your website has sign-in capability. The key property here is "Payload", which can be used to store any custom data.</p>
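<p>For illustration, an enriched profile might look like the sketch below; the payload property names are assumptions invented for this example, not part of the Profile Store schema:</p>

```json
{
  "Name": "Jane Smith",
  "Info": { "Email": "jane@example.com" },
  "Payload": {
    "PreferredCategory": "Running",
    "VisitCount": 12
  }
}
```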
<p>The <a href="/link/fad2fe2c46504832921d5e2eea0292a4.aspx">Profile Store API</a> makes it easy to query and update the visitor profiles. I started with a simple Payload model and a helper service with methods to retrieve and update the profile payload.</p>
<p><img src="/link/6163713a5e674aecbe955d8419d2f892.aspx" /></p>
<p><img src="/link/09abac212d2e430ca800fb684d838b03.aspx" width="827" height="591" /></p>
<p>Note that in the "AddPayloadToProfile" method I'm overwriting the profile payload property; however, it can be made smarter to preserve the existing payload in case you have multiple processes that update the payload.</p>
<p>The Profile Store adds a request cookie, "_madid", which is the DeviceId and is used to retrieve the profile.</p>
<p><img src="/link/429cda75d8a0461db5af2c4be301bfb1.aspx" /></p>
<h3><strong>Personalization</strong></h3>
<p>Once the profile has been enriched, the next step is to create custom VisitorGroups to use for personalization.</p>
<p>I created a VisitorGroup to use for simple value comparison. Depending on your custom data, you can create visitor groups to suit different criteria requirements.</p>
<p><img src="/link/3be1e9721ab54ef793d8a56d312c9cc9.aspx" /></p>
<p><img src="/link/a0ae545b426747d0b89f43f8fe8a4398.aspx" /></p>
<p>Now you can add the VisitorGroup criteria and apply them to your website content accordingly.</p>
<p><img src="/link/62b4ff7e93804fcca6addeaf21c004ed.aspx" /></p>
<p><img src="/link/80210d26fcf7447588c2c01739de38b4.aspx" /></p>
<h3><strong>Wrapping up</strong></h3>
<p>Being able to push custom data to the profile using the Profile Store API opens up endless possibilities to enrich the profile data for personalization purposes. It can be as simple as pushing data captured from a form on your website, or via a data insights tool that analyses and enriches profile data for an improved customer experience.</p>