Blog posts by Aniket Gadre2023-12-19T23:39:30.0000000Z/blogs/aniket-gadre/Optimizely WorldPart 1: Planning and estimation for Upgrade (CMS 12/Commerce 14) /blogs/aniket-gadre/dates/2023/12/upgrade-cms-12commerce-14---things-to-consider---part-1/2023-12-19T23:39:30.0000000Z<p>In this series, I will talk about Optimizely CMS/Commerce upgrade projects, with some do's and don'ts as well as tips & tricks for a successful implementation. We can all agree that, depending on the complexity of the implementation, an upgrade project has the potential to go off track pretty easily. This blog post will help you get ahead of some of these challenges. </p>
<p><span style="text-decoration: underline;">5P's principle: <strong>P</strong>roper <strong>P</strong>lanning <strong>P</strong>revents <strong>P</strong>oor <strong>P</strong>erformance.</span></p>
<p>In Part 1 of this series, let's talk through the most important (& often neglected) phase - "Planning and Estimation". </p>
<h2><strong>Planning & Estimation</strong></h2>
<p>This is one of the most difficult parts of the process. How do you estimate an upgrade? Unfortunately, I don't have a silver bullet, as each project is unique, but here are a few critical things to consider that will help you plan and estimate better.</p>
<p><span style="text-decoration: underline;"><strong>Onboarding & kickoff</strong></span>: If you are planning to use a new sidecar team for the upgrade, you need to account for setup/onboarding time for new developers plus the kickoff. </p>
<p><span style="text-decoration: underline;"><strong>Customizations</strong></span>: Take inventory of all the customizations to the core product. This is by far the biggest variable in the estimate and can cause the project to go off track quickly if you don't have a plan for supporting them.</p>
<p><span style="text-decoration: underline;"><strong>Third Party Integrations</strong></span>: With the upgrade, the underlying engine needs to be upgraded to .NET Core. It is useful to document all third-party integrations, including NuGet packages that rely on the old .NET Framework and haven't been updated to support the new .NET Core architecture. You may need to work with the client to find alternatives or remove this functionality completely. </p>
<p><span style="text-decoration: underline;"><strong>Commerce Manager</strong></span>: Commerce Manager will no longer be available in the new Commerce 14; it is replaced by Order Manager. Ensure you are documenting any updates/customizations that need to be supported in the new Order Manager for processing your orders.</p>
<p><span style="text-decoration: underline;"><strong>Linux Containers</strong></span>: The new DXP is hosted in Linux containers, unlike the existing DXP, which is hosted on Windows servers. This is a huge difference, as Windows Forms, WPF, and other Windows-specific packages will not work in the Linux containers. This is difficult to anticipate, as the upgraded project may work perfectly fine locally but throw random errors in the new DXP. </p>
<p><span style="font-size: 10pt;"><em>Did you know: Linux has removed support for TLS 1.0 & TLS 1.1? We ran into this issue after deploying to DXP although it worked perfectly fine locally (Windows vs. Linux). </em></span></p>
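<p>If you hit TLS handshake errors like this in the Linux containers, one mitigation (a sketch, assuming the remote endpoint actually supports TLS 1.2) is to pin your outbound calls to TLS 1.2, so a legacy endpoint fails consistently on Windows and Linux alike instead of only after deployment:</p>
<pre class="language-csharp"><code>using System.Net.Http;
using System.Security.Authentication;

// Restrict outbound HTTPS to TLS 1.2; a TLS 1.0/1.1-only endpoint will now
// fail locally on Windows too, not just in the Linux containers.
var handler = new HttpClientHandler
{
    SslProtocols = SslProtocols.Tls12
};
var client = new HttpClient(handler);</code></pre>
<p>The real fix, of course, is getting the third party to upgrade their endpoint to TLS 1.2 or later.</p>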
<p><span style="text-decoration: underline;"><strong>Breaking Changes</strong></span>: There are a lot of breaking changes in this version. I would recommend carefully reviewing them <a href="https://docs.developers.optimizely.com/content-management-system/docs/breaking-changes-in-content-cloud-cms-12">here</a> before the start of the project to avoid big surprises during implementation.</p>
<p><span style="text-decoration: underline;"><strong>Ongoing development & branching</strong></span>: If you have a big team of developers working on the existing project (we did in this case), you need to keep the upgrade branch updated regularly. Have a solid branching strategy figured out for keeping the old and new code lines in sync. This, along with ongoing QA, can be challenging, as new functionality and features will need to be merged in and tested. </p>
<p><span style="text-decoration: underline;"><strong>Deployments</strong></span>: Make sure you account for creating build & release pipelines, CI/CD, etc. There may be instances where the existing DXP deployment pipelines do not work with the new DXP. In our project, the bundled JS and CSS files weren't getting deployed correctly (due to differences between the .NET and .NET Core builds). </p>
<p><span style="text-decoration: underline;"><strong>Content Freeze</strong></span>: This may not impact the budget but does affect the timeline. Content freezes are essential and need to be coordinated with the client as you get closer to the finish line. Ensure you are accounting for how long the content freeze will last.</p>
<p><span style="text-decoration: underline;"><strong>Code freeze</strong></span>: It may be a good idea to freeze any code development (or keep it to minor bug fixes) before going LIVE to prevent introducing new bugs. Adding new features may add new defects, which could potentially delay the Go LIVE.</p>
<p><span style="text-decoration: underline;"><strong>Go Live Prep</strong></span>: Ensure the right people are available for GO LIVE; for ex: the IT team with DNS access needs to be involved to switch from the old to the new DXP environments (this is a two-step process). </p>
<p><span style="text-decoration: underline;"><strong>Add Buffer</strong></span>: Always buffer your estimates for unknowns. Depending on the complexity of the project, there's a chance you missed something that could take hours (if not days) to figure out and completely derail the project.</p>
<p>Hope this helps! In Part 2 of this series I will talk about implementation & known issues to bypass. </p>Serializable Carts - Website crash? Wait what?.../blogs/aniket-gadre/dates/2023/2/serializable-carts-use-with-caution/2023-03-01T04:45:05.0000000Z<p><strong>Stop and do this right now:</strong></p>
<p>If you are running a transactional ecommerce website on Optimizely, please check the health and size of your commerce database in Application Insights. This typically gets overlooked unless you have an alert set up or there's an issue on the website.</p>
<p>Okay, maybe I lied in the title. It won't be a website crash, but no new carts can be created (and naturally no orders can be created until the issue is resolved). This will be a SEV 1 issue, with the client yelling on the phone to get it resolved ASAP.<br /><br /></p>
<p><strong>What are Serializable Carts</strong>:</p>
<p>Serializable carts have been around for a while, so this is not net new functionality. Documentation here: <a href="https://docs.developers.optimizely.com/commerce/v14.0.0-commerce-cloud/docs/serializable-carts">https://docs.developers.optimizely.com/commerce/v14.0.0-commerce-cloud/docs/serializable-carts</a></p>
<p>The SerializableCarts table is pretty simple. The data column holds the entire cart as a JSON string (which can take up a good number of bytes in the DB).</p>
<p><img src="/link/069820c1f71a450da10e846c3e0cc1fa.aspx" /></p>
<p>Serializable carts are created/updated in the database every time a user adds or updates an item in the cart (because of the SaveCart() calls in your code). Saving carts in the database is useful functionality for keeping a user's history of products as well as retrieving their saved items even after they close their browser (as long as they don't delete cookies). Serializable carts are also used for wishlists, to create favorites or other lists that are stored in the database.</p>
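<p>For reference, a typical, safe add-to-cart flow keys the cart to the current contact, so repeat visits reuse one row instead of creating new ones. A sketch against the IOrderRepository API (the cart name and SKU are placeholders):</p>
<pre class="language-csharp"><code>// Reuse the visitor's existing "Default" cart row, keyed by their contact GUID
var contactId = CustomerContext.Current.CurrentContactId;
var cart = _orderRepository.LoadOrCreateCart<ICart>(contactId, "Default");

// Add or update a line item, then persist the single cart row for this contact
var lineItem = _orderGroupFactory.CreateLineItem("sample-sku", cart);
lineItem.Quantity = 1;
cart.AddLineItem(lineItem, _orderGroupFactory);
_orderRepository.Save(cart);</code></pre>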
<p><strong>Anything wrong here?</strong></p>
<pre class="language-csharp"><code>public ICart GetInMemoryCart(Guid contactGuid)
{
    // Load a cart to use as a fake cart; if not found, create one
    var cart = _orderRepository.LoadOrCreateCart<ICart>(contactGuid, "Default");
    return cart;
}

public void CallingMethodDoPromotionCalculations()
{
    // Get a fake cart to do custom calculations
    var cart = GetInMemoryCart(Guid.NewGuid());
    if (cart != null)
    {
        // Add line items to the cart
        // Basic shipping address etc.
        // Do promotion calculations on the cart to check if the user will qualify
        // More code
        _orderRepository.Save(cart);
    }
    // Return the results of the calculations
}</code></pre>
<p><strong>What's wrong?</strong></p>
<p>If you couldn't figure it out, don't worry: the issue isn't evident at first sight. The above code creates a new cart with a new GUID (meant only for in-memory calculations) in the database. If this code runs (say, on the home page) every time a user browses the website, whether they add items to the cart or not, you have a problem.</p>
<p>There are 2 things to watch out for:</p>
<ol>
<li>It's using LoadOrCreateCart() with a brand-new GUID every time, instead of a stable identifier such as CustomerContext.Current.CurrentContactId.</li>
<li>It's calling _orderRepository.Save(), which actually commits this cart to the database under the new customer GUID.</li>
</ol>
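<p>A fix, sketched under the same assumptions as the original snippet: key the cart to the real contact and, crucially, skip the Save() call for throwaway calculation carts, since an unsaved cart works fine in memory.</p>
<pre class="language-csharp"><code>public void CallingMethodDoPromotionCalculations()
{
    // Key the cart to the actual contact instead of a random GUID
    var cart = _orderRepository.LoadOrCreateCart<ICart>(
        CustomerContext.Current.CurrentContactId, "Default");

    // Add line items, shipping address, promotion calculations, etc.

    // NOTE: no _orderRepository.Save(cart) here - the in-memory cart is
    // enough for the calculations, so nothing is written to SerializableCarts.
}</code></pre>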
<p><strong>Death by a thousand paper cuts</strong>:</p>
<p>Here's a scenario where this issue can go unnoticed for days or months, until your database is full:</p>
<ol>
<li>The above piece of code was accidentally introduced on a frequently visited page, and it's creating thousands of carts every day. </li>
<li>Luckily, Optimizely provides a scheduled job called "Remove expired carts" that deletes carts that haven't been modified in the last 30 days; by default it runs once every day. </li>
<li>The issue is that the scheduled job won't get to these fake/junk carts until 30 days later, and in the meantime you've created a million carts/rows in the database. </li>
<li>So we have the scheduled job, no issue, right? (Hopefully!) However, when the scheduled job now tries to fetch all the serializable carts older than 30 days to delete, it may start failing due to timeouts. Then your website is doomed....very slowly :) because if the failing scheduled job goes unnoticed for a while, you are accumulating carts with absolutely nothing to purge them. </li>
<li>Soon enough the SerializableCarts table will grow exponentially and eat up all the allocated database space (currently set to 250 GB in DXP)</li>
<li>When this happens in production, no new carts can be created. God forbid it happens during peak traffic hours, or even worse on Cyber Monday; this will be a SEV 1 right away!!!<br /><br /></li>
</ol>
<p><strong>Short Term Fix</strong>:</p>
<ol>
<li>Call/email Optimizely customer support and ask them to increase the database size from 250 GB to 500 GB. This will resolve the issue right away, keep your website operational, and buy you some time to get to the bottom of the issue.</li>
</ol>
<p><strong>Long term Solution</strong>:</p>
<ol>
<li>Check Application Insights to see how long this has been going on and get a sense of the daily increase in database size<br /><br /></li>
<li>Ask support to run the following query to get the size of the tables in the ecommerce database. If this is a serializable cart issue, the table will show up at the top of the list. <br />
<pre class="language-markup"><code>SELECT
t.NAME AS TableName,
s.Name AS SchemaName,
p.rows,
SUM(a.total_pages) * 8 AS TotalSpaceKB,
CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS TotalSpaceMB,
SUM(a.used_pages) * 8 AS UsedSpaceKB,
CAST(ROUND(((SUM(a.used_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS UsedSpaceMB,
(SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB,
CAST(ROUND(((SUM(a.total_pages) - SUM(a.used_pages)) * 8) / 1024.00, 2) AS NUMERIC(36, 2)) AS UnusedSpaceMB
FROM
sys.tables t
INNER JOIN
sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN
sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN
sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN
sys.schemas s ON t.schema_id = s.schema_id
WHERE
t.NAME NOT LIKE 'dt%'
AND t.is_ms_shipped = 0
AND i.OBJECT_ID > 255
GROUP BY
t.Name, s.Name, p.Rows
ORDER BY
TotalSpaceMB DESC, t.Name</code></pre>
</li>
<li>Ask Optimizely support to run the following script to clean up the ecommerce database SerializableCarts table (PLEASE TEST THIS ON LOCAL AND OTHER LOWER ENVIRONMENTS FIRST)<br />
<pre class="language-markup"><code>-- The -300 is the lookback (depends on how far back this issue has been around).
-- The script will need to be run in increments by support: -300, -200, -100, down to -30.
DELETE from SerializableCart where Modified < DATEADD(day, -300, GETDATE()) and Name != 'WishList'</code></pre>
</li>
<li>Increase the frequency of the "Remove Expired Carts" scheduled job to run every hour or multiple times a day (this won't help unless you clean up the database first)<br /><br /></li>
<li>Investigate the root cause with special attention to SaveCart(), LoadOrCreateCart() and LoadCart().<br /><br /></li>
<li>Set up alerts with Optimizely to be notified when the database size grows 20% above the baseline.<br /><br /></li>
<li>Manually check that the scheduled job is functioning regularly to keep the same problem from happening again. </li>
</ol>
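<p>To support steps 1 and 7 above, a hypothetical monitoring query (this assumes the SerializableCart table has a Created datetime column; verify the column names against your schema) can show when the junk carts started piling up:</p>
<pre class="language-markup"><code>-- Daily count of new carts; a sudden, sustained spike marks the start of the issue
SELECT CAST(Created AS date) AS [Day], COUNT(*) AS CartsCreated
FROM SerializableCart
GROUP BY CAST(Created AS date)
ORDER BY [Day] DESC</code></pre>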
<p>Happy coding!</p>
<p> </p>Azure Service Bus Messaging (Topic/Queues) for transferring data between external PIM/ERP to Optimizely/blogs/aniket-gadre/dates/2023/2/using-azure-topic-queues-for-transferring-data-between-external-pim-to-optimizely/2023-02-27T03:49:41.0000000Z<p>Optimizely provides a PIM solution as a part of the DXP package. More information here: <a href="https://www.optimizely.com/product-information-management/">https://www.optimizely.com/product-information-management/</a></p>
<p>More often than not, clients have existing PIM and/or ERP systems that feed other systems in their organization. For ex: their PIM/ERP system may be serving physical stores, running reports, feeding invoicing details, and acting as the SOURCE of TRUTH. There are numerous blog posts on importing a catalog into Optimizely one time using the out-of-the-box Optimizely APIs.</p>
<p>Needless to say, as updates are made to pricing, inventory, assets, delivery charges, taxes, etc. in the ERP/PIM/DAM, we need to keep that data synchronized in the Optimizely catalog to ensure customers see the most up-to-date information on the website as quickly as possible.</p>
<p>This requires a strategy for moving content between the two systems on a regular, fault-tolerant basis. A quick solution is an Optimizely scheduled job that fetches the data and updates it in the database, though scheduled jobs come with some limitations: timeouts, low fault tolerance, logging, speed, resource constraints, alerting, etc. </p>
<p>Another alternative is to use <strong><a href="https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions">Azure Service Bus Messaging</a> </strong>to queue up the product updates from the source system (the client's PIM/ERP) and synchronize them to the Optimizely catalog on a configurable schedule. Azure Service Bus has a lot of advantages, some of which are listed below (you can also read up online). </p>
<p><strong>Advantages</strong>:</p>
<ul>
<li>Message Sessions</li>
<li>Auto-forwarding</li>
<li>Dead-lettering</li>
<li>Scheduled Delivery</li>
<li>Message deferral</li>
<li>Transactions</li>
<li>Auto-delete on idle</li>
<li>Duplicate detection</li>
<li>Geo Disaster recovery</li>
</ul>
<p>You can use the Azure Service Bus .NET SDK for integration: <a href="https://learn.microsoft.com/en-us/dotnet/api/overview/azure/service-bus?preserve-view=true&amp;view=azure-dotnet">https://learn.microsoft.com/en-us/dotnet/api/overview/azure/service-bus?preserve-view=true&view=azure-dotnet</a></p>
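<p>As a minimal illustration with the newer Azure.Messaging.ServiceBus package (the samples below use the older Microsoft.Azure.ServiceBus TopicClient API), publishing one serialized product update to a topic looks roughly like this; the connection string, topic name, and payload are placeholders:</p>
<pre class="language-csharp"><code>using Azure.Messaging.ServiceBus;

// Publish one serialized product update to the extract topic
await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender("product-extract");

await sender.SendMessageAsync(new ServiceBusMessage(productJson)
{
    MessageId = Guid.NewGuid().ToString(), // lets the topic do duplicate detection
});</code></pre>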
<p><strong>Strategy</strong>:</p>
<p>We have used the following strategy for a huge B2C retail client, and it works really well. </p>
<ol>
<li>Our custom C# function/console app (the extract job), deployed on Azure, gets all products that have been updated in the last 'x' minutes/hours by pinging a custom endpoint provided by the client</li>
<li>This function app runs on a 'TimerTrigger' configurable in the Azure function app configuration. More info on function apps: <a href="https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio?tabs=in-process">https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio?tabs=in-process</a></li>
<li>This function app is responsible for getting the data from the endpoint, serializing each message as JSON, and sending it to the ASB topic (the product extract topic)</li>
<li>A second custom C# function app (the transload job) is subscribed to the above topic in ASB using a 'ServiceBusTrigger' (it executes every time there's a new message)</li>
<li>This function app's job is to read the message from the topic, deserialize it, and update the product item using the Optimizely Service API</li>
</ol>
<p><strong>Diagram</strong>:</p>
<p><img src="/link/862c73d87f444d0fbb3d0e3304447413.aspx" /></p>
<p><strong>Sample Code (Export Job)</strong>:</p>
<pre class="language-csharp"><code>namespace ClientNamespace.Export.Features.CartCheckout.TaxSync
{
using System;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus; // for Message, TopicClient, RetryPolicy
using Microsoft.Azure.WebJobs;
using Newtonsoft.Json; // for JsonConvert
using ClientNamespace.Export.Core.Features.CartCheckout.TaxRateSync.Models;
using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants;
using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Services;
using ClientNamespace.Export.Core.Features.Infrastructure.Logging;
using ClientNamespace.Export.Features.Infrastructure.Azure.Constants;
using ClientNamespace.Export.Features.Infrastructure.Azure.Extensions;
using ClientNamespace.Export.Features.Infrastructure.Azure.Services;
using ClientNamespace.Export.Features.Infrastructure.Rfapi.Clients;
using Serilog;
using Serilog.Core;
using ConnectionStringNames = ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants.ConnectionStringNames;
using ExecutionContext = Microsoft.Azure.WebJobs.ExecutionContext;
public class TaxRatesExportFunction
{
private const int ShortCircuit = 100_000;
private IHttpClientFactory _clientFactory;
public TaxRatesExportFunction(IHttpClientFactory clientFactory)
{
_clientFactory = clientFactory;
}
#if !DEBUG // remove this line to run locally as a console app
[FunctionName("TaxRatesExport")]
#endif
public async Task Run(
[TimerTrigger(
ScheduleExpressions.TaxRatesExport,
RunOnStartup = false)]
TimerInfo myTimer)
{
var log = LoglevelWrapper.WrapLogger(Log.Logger);
try
{
log.Information("Starting TaxRatesExportFunction: {starttime}", DateTime.UtcNow);
using (var topicMessageSender = new TopicMessageSender(ConnectionStringNames.ServiceBusTaxRates, TopicNames.TaxRates, log))
{
var taxRates = await apiClient.TaxesAllAsync(); // apiClient: the client's custom API client (construction elided)
var export = new TaxRateExport
{
TaxRates = taxRates
.Select(x => new TaxRate
{
Percentage = x.TaxRate ?? 0.000,
PostalCode = x.PostalCode,
TaxCode = x.TaxCode,
TaxableDelivery = x.TaxableDelivery,
TaxablePlatinum = x.TaxablePlatinum,
})
.ToList(),
};
// Send the message to the topic to be consumed by the transload function app
try
{
var message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(export)))
{
MessageId = Guid.NewGuid().ToString(),
SessionId = "sessionid",
};
string connectionString = Environment.GetEnvironmentVariable("connectionStringName");
if (string.IsNullOrEmpty(connectionString))
{
connectionString = Environment.GetEnvironmentVariable($"CUSTOMCONNSTR_{"connectionStringName"}");
}
var topicClient = new TopicClient(connectionString, "topicName", RetryPolicy.Default);
await topicClient.SendAsync(message);
}
catch (Exception ex)
{
// logging
}
}
}
catch (Exception ex)
{
log.Error(ex, "Unhandled exception in TaxRatesExportFunction {exception}", ex);
}
finally
{
log.Information("TaxRatesExportFunction Complete: {endtime}", DateTime.UtcNow);
}
}
}
}
</code></pre>
<p><strong>Sample Code (Import Job)</strong>:</p>
<pre class="language-csharp"><code>namespace .Website.Import.Features.CartCheckout.TaxSync
{
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Infrastructure.Azure.Constants;
using Microsoft.Azure.WebJobs;
using Newtonsoft.Json;
using ClientNamespace.Export.Core.Features.CartCheckout.TaxRateSync.Models;
using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants;
using ClientNamespace.Export.Core.Features.Infrastructure.Logging;
using .Website.Core.Features.Infrastructure.Episerver.Clients;
using Serilog;
using Serilog.Context;
using ConnectionStringNames = ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants.ConnectionStringNames;
public class TaxRatesImportFunction
{
private readonly IHttpClientFactory _clientFactory;
public TaxRatesImportFunction(IHttpClientFactory clientFactory)
{
_clientFactory = clientFactory;
}
#if !DEBUG // Remove this to run locally (will be triggered when it sees a message on the topic it's subscribed to)
[FunctionName(FunctionNames.TaxRatesImport)]
#endif
public async Task Run(
[ServiceBusTrigger(
TopicNames.TaxRates,
SubscriptionNames.TaxRates,
Connection = ConnectionStringNames.ServiceBusTaxRates,
IsSessionsEnabled = true)]
string mySbMsg)
{
var log = LoglevelWrapper.WrapLogger(Log.Logger);
try
{
log.Information("Starting TaxRatesImportFunction: {starttime}", DateTime.UtcNow);
log.Debug("Tax Rates Import Message: {message}", mySbMsg);
TaxRateExport export = null;
try
{
// Get taxes from topic queue
export = JsonConvert.DeserializeObject<TaxRateExport>(mySbMsg);
}
catch (Exception ex)
{
log.Error(ex, "Could not JSON deserialize tax rates message {message} with exception {exception}", mySbMsg, ex);
}
if (export?.TaxRates == null)
{
log.Warning("Tax rates deserialized, but data was null");
return;
}
// Load taxes into Episerver
var serviceApiClient = EpiserverApiClientFactory.Create(log, _clientFactory);
foreach (var taxRate in export.TaxRates)
{
try
{
using (LogContext.PushProperty("importtaxrate", taxRate.TaxCode))
{
// Update the taxes table (either custom endpoint or using Service API)
await serviceApiClient.SaveTaxRateAsync(taxRate);
}
}
catch (Exception ex)
{
// Don't fail the group
// Custom logic to handle the exception when updating the Optimizely database.
}
}
}
catch (Exception ex)
{
log.Error(ex, "Unhandled exception in TaxRatesImportFunction {exception}", ex);
}
finally
{
log.Information("TaxRatesImportFunction Complete: {endtime}", DateTime.UtcNow);
}
}
}
}
</code></pre>
<p>As you can see, with minimal code you can create a more fault-tolerant synchronization to the Optimizely database. You can now visualize scaling this to other areas of your website. For ex: we have scaled this system to automate the processing of orders. As orders come in, the serialized order object is placed on the Azure Service Bus for automated processing, all the way to completing the orders. Yes, the client's IT team needs to write some code to automate it on their side, but it has saved them hundreds of thousands of dollars in the cost of manually keying in each order. </p>
<p>Can you think of other ways to scale the Optimizely system to use Azure Service Bus Messaging? </p>
<p>Happy coding!</p>ChatGPT/OpenAI integration for text generation using Prompt in Optimizely/blogs/aniket-gadre/dates/2023/2/gpt-open-ai-integration-text-generation-on-content-publish/2023-02-27T01:44:42.0000000Z<p>Here's how you can use a simple publishing event to generate content using OpenAI.</p>
<p>The code is pretty simple - I will avoid getting into too many details as Tomas has done a wonderful job of explaining it in his blog post here: <br /><a href="https://www.gulla.net/en/blog/integrating-generative-ai-in-optimizely-cms-a-quick-test-with-openai/">https://www.gulla.net/en/blog/integrating-generative-ai-in-optimizely-cms-a-quick-test-with-openai/</a></p>
<p>...And from Allan here:<br /><a href="https://www.codeart.dk/blog/2022/11/ai-assisted-content-creation---in-optimizely-cms--commerce-ai-series---part-2/">https://www.codeart.dk/blog/2022/11/ai-assisted-content-creation---in-optimizely-cms--commerce-ai-series---part-2/</a> </p>
<p>Here's sample code which has been requested by a few people. </p>
<pre class="language-csharp"><code>namespace ClientName.CMS.Features.Sample
{
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web;
using EPiServer;
using EPiServer.Core;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;
using EPiServer.ServiceLocation;
using Newtonsoft.Json;
using ClientName.CMS.Features.Basics.RichTextBlock.Models;
using ClientName.Common.Features.Foundation.Threading.Utilities;
[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class OpenAIBlockInitialization : IInitializableModule
{
private static readonly HttpClient _client = new HttpClient();
private readonly string _apiKey = "YOUR API KEY GOES HERE"; // You can generate it here: https://platform.openai.com/account/api-keys
public void Initialize(InitializationEngine context)
{
// Add initialization logic, this method is called once after CMS has been initialized
var contentEvents = ServiceLocator.Current.GetInstance<IContentEvents>();
contentEvents.PublishingContent += ContentEvents_PublishingContent;
}
public async Task<dynamic> SendRequestAsync(string model, string prompt, int maxTokens)
{
var requestUrl = "https://api.openai.com/v1/engines/" + model + "/completions";
var requestData = new
{
prompt = prompt,
max_tokens = maxTokens
};
var jsonRequestData = JsonConvert.SerializeObject(requestData);
var requestContent = new StringContent(jsonRequestData, System.Text.Encoding.UTF8, "application/json");
_client.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", _apiKey);
var response = await _client.PostAsync(requestUrl, requestContent);
response.EnsureSuccessStatusCode();
var responseContent = await response.Content.ReadAsStringAsync();
var responseData = JsonConvert.DeserializeObject<dynamic>(responseContent);
return responseData;
}
private void ContentEvents_PublishingContent(object sender, EPiServer.ContentEventArgs e)
{
try
{
if (e.Content != null)
{
if (e.Content is RichTextBlock richTextBlock)
{
var blockData = e.Content as RichTextBlock;
string textToOpenAI = blockData.OpenAIPrompt;
// Call Open AI and get results
var response = AsyncHelper.RunSync(async () => await SendRequestAsync("text-davinci-003", textToOpenAI, 3000));
blockData.OpenAIGeneratedText = response?.choices[0]?.text;
}
}
}
catch (Exception ex)
{
// Optional logging
}
}
public void Uninitialize(InitializationEngine context)
{
// Add uninitialization logic
var contentEvents = ServiceLocator.Current.GetInstance<IContentEvents>();
contentEvents.PublishingContent -= ContentEvents_PublishingContent;
}
}
}</code></pre>
<p>AsyncHelper class to run async method synchronously (due to initialization functions limitations)</p>
<pre class="language-csharp"><code>namespace ClientName.Common.Features.Foundation.Threading.Utilities
{
using System;
using System.Threading;
using System.Threading.Tasks;
public static class AsyncHelper
{
private static readonly TaskFactory TaskFactory =
new TaskFactory(CancellationToken.None, TaskCreationOptions.None, TaskContinuationOptions.None, TaskScheduler.Default);
/// <summary>
/// Executes an async Task method which has a void return value synchronously
/// USAGE: AsyncUtil.RunSync(() => AsyncMethod());
/// </summary>
/// <param name="task">Task method to execute</param>
public static void RunSync(Func<Task> task) => TaskFactory.StartNew(task).Unwrap().GetAwaiter().GetResult();
/// <summary>
/// Executes an async Task<T> method which has a T return type synchronously
/// USAGE: T result = AsyncUtil.RunSync(() => AsyncMethod<T>());
/// </summary>
/// <typeparam name="TResult">Return Type</typeparam>
/// <param name="task">Task<T> method to execute</param>
/// <returns></returns>
public static TResult RunSync<TResult>(Func<Task<TResult>> task) =>
TaskFactory.StartNew(task).Unwrap().GetAwaiter().GetResult();
}
}</code></pre>
<p>Happy coding!</p>The beauty of Decorator pattern in Optimizely/blogs/aniket-gadre/dates/2023/2/the-beauty-of-decorator-pattern-in-optimizely/2023-02-26T23:21:43.0000000Z<p>Decorator pattern is one of my favorite design patterns for backend code development.</p>
<p><span>From wikipedia:</span></p>
<p><span>A decorator pattern is a design pattern that allows a behavior to be added to an individual object </span><span>dynamically, without affecting the behavior of other objects from the same class.</span></p>
<p><span><img src="/link/61c8ea2655be4aa09b909b11ab10fcd4.aspx" /></span></p>
<p><span><strong>Advantages</strong>: </span></p>
<ul>
<li><span>Helps you extend the behavior of classes/services without modifying them. </span></li>
<li><span>Helps enforce the single responsibility principle (one class, one responsibility) and the open/closed principle (classes can be extended but not modified). </span></li>
<li><span>More efficient than subclassing, because an object's behavior can be augmented without defining an entirely new object. </span></li>
<li><span>Commonly used for caching, keeping that layer separate (cache keys can be made unique per functionality)</span></li>
<li><span>Additional scenarios - logging, alerting, processing etc.</span></li>
</ul>
<p><span><strong>Implementation</strong>:</span></p>
<p><span>A simple example: an alert needs to be sent every time an order is submitted, or whenever there's an unhandled exception in the order service after a user submits an order. It might be tempting to add an 'alert sender email' dependency directly to the main order service class. However, if we stick to the SRP and open/closed SOLID principles, the order service should perform only one job (submit the order). </span></p>
<p>In that case, one way to extend the behavior of the order service class is to create a new class that implements the same interface (IOrderSubmitService) and sends the email. This means you don't need to add a new interface (unlike subclassing), which keeps interfaces slim and indirectly supports the interface segregation principle. </p>
<p>Sample code:</p>
<pre class="language-csharp"><code>namespace RF.Website.CMS.Features.CartCheckout.OrderSubmit.Alerting
{
    using System;
    using System.Threading.Tasks;
    using RF.Website.CMS.Features.CartCheckout.OrderSubmit.Services;
    using RF.Website.CMS.Features.CartCheckout.OrderSubmit.ViewModels;
    using RF.Website.Common.Features.Foundation.Alerts.Services;

    public class AlertingOrderSubmitService : IOrderSubmitService
    {
        private readonly IOrderSubmitService _implementation;
        private readonly IAlertSender _alertSender;

        public AlertingOrderSubmitService(
            IOrderSubmitService orderSubmitService,
            IAlertSender alertSender)
        {
            _implementation = orderSubmitService ?? throw new ArgumentNullException(nameof(orderSubmitService));
            _alertSender = alertSender ?? throw new ArgumentNullException(nameof(alertSender));
        }

        public async Task<OrderSubmitViewModel> SubmitOrderAsync(string associateName, int cartVersion, string kountSessionId)
        {
            try
            {
                // Potential to add code here to send an email after every successful submission.
                return await _implementation.SubmitOrderAsync(associateName, cartVersion, kountSessionId);
            }
            catch (Exception exception)
            {
                string subject = "SubmitOrderAsync Error";
                string body = "An error occurred while calling SubmitOrderAsync.";
                await _alertSender.SendAlertAsync(subject, body, exception);
                throw;
            }
        }
    }
}</code></pre>
<p>The statement in the try block is the one that calls the actual implementation of the submit order service.</p>
<p>The IOrderSubmitService:</p>
<pre class="language-csharp"><code>namespace ClientName.CartCheckout.OrderSubmit.Services
{
    using System.Threading.Tasks;
    using ClientName.CartCheckout.OrderSubmit.ViewModels;

    public interface IOrderSubmitService
    {
        // Signature matches the decorator implementation above.
        Task<OrderSubmitViewModel> SubmitOrderAsync(string associateName, int cartVersion, string kountSessionId);
    }
}</code></pre>
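<p>The StructureMap registration later in this post references a DefaultOrderSubmitService. For completeness, here is a minimal sketch of what such a concrete implementation could look like; the class body is illustrative only, not the actual implementation:</p>
<pre class="language-csharp"><code>namespace ClientName.CartCheckout.OrderSubmit.Services
{
    using System.Threading.Tasks;
    using ClientName.CartCheckout.OrderSubmit.ViewModels;

    // Illustrative sketch: the real service would contain the actual
    // order submission logic (cart validation, payment capture, etc.).
    public class DefaultOrderSubmitService : IOrderSubmitService
    {
        public async Task<OrderSubmitViewModel> SubmitOrderAsync(string associateName, int cartVersion, string kountSessionId)
        {
            // ... submit the order to the commerce backend ...
            return await Task.FromResult(new OrderSubmitViewModel());
        }
    }
}</code></pre>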
<p>Next, you need to ensure the above code wraps the main implementation by using the decorator pattern. Luckily, this comes as part of StructureMap and can easily be incorporated into your code.</p>
<pre class="language-csharp"><code>public void ConfigureContainer(ServiceConfigurationContext context)
{
    context.StructureMap().Configure(container =>
    {
        container.For<IOrderSubmitService>().Use<DefaultOrderSubmitService>();
        // Can be used for logging or extending other behaviors of the Submit Order service:
        // container.For<IOrderSubmitService>().DecorateAllWith<LoggingOrderSubmitService>();
        container.For<IOrderSubmitService>().DecorateAllWith<AlertingOrderSubmitService>();
    });
}</code></pre>
<p>That's it. Add a breakpoint in AlertingOrderSubmitService to see it in action: every call will hit the wrapper/decorator class first and then flow into your concrete implementation of the functionality.</p>
<p>Happy coding!</p>Sending Files to Amazon S3 storage/blogs/aniket-gadre/dates/2023/2/optimizely-sending-files-to-amazon-s3-storage/2023-02-25T00:40:35.0000000Z<p>I am sure you have been asked by clients to create a scheduled job that generates a report and sends it to a third-party system for further processing.</p>
<p>One of our clients asked us to generate a daily report of all completed orders for further processing in Snowflake. We looked at multiple options, but the one that stood out was creating a scheduled job and storing these report/CSV files in Amazon's S3 storage (considering it was already heavily used by the client). </p>
<p>The integration with Amazon S3 was simpler than I had thought.</p>
<h2>Step 1: Keys/Credentials for S3 storage</h2>
<p>Ask the client to generate the keys and the bucket, and provide you with the following:</p>
<ul>
<li>User Name</li>
<li>Access Key ID</li>
<li>Secret Access Key</li>
</ul>
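<p>Note: the upload code later in this post creates the AmazonS3Client with only a region, which relies on the SDK discovering credentials from the environment or config. If you want to supply the keys above explicitly instead, one way is shown below (the appSettings key names here are my own illustrative choices):</p>
<pre class="language-csharp"><code>using Amazon;
using Amazon.Runtime;
using Amazon.S3;

// BasicAWSCredentials lets you pass the Access Key ID and Secret Access Key directly.
var credentials = new BasicAWSCredentials(
    System.Configuration.ConfigurationManager.AppSettings["AWSAccessKeyId"],
    System.Configuration.ConfigurationManager.AppSettings["AWSSecretAccessKey"]);

var client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);</code></pre>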
<h2>Step 2: Install S3 Browser</h2>
<p>This tool allows you to browse all the buckets within the client's S3 storage. You need to use the above credentials to browse the reports generated and published to the bucket.</p>
<p>URL: https://s3browser.com/download.aspx</p>
<p><img src="/link/344572b8fba648e082c39b624e20d11d.aspx" /></p>
<h2>Step 3: Install the AWS S3 SDK</h2>
<p><img src="/link/b7d8a299774c4c73a3ad2d31ff05eea1.aspx" /></p>
<h2>Step 4: Code</h2>
<p>Add the required code to transfer CSV files to AWS S3 storage.</p>
<pre class="language-csharp"><code>namespace ClientName.Orders.DailyOrders.Services
{
    using System;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Transfer;

    public static class AmazonUploaderService
    {
        public static bool SendMyFileToS3(System.IO.Stream fileStream, string fileNameInS3)
        {
            try
            {
                // Credentials are resolved from the environment/config; only the region is specified here.
                var client = new AmazonS3Client(RegionEndpoint.USEast1);

                // Create a TransferUtility instance, passing it the IAmazonS3 client created above.
                TransferUtility utility = new TransferUtility(client);

                // Build the upload request: target bucket, key (file name in S3) and content stream.
                TransferUtilityUploadRequest request = new TransferUtilityUploadRequest();
                request.BucketName = System.Configuration.ConfigurationManager.AppSettings["AWSBucketName"];
                request.Key = fileNameInS3;
                request.InputStream = fileStream;

                utility.Upload(request);
            }
            catch (Exception)
            {
                // Consider logging the exception here before returning.
                return false;
            }

            return true; // Indicates that the file was sent.
        }
    }
}</code></pre>
<p>Call the above function using the following code.</p>
<pre class="language-csharp"><code>string _s3FileName = $"{DateTime.UtcNow.ToString("yyyy-MM-dd-hh-mm-ss")}-Snowflake-W2-orders.csv";
StringBuilder _report = new StringBuilder("Col1, Col2, Col3, Col4..."); // List of all orders built using a StringBuilder
bool uploaded = false;

if (!string.IsNullOrWhiteSpace(_report?.ToString()))
{
    byte[] byteArray = Encoding.ASCII.GetBytes(_report.ToString());
    using (MemoryStream memoryStream = new MemoryStream(byteArray))
    {
        uploaded = AmazonUploaderService.SendMyFileToS3(memoryStream, _s3FileName);
    }
}</code></pre>
<h2>Step 5: Test using S3 Browser</h2>
<p>Once the code runs successfully, you should be able to view a list of all the generated reports using the S3 Browser you installed in Step 2.</p>
<p>That's it! Happy coding :)</p>Alexa skill integration with Episerver - Part 1/blogs/aniket-gadre/dates/2019/3/alexa-skill-for-episerver/2019-03-05T20:38:25.0000000Z<p>We all know Episerver is a very powerful enterprise CMS. Content authors and marketers have complete control over content, personalization and analytics, as well as access to user data, at their fingertips. </p>
<p>With that said, the technological landscape is changing, and so is user interaction with new, innovative devices. In my opinion, websites (including responsive mobile websites) will always be the most popular way of presenting information, but as developers & marketers we should be prepared for the giant wave of new trends heading our way. Example: "<span><em>50% of all searches will be voice searches by 2020</em>"</span>. Episerver is prepared with its headless API to allow developers and marketers to ride the wave with ease. The Episerver headless API allows serving the same content to various devices, including voice devices, mobile apps and others. </p>
<p>I recently implemented an <span style="text-decoration: underline;"><strong>Alexa skill</strong></span> that queries Episerver CMS and returns meaningful data to an Alexa device. Though this is a very basic version/proof-of-concept of an Alexa skill, it opens up a whole lot of possibilities depending on your users' interaction with the website. The Alexa skill I implemented reads out the two latest <span style="text-decoration: underline;">news</span> or <span style="text-decoration: underline;">events</span> from a website. This can very well be extended to "<em>Give me the store locations for XYZ near Boston</em>", "<em>Are there any new promotions for ABC?</em>", "<em>Company's profit summary for this quarter</em>". You get the point :) </p>
<p>As an end user, if you can get quick information from your favorite brand without having to spend time searching for it on a website, it's a huge value add in terms of convenience and saved time. No more booting up a device, navigating to a website, typing in a search box, or enduring frustrating UI and performance issues. Simply ask "Alexa" what you need and listen while getting ready for work or during commercials on TV. If you are a technology (or tech-savvy) organization, it's even more important to demonstrate innovation and let your users know that you always "keep up" with new trends in technology.</p>
<p><strong>Alexa Skill</strong>:</p>
<p>The core Alexa concept is pretty simple. It consists of:</p>
<ol>
<li><strong>Alexa Skills Kit </strong>(front-end code) - <a href="https://developer.amazon.com/alexa/console/ask">https://developer.amazon.com/alexa/console/ask</a> </li>
<li><strong>Lambda function</strong> (Back-end code) - <a href="https://aws.amazon.com/lambda/">https://aws.amazon.com/lambda/</a> </li>
</ol>
<p>You will need to setup 2 accounts:</p>
<ol>
<li><strong>Amazon</strong> <strong>Developer</strong> account (to configure front end interactions using Amazon provided UI)</li>
<li><strong>AWS</strong> account (to host the back end code called lambda function)</li>
</ol>
<p><strong>NOTE</strong>: The Amazon Developer account has a <span style="text-decoration: underline;"><strong>BETA </strong><strong>feature</strong></span> that allows you to host the skill and the code in the same interface, which is super convenient and easy to understand. This is highly recommended if you want to get your Alexa skill up and running in minutes. When you create a new skill, select the option "Alexa Hosted (Beta)" and you should be able to host and update the code in the same Amazon Developer account. </p>
<p><span style="text-decoration: underline;"><strong>Definitions</strong></span>:</p>
<p><strong>Skill Name</strong>: The name of the skill that will be used when you publish your skill to Amazon.</p>
<p><strong>Invocation Name</strong>: The term a user calls out to invoke/start an interaction with your skill. For example, if the invocation name is "Fun Demo", the user can say: Alexa, open "Fun Demo", or Alexa, start "Fun Demo".</p>
<p><strong>Intent</strong>: <span>Intents allow you to specify what a user will say to invoke the skill. For example: Get me the latest news, or find me the closest stores in the Boston area. You can create a custom intent, as well as update the out-of-the-box Amazon-provided intents (such as CancelIntent or HelpIntent).</span></p>
<p><span><strong>Slots</strong>: Slots are nothing but parameters you can pass to the intent to allow dynamic terms. For example: Order me {number} {size} pizzas. The terms number and size are two dynamic parameters passed to the intent.</span></p>
<p><strong>Endpoint</strong>: The endpoint connects your front-end code (invocation, intents) to the back-end code. If you are using the Alexa-hosted beta feature, no configuration is necessary. If the back-end code is self-hosted (or a REST endpoint), these values need to be configured. The endpoint can be a REST endpoint that returns valid data, or a Lambda function that hosts your back-end code. </p>
<p>To get started, as a first step I recommend setting up a simple Web API REST endpoint in Episerver that returns a JSON object. Ideally you would want to set up the Episerver headless API, but for a quick demo a simple REST endpoint should be enough.</p>
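<p>As a rough illustration of that first step, a simple Web API controller in Episerver could look like the sketch below. The news root reference and the use of the page name/URL are assumptions about your content model, not a definitive implementation:</p>
<pre class="language-csharp"><code>public class AlexaNewsController : ApiController
{
    // Assumed ContentReference id of the news container page (illustrative).
    private static readonly ContentReference NewsRoot = new ContentReference(42);

    [HttpGet]
    public IHttpActionResult LatestNews()
    {
        var contentLoader = ServiceLocator.Current.GetInstance<IContentLoader>();

        // Return the two most recently published news pages as a JSON object.
        var latest = contentLoader.GetChildren<PageData>(NewsRoot)
            .OrderByDescending(p => p.StartPublish)
            .Take(2)
            .Select(p => new { Title = p.PageName, Url = p.LinkURL });

        return Ok(latest);
    }
}</code></pre>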
<p>I will get into the details of Alexa skill implementation and integrating it with Episerver in my next blog post.</p>
<p><br />Stay tuned!</p>Max & Min element validation for Content Area or Link Collection/blogs/aniket-gadre/dates/2018/10/max--min-validators-for-content-area-or-link-collection/2018-10-07T04:35:21.0000000Z<p>Recently we had a business requirement to set minimum & maximum limits for blocks in Content Areas. While we can educate content authors to set the correct number of blocks in a content area, it's always recommended to add validation within the CMS to avoid human error.</p>
<p>Here's the code to set the maximum number of blocks in a Content Area or a Link Item Collection. </p>
<pre class="language-csharp"><code>/// <summary>
/// Sets the maximum element count in a link collection, a content area - or any other type of collection.
/// </summary>
[AttributeUsage(AttributeTargets.Property, AllowMultiple = false)]
public class MaxElementsAttribute : ValidationAttribute, IMetadataAware
{
    public int MaxCount { get; set; }

    public MaxElementsAttribute(int maxElementsInList)
    {
        this.MaxCount = maxElementsInList;
    }

    public void OnMetadataCreated(ModelMetadata metadata)
    {
        //TODO: Use to disable editor drag and drop at a certain point.
    }

    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        if (value == null)
        {
            return null;
        }

        if (value is LinkItemCollection)
        {
            if ((value as LinkItemCollection).Count > MaxCount)
            {
                return new ValidationResult("This field exceeds the maximum limit of " + MaxCount + " items");
            }
        }
        else if (value is ContentArea)
        {
            if ((value as ContentArea).Count > MaxCount)
            {
                return new ValidationResult("This field exceeds the maximum limit of " + MaxCount + " items");
            }
        }

        return null;
    }
}</code></pre>
<p>On the Content Area field (or Link Item Collection) set the MaxElements attribute as shown below. </p>
<pre class="language-csharp"><code>[Display(
    Name = "Items",
    Description = "Items",
    GroupName = SystemTabNames.Content,
    Order = 30)]
[MaxElements(25)]
public virtual ContentArea Items { get; set; }</code></pre>
<p>You can use the same logic for setting the minimum number of elements as well. </p>
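<p>As an illustration, a minimum-count counterpart following the same structure as the MaxElementsAttribute above could look like this (a sketch, assuming a missing value should count as zero items):</p>
<pre class="language-csharp"><code>[AttributeUsage(AttributeTargets.Property, AllowMultiple = false)]
public class MinElementsAttribute : ValidationAttribute
{
    public int MinCount { get; set; }

    public MinElementsAttribute(int minElementsInList)
    {
        this.MinCount = minElementsInList;
    }

    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        // A missing value counts as zero items.
        int count = (value as LinkItemCollection)?.Count
            ?? (value as ContentArea)?.Count
            ?? 0;

        if (count < MinCount)
        {
            return new ValidationResult("This field requires a minimum of " + MinCount + " items");
        }

        return null;
    }
}</code></pre>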
<p>Happy coding :)</p>Boston's Episerver Developer/Editor meetup/blogs/aniket-gadre/dates/2018/5/boston-summer-2018-episerver-developereditor-meetup/2018-05-29T22:26:41.0000000Z<!DOCTYPE html>
<html>
<head>
</head>
<body>
<p>Join us on <strong>Tuesday, June 12th 2018</strong> from <strong>5:30 - 7:30 pm</strong> for the first Episerver Developer/Editor Meetup in Boston hosted by Rightpoint.</p>
<p>Speakers include <a href="https://www.linkedin.com/in/robfolan/">Rob Folan</a> from <a href="http://www.episerver.com/">Episerver</a> & <a href="https://www.linkedin.com/in/aniketdgadre/">myself</a> from <a href="https://www.rightpoint.com/">Rightpoint</a>.</p>
<p>Rob will talk about the cool refreshing summer updates to the Episerver platform (Insight, Perform, Advance & Campaign) and I will talk about the hot Digital Experience Cloud.</p>
<p>To continue with the theme, we will be serving HOT pizza and COLD beverages.</p>
<p>For more information and to RSVP, please visit the Eventbrite page <a href="https://www.eventbrite.com/e/episerver-developer-editor-boston-summer-meetup-tickets-45780188777">here</a>.</p>
<p>Hope to see you all there!</p>
</body>
</html>Dynamic Drop down list editable by content authors using ISelectionFactory & Property List/blogs/aniket-gadre/dates/2018/4/dynamic-drop-down-list-editable-by-content-authors-using-iselectionfactory---property-list/2018-04-27T16:00:51.0000000Z<!DOCTYPE html>
<html>
<head>
</head>
<body>
<p>On one of my recent projects, we wanted the ability for content authors to manage a list of dropdown items through the CMS. Though it's simple to create a list of items that can be retrieved from a constants class or an appSettings config, it does require a developer to make that change. This also means a deployment to production is needed for a simple name/value change in a dropdown.</p>
<p>To get around this standard implementation, we implemented a simple Property List on the Start Page and referenced the property in the ISelectionFactory implementation to get the key/value pairs.</p>
<p>The dropdown list can now be managed by the content authors without relying on the developer to make a change. </p>
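<p>The selection factory below reads an AccountTypePropertyList property from the Start Page. For reference, here is a sketch of how such a property list could be defined; the item type name and its single Text field are assumptions based on the factory code:</p>
<pre class="language-csharp"><code>// Illustrative item type for the property list.
public class AccountTypeItem
{
    public virtual string Text { get; set; }
}

// Backing property definition so the CMS can store the list.
[PropertyDefinitionTypePlugIn]
public class AccountTypeItemProperty : PropertyList<AccountTypeItem>
{
}

// On the StartPage content type:
[EditorDescriptor(EditorDescriptorType = typeof(CollectionEditorDescriptor<AccountTypeItem>))]
public virtual IList<AccountTypeItem> AccountTypePropertyList { get; set; }</code></pre>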
<p>Here's the code:</p>
<pre class="language-csharp"><code>public class AccountPropertiesFactory : ISelectionFactory
{
    public IEnumerable<ISelectItem> GetSelections(ExtendedMetadata metadata)
    {
        var contentLoader = ServiceLocator.Current.GetInstance<IContentLoader>();
        var startPage = contentLoader.Get<StartPage>(ContentReference.StartPage);
        var selectItems = new List<SelectItem>();

        foreach (var accountProperty in startPage.AccountTypePropertyList)
        {
            selectItems.Add(new SelectItem()
            {
                Text = accountProperty.Text,
                Value = accountProperty.Text
            });
        }

        return selectItems.ToArray();
    }
}</code></pre>
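<p>To wire the factory to a dropdown on a content type, it can be referenced from a SelectOne attribute (the property name below is illustrative):</p>
<pre class="language-csharp"><code>// "AccountType" is an illustrative property name on a page or block type.
[SelectOne(SelectionFactoryType = typeof(AccountPropertiesFactory))]
public virtual string AccountType { get; set; }</code></pre>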
</body>
</html>Continuous Integration & Deployment with VSTS & DXC (Azure Integration environment)/blogs/aniket-gadre/dates/2017/4/continuous-builddeployment-with-vsts--dxc/2017-04-04T21:03:31.7770000Z<!DOCTYPE html>
<html>
<head>
</head>
<body>
<p>I recently implemented continuous integration & deployment using Visual Studio Team Services in Epi's DXC environment. Though I now feel it's pretty simple, it can be a little tricky if you have no prior experience configuring it.<br /><br /></p>
<h2>What is CI/CD?</h2>
<p>For folks who are new to build & deployment process, Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early. <span>Because you’re integrating so frequently, there is significantly less back-tracking to discover where things went wrong, so you can spend more time building features. Continuous Deployment (CD) is an extension of CI, where you can auto-deploy your integrated changes to the server (Integration environment in case of DXC). This significantly reduces the risks, costs and effort to deploy and allows you to deliver features to the clients sooner.<br /><br /></span></p>
<h2>Publish Profile:</h2>
<p>One important thing you need to wrap your head around is the concept of a publish profile. If you are using the Epi DXC service, you have Azure portal access to the Integration environment (<a href="https://portal.azure.com">https://portal.azure.com</a>). If you don't, you should ask Episerver Managed Services to give you the access you need. The publish profile allows you to publish your code to the Azure/Integration environment from Visual Studio (by setting up a publishing profile) or VSTS (by adding it to the build definition). I prefer using VSTS since there are a lot more configuration options available to you. </p>
<p>Here's how you download the publishing profile from Azure (Integration environment).</p>
<p>App Services (left menu) > Select your App Service. You should see the "Get publish profile" button in your rightmost pane.</p>
<p><img src="/link/8326df1d6cda47279d2412fc96f022e3.aspx" alt="Image PublishProfile.GIF" /> </p>
<p>You will need this in order to integrate with VSTS for CI/CD. <br /><br /></p>
<h2>Build Definition:</h2>
<p>VSTS allows you to setup multiple build definitions.</p>
<p>You can have one build definition called "Develop CI" (that builds the develop branch) & one called "Integration CI/CD" (that builds & also deploys the code from the master branch). By separating these build definitions, you can make sure that only master branch code (that is final) gets auto-deployed to the Integration environment. </p>
<p><img src="/link/427d0c62540b46cda6e6c84f53d0a7cd.aspx" width="664" alt="Image Build-Definition-3.GIF" height="332" /></p>
<p>For the Integration deployment, under MSBuild Arguments you can provide details using the publishing profile you downloaded from the Azure Integration environment. This will publish your changes after successfully building the "master" branch. Note that "Get Sources" points to the master branch vs. the develop branch, as shown above.</p>
<p><img src="/link/52062711434a4493950a4312a1dfae35.aspx" width="1362" alt="Image Integration-Build-Definition.GIF" height="472" /></p>
<p>You can also set up triggers to run a build on every check-in (or on a schedule) for the master branch, so when someone (hopefully a team lead/architect) pushes to the master branch, VSTS builds the solution (including NuGet and gulp processes) and publishes your changes to the Integration environment without any manual intervention.</p>
<p>This is just a basic process to set up CI/CD in the DXC environment. You can get more creative with a lot of the built-in features of VSTS.</p>
<p>Hope this helps! Comments are welcome...</p>
<p>AG </p>
</body>
</html>Episerver Certification - Tips & Tricks!!!/blogs/aniket-gadre/dates/2017/3/episerver-certification-9-0/2017-03-24T15:48:49.6330000Z<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta name="author" content="Aniket Gadre" />
<meta name="description" content="Episerver certification, exam tips, tips and tricks" />
<meta name="keywords" content="Episerver Certification, exam tips, episerver cms, CMS, Episerver exam" />
<meta property="og:title" content="Episerver Certification - Tips & Tricks!" />
<meta property="og:image" content="http://cdn.certmag.com/wp-content/uploads/2015/07/Passed-the-exam-300x202.jpg" />
<title>Episerver Certification - Tips & Tricks!</title>
</head>
<body>
<p>Yesterday, I passed the Episerver 9 Certification, so I'm sharing my thoughts while it's fresh in my mind.</p>
<p><img src="http://cdn.certmag.com/wp-content/uploads/2015/07/Passed-the-exam-300x202.jpg" alt="Image result for exam you passed" /></p>
<h2><span style="text-decoration: underline;">Preparation</span>:</h2>
<ul>
<li>If you have attended "<strong>Bootcamp</strong>", make sure you know the materials covered in the class inside out. This is your bible for the exam. I would say, at least 30-40% of the exam questions will be based on the material covered during the course. About 30-40% will be based on your understanding of these concepts. The rest of the questions can be based on admin, editor guides, SDK or sample Alloy website.</li>
<li>Alloy, Alloy, Alloy. Did I mention the <strong>Alloy</strong> website? Look at every single method and class (if it doesn't make sense, google is your friend).</li>
<li>Have at least one or two projects under your belt. To give you an analogy - it's the difference between reading a book on swimming and actually swimming in the water.</li>
<li>Understand all areas of CMS like editing interface, gadgets, mirroring, personalization, multi-site setup, languages, friendly urls, access rights, scheduled jobs etc. by clicking on each link and/or reading the admin/editor guides. </li>
<li>Know all areas of development like CRUD operations on blocks/pages, templates, caching, modules, add ons, filtering, all base classes & APIs, localization, attributes, custom editor, dynamic data etc.</li>
<li>Focus on the concepts. Ask yourself two questions - "<strong>How</strong> does it work" & "<strong>Why</strong> does it work that way". </li>
</ul>
<h2><span style="text-decoration: underline;">About the Exam</span>:</h2>
<p>Knowledge areas you will be tested on:</p>
<ul>
<li>Production knowledge</li>
<li>Installation</li>
<li>Content Model</li>
<li>Creating Websites</li>
<li>Advanced Concepts</li>
</ul>
<p>It's a multiple choice exam with <strong>64 questions</strong> and 120 mins, so roughly <strong>2 mins per question</strong>. I don't think time is an issue because I completed mine 45 mins before time. One tip I would like to give: if you aren't confident about something, skip the question and come back to it later, because you can't go back and change an answer once you have submitted it.</p>
<p>There are 3 types of multiple choice questions:</p>
<ol>
<li>Select <strong>ONE</strong> of the following 4 options.</li>
<li>Select <strong>TWO</strong> of the following 4 options.</li>
<li>Which of these is <strong>NOT</strong> true of the 4 options.</li>
</ol>
<p>The last two are a little tricky because if you answer one out of the two wrong, your answer will be marked as incorrect. Be careful, because the options can be deceptively similar :) </p>
<h2><span style="text-decoration: underline;">Before Exam (Remote)</span>:</h2>
<ul>
<li>You will receive an email from Episerver about the instructions 24-48 hours before the exam. Make sure you read these instructions carefully. This is a closed book (application) exam, so make sure you have nothing around you (including cell phones) and close all your applications. No multiple screens allowed during the exam.</li>
<li>Book a quiet conference room with a good internet connection (preferably wired). </li>
<li>For remote exam, make sure your computer/laptop is compatible based on the instructions provided. This includes web camera, microphone, operating system etc.</li>
<li>I would recommend you take the practice test (included in the instructions), so you are familiar with the remote proctoring software and aren't figuring it out during the exam.</li>
</ul>
<h2><span style="text-decoration: underline;">During Exam</span>:</h2>
<ul>
<li>Keep your cool and don't rush. Some of the questions are difficult so you may have to wing it at the end. But you will get enough easy/medium level questions to pass the exam. </li>
<li>Use some educated guesses. As long as you know the concepts on "How" and "Why" it will be easy for you to pick the right answer based on what you know.</li>
<li>Don't leave the room or talk to anyone and avoid any distractions. </li>
</ul>
<h2><span style="text-decoration: underline;">After Exam</span>:</h2>
<p>You will receive a PASS or FAIL right after you complete the exam. If you passed the exam, you will receive an official email from Episerver after reviewing your recording within 5 working days. If you fail, you will be able to re-take the exam after 21 days. </p>
<p>Hope this helps, if you have any questions reach out to me on <a href="mailto:agadre@rightpoint.com">agadre@rightpoint.com</a></p>
<p>Good luck!</p>
<p>AG</p>
</body>
</html>