<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom"><title type="text">Blog posts by Aniket Gadre</title><link href="http://world.optimizely.com" /><updated>2026-03-12T02:21:00.0000000Z</updated><id>https://world.optimizely.com/blogs/aniket-gadre/</id> <generator uri="http://world.optimizely.com" version="2.0">Optimizely World</generator> <entry><title>Optimizely Commerce vs Composable Commerce: What Should You Do with CMS 13?</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2026/3/my-pov-on-optimizely-cms-13-/" /><id>&lt;p&gt;As organizations modernize their digital experience platforms, a common architectural question emerges:&amp;nbsp;Should we continue using Optimizely Commerce with CMS 13, or move to a composable commerce platform?&lt;/p&gt;
&lt;p&gt;This decision is becoming increasingly important as companies adopt headless frontends, API-driven architectures, and AI-powered content workflows. While Optimizely continues to provide a strong enterprise CMS platform, the broader commerce ecosystem has evolved significantly in recent years.&lt;/p&gt;
&lt;p&gt;In this post, I&#39;ll break down the two primary approaches, the trade-offs between them, and how to determine which model is right for your organization.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;The Traditional Optimizely Stack: CMS + Commerce&lt;/h2&gt;
&lt;p&gt;Historically, most enterprise Optimizely implementations used a tightly integrated stack combining CMS and Commerce.&lt;/p&gt;
&lt;h3&gt;Typical Architecture&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;Frontend (MVC / Next.js) -&amp;gt; Optimizely CMS -&amp;gt; Optimizely Commerce Connect 14 -&amp;gt; Catalog / Cart / Orders
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In this model:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;CMS manages content, pages, and marketing experiences.&lt;/li&gt;
&lt;li&gt;Commerce manages the product catalog, pricing, cart, checkout, and orders.&lt;/li&gt;
&lt;li&gt;Both systems run within the same application environment.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Advantages&lt;/h3&gt;
&lt;p&gt;The integrated approach offers several benefits:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Strong content-commerce integration&lt;/strong&gt;&lt;br /&gt;Editors can easily blend marketing content with product experiences.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Built-in commerce functionality&lt;/strong&gt;&lt;br /&gt;Commerce includes catalog management, promotions, pricing, carts, and order management.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Mature enterprise platform&lt;/strong&gt;&lt;br /&gt;Optimizely Commerce has powered many large-scale digital commerce implementations.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Unified editorial experience&lt;/strong&gt;&lt;br /&gt;Marketing and merchandising teams operate within a single system.&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;Challenges&lt;/h3&gt;
&lt;p&gt;However, this architecture can introduce limitations:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Scaling commerce independently from CMS can be difficult.&lt;/li&gt;
&lt;li&gt;Deployments and releases are often tightly coupled.&lt;/li&gt;
&lt;li&gt;Customization can become complex over time.&lt;/li&gt;
&lt;li&gt;Innovation cycles may lag behind newer SaaS commerce platforms.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These challenges are one reason many organizations are exploring a different approach.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;The Rise of Composable Commerce&lt;/h2&gt;
&lt;p&gt;In recent years, many companies have shifted toward composable commerce architectures, where content and commerce platforms are separated and integrated through APIs.&lt;/p&gt;
&lt;h3&gt;Typical Architecture&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;Frontend (Next.js / React) -&amp;gt; Optimizely CMS 13 -&amp;gt; GraphQL / APIs -&amp;gt; Composable Commerce Platform
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Examples of composable commerce platforms include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;commercetools&lt;/li&gt;
&lt;li&gt;Shopify&lt;/li&gt;
&lt;li&gt;Elastic Path&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In this model:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;CMS focuses on content and experience.&lt;/li&gt;
&lt;li&gt;Commerce platforms handle transactions, catalogs, pricing, and orders.&lt;/li&gt;
&lt;li&gt;The frontend integrates both systems via APIs.&lt;/li&gt;
&lt;/ul&gt;
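&lt;p&gt;To make the composition concrete, here is a minimal, purely illustrative sketch of the merge the frontend performs: editorial fields come from the CMS response, transactional fields from the commerce API. All field names and payload shapes here are hypothetical and not tied to any particular platform.&lt;/p&gt;

```python
# Illustrative sketch only: field names and payload shapes are hypothetical,
# not tied to Optimizely CMS or any specific commerce platform.

def merge_product_page(cms_block, commerce_product):
    """Combine editorial content (CMS) with transactional data (commerce)."""
    return {
        # Editorial fields from the CMS: copy, SEO, layout
        "headline": cms_block.get("headline"),
        "body": cms_block.get("body"),
        # Transactional fields from the commerce platform
        "sku": commerce_product.get("sku"),
        "price": commerce_product.get("price"),
        "inStock": commerce_product.get("available", False),
    }

page = merge_product_page(
    {"headline": "Spring Sale", "body": "Save on running shoes."},
    {"sku": "SHOE-42", "price": 89.99, "available": True},
)
```

&lt;p&gt;In a real hybrid stack this merge would typically live in the Next.js frontend or a thin backend-for-frontend layer, with each source fetched over its own API.&lt;/p&gt;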
&lt;h3&gt;Benefits&lt;/h3&gt;
&lt;p&gt;Composable commerce introduces several architectural advantages.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Independent scalability&lt;/strong&gt;&lt;br /&gt;Commerce services can scale separately from content platforms.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Faster innovation&lt;/strong&gt;&lt;br /&gt;Modern SaaS commerce vendors release features rapidly.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Best-of-breed architecture&lt;/strong&gt;&lt;br /&gt;Organizations can choose specialized platforms for each capability.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Better alignment with headless development&lt;/strong&gt;&lt;br /&gt;API-first commerce platforms integrate seamlessly with frameworks like Next.js and React.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Trade-offs&lt;/h3&gt;
&lt;p&gt;However, composable commerce also introduces complexity.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Integration work increases.&lt;/li&gt;
&lt;li&gt;Architecture governance becomes more important.&lt;/li&gt;
&lt;li&gt;Editorial experiences may require additional tooling to replicate traditional CMS-commerce workflows.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;Why This Conversation Is Happening Now&lt;/h2&gt;
&lt;p&gt;Three major industry trends are driving this architectural shift.&lt;/p&gt;
&lt;h3&gt;1. Headless Frontends&lt;/h3&gt;
&lt;p&gt;Many organizations are moving to modern frontend frameworks such as Next.js and React.&lt;/p&gt;
&lt;p&gt;These frameworks work best with API-first services, making composable commerce platforms a natural fit.&lt;/p&gt;
&lt;h3&gt;2. Rapid Innovation in Commerce Platforms&lt;/h3&gt;
&lt;p&gt;Commerce vendors like commercetools and Elastic Path are delivering rapid innovation in areas such as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;subscription commerce&lt;/li&gt;
&lt;li&gt;marketplace models&lt;/li&gt;
&lt;li&gt;advanced promotions&lt;/li&gt;
&lt;li&gt;global scaling capabilities&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;3. Optimizely&amp;rsquo;s Strategic Focus&lt;/h3&gt;
&lt;p&gt;Optimizely has been heavily investing in:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;content management&lt;/li&gt;
&lt;li&gt;experimentation&lt;/li&gt;
&lt;li&gt;personalization&lt;/li&gt;
&lt;li&gt;AI workflows through Opal&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Commerce remains part of the ecosystem, but many organizations are evaluating whether external commerce platforms better align with modern architecture strategies.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;When Optimizely Commerce Still Makes Sense&lt;/h2&gt;
&lt;p&gt;For many existing customers, staying with Optimizely Commerce remains the most practical choice.&lt;/p&gt;
&lt;p&gt;Continuing with Commerce Connect is typically ideal when:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;You already operate an Optimizely Commerce implementation&lt;/li&gt;
&lt;li&gt;Catalog complexity is moderate&lt;/li&gt;
&lt;li&gt;Editorial and merchandising teams rely on tight CMS integration&lt;/li&gt;
&lt;li&gt;Replatforming costs would be significant&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;For these organizations, CMS 13 paired with Commerce 14 remains a stable and proven architecture.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;When Composable Commerce Is the Better Choice&lt;/h2&gt;
&lt;p&gt;Composable commerce may be the better option when:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Building a new commerce platform&lt;/li&gt;
&lt;li&gt;Operating at global scale&lt;/li&gt;
&lt;li&gt;Supporting marketplaces or subscription models&lt;/li&gt;
&lt;li&gt;Adopting microservices-based architectures&lt;/li&gt;
&lt;li&gt;Delivering experiences across multiple channels (web, mobile, kiosks, apps)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In these scenarios, separating the experience layer from the transaction engine can offer significant flexibility.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;The Hybrid Model Many Enterprises Are Adopting&lt;/h2&gt;
&lt;p&gt;Interestingly, many organizations are landing somewhere in the middle.&lt;/p&gt;
&lt;p&gt;Instead of replacing Optimizely CMS, they use it as the experience orchestration layer, while delegating commerce operations to a composable platform.&lt;/p&gt;
&lt;h3&gt;Hybrid Architecture&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;Next.js Frontend -&amp;gt; Optimizely CMS -&amp;gt; Composable Commerce API -&amp;gt; ERP / PIM / Payment Services
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In this approach:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;CMS manages content and experience orchestration&lt;/li&gt;
&lt;li&gt;Commerce platforms handle transactions and catalog services&lt;/li&gt;
&lt;li&gt;Frontend applications unify the experience&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;Final Thoughts&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;There is no universal answer to the Optimizely Commerce vs composable commerce question.&lt;/li&gt;
&lt;li&gt;For existing Optimizely customers, continuing with Commerce 14 is often the most pragmatic choice.&lt;/li&gt;
&lt;li&gt;For new digital commerce initiatives, however, many organizations are evaluating composable commerce platforms alongside Optimizely CMS to build more flexible architectures.&lt;/li&gt;
&lt;li&gt;The key is understanding your organization&amp;rsquo;s priorities: editorial workflows, architectural flexibility, innovation velocity, and long-term platform strategy.&lt;/li&gt;
&lt;li&gt;Optimizely CMS remains a powerful enterprise content platform. The real question is simply how commerce fits into the future architecture around it.&lt;/li&gt;
&lt;/ul&gt;</id><updated>2026-03-12T02:21:00.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Creating a custom tool for Opal AI in Google Cloud using Python SDK</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2025/11/unlock-opal-productivity-step-by-step-process-of-creating-a-custom-tool-for-ado/" /><id>&lt;div class=&quot;markdown markdown-main-panel tutor-markdown-rendering enable-updated-hr-color&quot;&gt;
&lt;div class=&quot;markdown markdown-main-panel tutor-markdown-rendering enable-updated-hr-color&quot;&gt;
&lt;p&gt;I had the opportunity to participate in the Opal AI Hackathon challenge, where we built a custom tool using Optimizely&#39;s Opal Python SDK.&lt;/p&gt;
&lt;p&gt;This article walks you through building the tool in Python and deploying it securely on Google Cloud Run.&amp;nbsp;The goal of this tool is to enable an Opal Agent to instantly create a fully-detailed Azure DevOps (ADO) User Story from a simple request.&lt;/p&gt;
&lt;h2&gt;The Core: The Optimizely Opal Tools SDK&lt;/h2&gt;
&lt;p&gt;Optimizely provides a variety of SDKs to connect to their services. For an Opal Agent to use your service, it first needs a blueprint defining what your tool does, what inputs it needs, and how to execute it. This blueprint is called the&amp;nbsp;&lt;strong&gt;Tool Manifest&lt;/strong&gt;, exposed via a standardized `/discovery`&amp;nbsp;endpoint.&lt;/p&gt;
&lt;p&gt;The Python Opal Tools SDK abstracts away the complexity of managing this API contract. By simply decorating a standard Python function with `@tool`, the SDK automatically handles:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Generating the required OpenAPI-compatible &lt;strong&gt;Discovery Manifest&lt;/strong&gt; at `/discovery`.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Routing incoming `POST` requests to the correct function.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Validating and parsing the JSON input based on your Pydantic model.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;Tool Service Code (main.py)&lt;/h3&gt;
&lt;p&gt;The main.py file is the entry point of the service; it uses FastAPI for routing and the Opal SDK for tool definition. You can define a single application instance that hosts multiple tool endpoints. In this case, I built a single tool endpoint.&lt;/p&gt;
&lt;pre class=&quot;language-python&quot;&gt;&lt;code&gt;import os
import base64
import httpx
from fastapi import FastAPI
from pydantic import BaseModel, Field
from opal_tools_sdk import ToolsService, tool
from dotenv import load_dotenv

# --- Configuration ---
# In production, load these from os.environ for security
ADO_ORG = &quot;rightpoint&quot;
ADO_PROJECT = &quot;Optimizely-Opal-Challenge-2025&quot;
load_dotenv()
ADO_PAT = os.environ.get(&quot;ADO_PAT&quot;)

tag = &quot;opal-2025&quot;

# Encode PAT for Basic Auth
auth_str = f&quot;:{ADO_PAT}&quot;
b64_auth = base64.b64encode(auth_str.encode()).decode()
HEADERS = {
    &quot;Authorization&quot;: f&quot;Basic {b64_auth}&quot;,
    &quot;Content-Type&quot;: &quot;application/json-patch+json&quot;
}

# --- App Setup ---
app = FastAPI()
# This initializes the /discovery endpoint automatically
service = ToolsService(app)

# --- Parameters Model ---
class UserStoryParams(BaseModel):
    title: str = Field(..., description=&quot;The title of the user story&quot;)
    description: str = Field(..., description=&quot;Detailed description of the user story&quot;)
    acceptance_criteria: str | None = Field(None, description=&quot;Acceptance criteria for the story&quot;)

# --- The Tool Definition ---
@tool(
    name=&quot;create_ado_user_story&quot;,
    description=&quot;Creates a new User Story in Azure DevOps with a title and description.&quot;,
)

async def create_ado_user_story(params: UserStoryParams):
    &quot;&quot;&quot;
    Creates a User Story in Azure DevOps.
    &quot;&quot;&quot;
    url = f&quot;https://dev.azure.com/{ADO_ORG}/{ADO_PROJECT}/_apis/wit/workitems/$User%20Story?api-version=7.1&quot;

    # Azure DevOps requires a JSON Patch document
    payload = [
        {
            &quot;op&quot;: &quot;add&quot;,
            &quot;path&quot;: &quot;/fields/System.Title&quot;,
            &quot;value&quot;: params.title
        },
        {
            &quot;op&quot;: &quot;add&quot;,
            &quot;path&quot;: &quot;/fields/System.Description&quot;,
            &quot;value&quot;: params.description
        },
        {
            &quot;op&quot;: &quot;add&quot;,
            &quot;path&quot;: &quot;/fields/System.Tags&quot;,
            &quot;value&quot;: tag
        }
    ]

    if params.acceptance_criteria:
        payload.append({
            &quot;op&quot;: &quot;add&quot;,
            &quot;path&quot;: &quot;/fields/Microsoft.VSTS.Common.AcceptanceCriteria&quot;,
            &quot;value&quot;: params.acceptance_criteria
        })

    response = None
    try:
        # Use httpx.AsyncClient for non-blocking I/O inside an async function
        async with httpx.AsyncClient(timeout=30.0) as client:
            response = await client.post(url, headers=HEADERS, json=payload)
            response.raise_for_status()
            data = response.json()

        # Return a dictionary directly. This dictionary is the FINAL return value
        # and should not be awaited by the SDK&#39;s wrapper.
        return {
            &quot;status&quot;: &quot;success&quot;,
            &quot;id&quot;: data.get(&quot;id&quot;),
            &quot;link&quot;: data.get(&quot;_links&quot;, {}).get(&quot;html&quot;, {}).get(&quot;href&quot;),
            &quot;message&quot;: f&quot;User Story #{data.get(&#39;id&#39;)} created successfully.&quot;
        }
    except httpx.RequestError as e: 
        # Handle all httpx communication errors (DNS, connection, etc.)
        response_text = &quot;No response body available.&quot; if response is None else response.text
        print(f&quot;HTTPX Request Error: {str(e)}\nDetails: {response_text}&quot;)
        return {
            &quot;status&quot;: &quot;error&quot;,
            &quot;message&quot;: f&quot;Connection/Request Error: {str(e)}&quot;,
            &quot;details&quot;: response_text
        }
    except Exception as e:
        # Handle all other exceptions (including raise_for_status errors)
        response_text = &quot;N/A&quot;
        if response is not None and response.text:
            response_text = response.text
            
        print(f&quot;Unexpected Error: {str(e)}\nResponse Body: {response_text}&quot;)
        return {
            &quot;status&quot;: &quot;error&quot;,
            &quot;message&quot;: f&quot;An unexpected error occurred: {str(e)}&quot;,
            &quot;details&quot;: response_text
        }

# Run locally for testing
if __name__ == &quot;__main__&quot;:
    import uvicorn
    # Run with the uvicorn ASGI server so the async endpoints are served correctly.
    uvicorn.run(app, host=&quot;0.0.0.0&quot;, port=8000)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;When writing tools for high-performance cloud environments like Cloud Run, it&#39;s essential to use asynchronous (async) code. This prevents a single network request (like waiting for the slow ADO API response) from blocking the entire Python process, allowing the server to handle dozens of requests simultaneously.&lt;/p&gt;
&lt;p&gt;We achieve this by defining the function as `async def` and using the asynchronous HTTP client, `httpx`.&lt;/p&gt;
&lt;p&gt;The tool constructs the ADO JSON Patch payload, uses an injected Personal Access Token (PAT) for authentication, and executes the asynchronous network call.&lt;/p&gt;
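&lt;p&gt;The payload-construction logic above can also be factored into a pure function, which makes it easy to unit-test without any network call to Azure DevOps. This is a sketch based on the code in this post, using the same standard ADO work item field paths:&lt;/p&gt;

```python
# Sketch: the JSON Patch construction from main.py, factored into a pure
# function that can be tested without calling Azure DevOps.

def build_user_story_payload(title, description, acceptance_criteria=None, tag="opal-2025"):
    """Return the JSON Patch document Azure DevOps expects for a new User Story."""
    payload = [
        {"op": "add", "path": "/fields/System.Title", "value": title},
        {"op": "add", "path": "/fields/System.Description", "value": description},
        {"op": "add", "path": "/fields/System.Tags", "value": tag},
    ]
    # Acceptance criteria is optional, mirroring the tool's parameter model
    if acceptance_criteria:
        payload.append({
            "op": "add",
            "path": "/fields/Microsoft.VSTS.Common.AcceptanceCriteria",
            "value": acceptance_criteria,
        })
    return payload

patch = build_user_story_payload("Login page", "As a user, I want to log in.")
```

&lt;p&gt;The async tool function then only needs to serialize this document and POST it, which keeps the HTTP concerns separate from the payload logic.&lt;/p&gt;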
&lt;h2&gt;Deployment: Building on Google Cloud Run&lt;/h2&gt;
&lt;p&gt;To make your tool publicly accessible to Optimizely Opal, we deploy it as a serverless container on Google Cloud Run.&lt;/p&gt;
&lt;h3&gt;Deployment Files&lt;/h3&gt;
&lt;p&gt;We use a `requirements.txt`&amp;nbsp;to manage dependencies and a `Dockerfile`&amp;nbsp;for the deployment.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;a) Dependencies (`requirements.txt`): &lt;/strong&gt;This file lists the dependencies that must be installed for the application to run.&lt;/p&gt;
&lt;pre class=&quot;language-python&quot;&gt;&lt;code&gt;fastapi==0.110.0
uvicorn==0.27.1
httpx
optimizely-opal.opal-tools-sdk
pydantic
python-dotenv&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;strong&gt;b) Container Blueprint (`Dockerfile`):&lt;/strong&gt;&lt;/p&gt;
&lt;pre class=&quot;language-python&quot;&gt;&lt;code&gt;# Use the official lightweight Python image.
# https://hub.docker.com/_/python
FROM python:3.12-slim

# Allow statements and log messages to immediately appear in the Knative logs
ENV PYTHONUNBUFFERED True

# Copy local code to the container image.
ENV APP_HOME /app
WORKDIR $APP_HOME
COPY . ./

# Install production dependencies.
RUN pip install --no-cache-dir -r requirements.txt

# Run the web service on container startup using the uvicorn ASGI server.
# Cloud Run injects the PORT environment variable at runtime.
CMD exec uvicorn main:app --host 0.0.0.0 --port $PORT&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This Dockerfile copies the application code, installs the dependencies, and explicitly defines the startup command. For tighter security, consider also creating a non-root user in the image and switching to it with a USER instruction.&lt;/p&gt;
&lt;h3&gt;Deployment Steps (gcloud CLI)&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Secure the PAT:&lt;/strong&gt; Upload your Azure DevOps PAT to Google Cloud Secret Manager rather than baking it into the container image or source code.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Build and Deploy:&lt;/strong&gt; Use the `gcloud` CLI to build the container from source and deploy, mapping the secret to the `ADO_PAT`&amp;nbsp;environment variable.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;&lt;strong&gt;Enable APIs (if necessary):&lt;/strong&gt; Note that you may need to enable billing on your Google Cloud project for certain services to work.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;gcloud services enable cloudbuild.googleapis.com run.googleapis.com secretmanager.googleapis.com&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;strong&gt;Build and deploy the application:&lt;/strong&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;gcloud run deploy opal-ado-tool \
    --region us-central1 \
    --source . \
    --allow-unauthenticated&lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;Integrating the Tool into Optimizely Opal&lt;/h2&gt;
&lt;p&gt;Once deployed, your service provides a public URL (e.g., `https://opal-ado-tool-xyz.run.app`).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Register Discovery Endpoint&lt;/strong&gt;: In the Optimizely Opal UI, register the tool using the public URL appended with `/discovery`.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Agent Workflow&lt;/strong&gt;: Configure an Agent Workflow step to first synthesize the necessary ADO parameters (Org, Project, Title, Description) from an unstructured request, and then feed that structured JSON output directly into the `create_ado_user_story` tool.&lt;/p&gt;
&lt;p&gt;By bridging the gap between conversational AI input and your development systems, you empower your Optimizely agents to become powerful, action-oriented contributors to your product delivery lifecycle.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;</id><updated>2025-12-01T03:06:59.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Getting around asynchronous limitations in Optimizely (Visitor Groups etc.)</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2025/8/audiencevisitor-groups-asynchronous-limitation-/" /><id>&lt;p&gt;Optimizely has a powerful personalization engine that allows creating custom Audiences/Visitor Groups. It comes with one limitation, though: it doesn&#39;t support async operations. The solution below can work for any scenario where you cannot run async operations in Optimizely.&lt;/p&gt;
&lt;p&gt;Here&#39;s a real world use case and how we got around it.&lt;/p&gt;
&lt;p&gt;Personalizing user experiences often starts with knowing where your visitor is coming from. One lightweight approach is to translate the user&#39;s IP address into a &lt;strong&gt;zip code&lt;/strong&gt; and personalize the experience based on location (a state-run promotion, for example). In Optimizely CMS 12 (running on ASP.NET Core 8), one way to do this is by using&amp;nbsp;&lt;strong&gt;middleware&lt;/strong&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Why Middleware?&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Visitor Groups&lt;/strong&gt; have a key limitation: visitor group criteria must run synchronously. This means you can&amp;rsquo;t safely call an external API or await an async lookup while evaluating a visitor group, which rules out many IP-to-zip services. Using AsyncHelper.RunSync() as outlined &lt;a href=&quot;https://marisks.net/2017/04/02/calling-async-methods-within-episerver-events/&quot;&gt;here&lt;/a&gt; isn&#39;t a scalable workaround either: it can cause thread pool starvation and deadlocks in your application. ASP.NET Core middleware, on the other hand, runs for every request (and can be filtered) and fully supports async operations, making it an ideal place for concerns like geolocation, logging, authentication, and request enrichment. By resolving the zip code once at the start of the pipeline, you avoid duplicate lookups in controllers, blocks, or views, while also avoiding blocking calls to external services.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Step 1: Create a Request-Scoped Container&lt;/h2&gt;
&lt;p&gt;We&amp;rsquo;ll use a scoped class to store the resolved zip code so it&amp;rsquo;s easily accessible through DI.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public sealed class GeoContext
{
    public string? ZipCode { get; set; }
}&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;Step 2: Define an IP &amp;rarr; Zip Resolver&lt;/h2&gt;
&lt;p&gt;This is your service that turns an IP address into a zip code. You can use a 3rd-party API, MaxMind, or your own data source.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt; public interface IZipFromIpResolver
{
    Task&amp;lt;string?&amp;gt; GetZipFromIpAsync(string ip, CancellationToken ct = default);
}
 &lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;Step 3: Implement Middleware&lt;/h2&gt;
&lt;p&gt;The middleware gets the client IP, calls the resolver, and stores the result in GeoContext.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt; public sealed class GeoZipMiddleware
{
    private readonly RequestDelegate _next;

    public GeoZipMiddleware(RequestDelegate next) =&amp;gt; _next = next;

    public async Task InvokeAsync(HttpContext context, GeoContext geo, IZipFromIpResolver resolver)
    {
      // Filter out any paths to avoid calling the service on these requests
       var path = context.Request.Path.Value ?? &quot;&quot;;
       if (!path.StartsWith(&quot;/assets&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.StartsWith(&quot;/static&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.StartsWith(&quot;/api/custom&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.Contains(&quot;/globalassets&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.Contains(&quot;/.well-known&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.Contains(&quot;/siteassets&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.Contains(&quot;/Errors&quot;, StringComparison.OrdinalIgnoreCase) &amp;amp;&amp;amp;
       !path.Contains(&quot;/contentassets&quot;, StringComparison.OrdinalIgnoreCase)) // Optimizely blobs
       {
        var ip = context.Connection.RemoteIpAddress?.ToString();
        if (!string.IsNullOrWhiteSpace(ip))
        {
            try
            {
                // Call your GeoLocation Service
                geo.ZipCode = await resolver.GetZipFromIpAsync(ip);
                // You can also save it within the httpContext
                context.Items[&quot;ZipCode&quot;] = geo.ZipCode;
            }
            catch
            {
                // log if needed; don&amp;rsquo;t block the request
            }
        }
       }
       await _next(context);
    }
}
 &lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;Step 4: Register Services &amp;amp; Middleware&lt;/h2&gt;
&lt;p&gt;Update Program.cs:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;var builder = WebApplication.CreateBuilder(args);

// Add request-scoped container
builder.Services.AddScoped&amp;lt;GeoContext&amp;gt;();

// Add resolver implementation
// builder.Services.AddSingleton&amp;lt;IZipFromIpResolver, MyIpResolver&amp;gt;();

builder.Services.AddHttpContextAccessor();

var app = builder.Build();

// Use middleware early in the pipeline
app.UseMiddleware&amp;lt;GeoZipMiddleware&amp;gt;();

app.UseRouting();
app.UseAuthorization();

app.MapContent();

app.Run();&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Step 5: Use in Controllers, Blocks, or Views&lt;/h2&gt;
&lt;p&gt;Since GeoContext&amp;nbsp;is scoped, you can inject it anywhere. You can also retrieve the value from HttpContext.Items (set in the middleware), for example inside a visitor group criterion&#39;s IsMatch method:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public override bool IsMatch(IPrincipal principal, HttpContext httpContext)
{
    // Read the zip code resolved by the middleware earlier in the pipeline
    var zipCode = httpContext.Items[&quot;ZipCode&quot;] as string;

    // do something with it
    return true;
}&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;Notes &amp;amp; Best Practices&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Caching&lt;/strong&gt;: IP-to-zip lookups can be expensive. Use ISynchronizedObjectCache&amp;nbsp;or a distributed cache to reduce API calls.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Privacy&lt;/strong&gt;: Treat IP and location data as personal information (GDPR/CCPA). Avoid persisting unless you have a clear use case.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Performance&lt;/strong&gt;: Consider skipping static asset and other URL requests inside your middleware for speed.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;Wrapping Up&lt;/h2&gt;
&lt;p&gt;By leveraging ASP.NET Core middleware in Optimizely CMS 12, you can easily enrich each request with geolocation or any other data. This is powerful and can significantly improve the performance of the website by removing blocking calls from the application.&lt;/p&gt;</id><updated>2025-08-16T11:09:08.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Part 1: Planning and estimation for Upgrade (CMS 12/Commerce 14) </title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/12/upgrade-cms-12commerce-14---things-to-consider---part-1/" /><id>&lt;p&gt;In this series, I will talk about Optimizely CMS/Commerce upgrade projects, with some do&#39;s and don&#39;ts as well as tips &amp;amp; tricks for a successful implementation. We can all agree that, depending on the complexity of the implementation, an upgrade project can go off track pretty easily. This blog post will help you get ahead of some of these challenges.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;5P&#39;s principle: &lt;strong&gt;P&lt;/strong&gt;roper &lt;strong&gt;P&lt;/strong&gt;lanning &lt;strong&gt;P&lt;/strong&gt;revents &lt;strong&gt;P&lt;/strong&gt;oor &lt;strong&gt;P&lt;/strong&gt;erformance.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;In Part 1 of this series, let&#39;s talk through the most important (&amp;amp; often neglected) phase - &quot;Planning and Estimation&quot;.&amp;nbsp;&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Planning &amp;amp; Estimation&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;This is one of the most difficult parts of the process. How do you estimate an upgrade? Unfortunately, I don&#39;t have a silver bullet, as each project is unique, but here are a few critical things to consider that will help you plan and estimate better.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Onboarding &amp;amp; kickoff&lt;/strong&gt;&lt;/span&gt;: If you are planning to use a new sidecar team for the upgrade, account for setup/onboarding time for new developers plus the kickoff.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Customizations&lt;/strong&gt;&lt;/span&gt;: Take inventory of all the customizations to the core product. This is by far the biggest variable in the estimate and can cause the project to go off track quickly if you don&#39;t have a plan for supporting them.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Third Party Integrations&lt;/strong&gt;&lt;/span&gt;: With the upgrade, the underlying engine moves to .NET Core. Document all third-party integrations, including NuGet packages that rely on the old .NET Framework and haven&#39;t been updated to support the new .NET Core architecture. You may need to work with the client to find alternatives or remove the functionality completely.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Commerce Manager&lt;/strong&gt;&lt;/span&gt;: Commerce Manager is no longer available in the new Commerce 14; it is replaced by Order Manager. Ensure you document any updates/customizations that need to be supported in the new Order Manager for processing your orders.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Linux Containers&lt;/strong&gt;&lt;/span&gt;: The new DXP is hosted in Linux containers, unlike the existing DXP, which is hosted on Windows servers. This is a huge difference, as Windows Forms, WPF, and other Windows-specific packages will not work in the Linux containers. This is difficult to anticipate, as the upgraded project may work perfectly fine locally but throw random errors in the new DXP.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;font-size: 10pt;&quot;&gt;&lt;em&gt;Did you know: Linux has removed support for TLS 1.0 &amp;amp; TLS 1.1 certificates? We ran into this issue after deploying to DXP although it worked perfectly fine locally (Windows vs. Linux).&amp;nbsp;&lt;/em&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Breaking Changes&lt;/strong&gt;&lt;/span&gt;: There are a lot of breaking changes in this version. I would recommend carefully reviewing them &lt;a href=&quot;https://docs.developers.optimizely.com/content-management-system/docs/breaking-changes-in-content-cloud-cms-12&quot;&gt;here&lt;/a&gt; before the start of the project to avoid any big surprises during implementation.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Ongoing development &amp;amp; branching&lt;/strong&gt;&lt;/span&gt;: If you have a large team of developers working on the existing project (we did in this case), you need to keep the upgrade branch updated regularly. Have a solid branching strategy for keeping the old and new in sync. This, along with ongoing QA, can be challenging as new functionality and features will need to be merged in and tested.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Deployments&lt;/strong&gt;&lt;/span&gt;: Make sure you account for creating build &amp;amp; release pipelines, CI/CD etc. There may be instances where the existing DXP deployment pipelines do not work with the new DXP. In our project, the bundled js and css files weren&#39;t getting deployed correctly (due to differences in .NET &amp;amp; .NET Core build).&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Content Freeze&lt;/strong&gt;&lt;/span&gt;: This may not impact the budget but does affect the timeline. Content freezes are essential and need to be coordinated with the client as we get closer to the finish line. Ensure you account for how long the content freeze will last and plan the cutover dates accordingly.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Code freeze&lt;/strong&gt;&lt;/span&gt;: It may be a good idea to freeze any code development (or keep it to minor bug fixes) before going LIVE to prevent introducing new bugs. Adding new features may add new defects, which could potentially delay the Go LIVE.&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Go Live Prep&lt;/strong&gt;&lt;/span&gt;: Ensure the right people are available for GO LIVE, ex: IT team with DNS access needs to be involved to switch from old to the new DXP environments (this is a two step process).&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;&lt;strong&gt;Add Buffer&lt;/strong&gt;&lt;/span&gt;: Always buffer your estimates for unknowns. Depending on the complexity of the project, there&#39;s a chance you missed something which could potentially take hours (if not days) to figure out &amp;amp; completely derail the project.&lt;/p&gt;
&lt;p&gt;Hope this helps! In Part 2 of this series I will talk about implementation &amp;amp; known issues to bypass.&amp;nbsp;&lt;/p&gt;</id><updated>2023-12-19T23:39:30.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Serializable Carts - Website crash? Wait what?...</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/2/serializable-carts-use-with-caution/" /><id>&lt;p&gt;&lt;strong&gt;Stop and do this right now:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;If you are running a transactional ecommerce website on Optimizely, please check the health and size of your commerce database in Application Insights. This typically gets overlooked unless you have an alert set up or there&#39;s an issue on the website.&lt;/p&gt;
&lt;p&gt;Okay, maybe I lied in the title. It won&#39;t be a website crash, but no new carts can be created (and naturally no orders can be created until the issue is resolved). This will be a SEV 1 issue with the client yelling on the phone to get it resolved ASAP.&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What are Serializable Carts&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;Serializable carts have been around for a while, so it&#39;s not net new functionality. Documentation here: &lt;a href=&quot;https://docs.developers.optimizely.com/commerce/v14.0.0-commerce-cloud/docs/serializable-carts&quot;&gt;https://docs.developers.optimizely.com/commerce/v14.0.0-commerce-cloud/docs/serializable-carts&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;The SerializableCarts table is pretty simple. The data column holds the entire cart as a JSON string (which can use up some good bytes in the DB).&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/069820c1f71a450da10e846c3e0cc1fa.aspx&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Serializable carts are created/updated in the database every time a user adds or updates an item in the cart (because of the SaveCart() calls in your code). Saving carts in the database is useful for keeping the user&#39;s product history and for retrieving their saved items even after they close their browser (as long as they don&#39;t delete cookies). Serializable carts are also used for the &#39;Wishlist&#39; feature to create favorites or other lists that are stored in the database.&lt;/p&gt;
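&lt;p&gt;For context, a typical (simplified) add-to-cart flow looks something like this; every save upserts the JSON blob for that contact in the SerializableCart table. This is a sketch that assumes an injected IOrderRepository and IOrderGroupFactory and a placeholder SKU:&lt;/p&gt;

```csharp
// Sketch: the normal add-to-cart flow, assuming _orderRepository (IOrderRepository)
// and _orderGroupFactory (IOrderGroupFactory) are injected. "SKU-123" is a placeholder.
var cart = _orderRepository.LoadOrCreateCart<ICart>(
    CustomerContext.Current.CurrentContactId, "Default");

var lineItem = _orderGroupFactory.CreateLineItem("SKU-123", cart);
lineItem.Quantity = 1;
cart.AddLineItem(lineItem);

// This call writes/updates the JSON row for this contact in SerializableCart.
_orderRepository.Save(cart);
```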
&lt;p&gt;&lt;strong&gt;Anything wrong here?&lt;/strong&gt;&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public ICart GetInMemoryCart(Guid contactGuid)
{
    // Load cart to get a fake cart, if not create one
    var cart = _orderRepository.LoadOrCreateCart&amp;lt;ICart&amp;gt;(contactGuid, &quot;Default&quot;);

    return cart;
}

public void CallingMethodDoPromotionCalculations()
{
    // Get a fake cart to do custom calculations
    var cart = GetInMemoryCart(Guid.NewGuid());

    if (cart != null)
    {
        // Add line items to the cart
        // Basic shipping address etc.
        // Do promotion calculations on the cart to check if user will qualify
        // More code
        _orderRepository.Save(cart);
    }

    // Return the results of the calculations
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;strong&gt;What&#39;s wrong?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;If you couldn&#39;t spot the problem, don&#39;t worry: the issue isn&#39;t evident at first sight. The above code creates a new cart with a new GUID (meant for in-memory calculations) in the database. If this code runs on, say, the home page, then every time a user browses to the website (whether they add items to the cart or not) you&#39;ve got a problem.&lt;/p&gt;
&lt;p&gt;There are 2 things to watch out for:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;It uses LoadOrCreateCart() with a new GUID every time, instead of an identifier tied to the visitor such as CustomerContext.Current.CurrentContactId.&lt;/li&gt;
&lt;li&gt;It calls Save(), which actually commits this cart to the database under the new customer GUID.&lt;/li&gt;
&lt;/ol&gt;
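&lt;p&gt;A safer variant, assuming the calculation really is throwaway: load the cart for the actual contact and skip the save entirely. Sketch only; adapt to your own promotion logic:&lt;/p&gt;

```csharp
// Sketch: run promotion calculations without persisting junk carts.
public ICart GetCartForCalculations()
{
    // Use the real contact id so LoadOrCreateCart reuses the visitor's cart
    // instead of minting a new GUID (and a new database row) per request.
    var contactId = CustomerContext.Current.CurrentContactId;
    return _orderRepository.LoadOrCreateCart<ICart>(contactId, "Default");
}

public void DoPromotionCalculations()
{
    var cart = GetCartForCalculations();

    // Add line items, a basic shipping address, run the promotion engine, etc.
    // Crucially: do NOT call _orderRepository.Save(cart) here.
    // The cart stays in memory and nothing is written to SerializableCart.
}
```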
&lt;p&gt;&lt;strong&gt;Death by a thousand paper cuts&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;Here&#39;s a scenario to consider where this issue can go unnoticed for days/months until your database is full:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;The above piece of code was accidentally introduced on a frequently visited page and it&#39;s creating thousands of carts every day.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;Luckily, Optimizely provides a scheduled job called &quot;Remove expired carts&quot; that deletes carts that haven&#39;t been modified in the last 30 days; by default it runs once every day.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;The issue is the scheduled job won&#39;t get to these fake/junk carts until 30 days later and in the meanwhile created 1 million carts/rows in the database.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;We have the scheduled job, so no issue, right? (Hopefully!) However, when the job tries to fetch all serializable carts older than 30 days to delete, it may start failing due to timeouts. Then your website is doomed... very slowly :) because if the failing scheduled job goes unnoticed for a while, you are accumulating carts with absolutely nothing to purge them.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;Soon enough the SerializableCarts table will grow exponentially and eat up all the allocated database space (currently set to 250 GB in DXP).&lt;/li&gt;
&lt;li&gt;When this happens in production, no new carts will be created. God forbid if it happens during peak traffic hours or even worse on Cyber Monday this will be SEV 1 right away!!!&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;&lt;strong&gt;Short Term Fix&lt;/strong&gt;:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Call/Email Optimizely customer support and ask them to increase the database size from 250GB to 500 GB. This will resolve the issue right away and your website will be operational and buy you some time to get to the bottom of this issue.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;&lt;strong&gt;Long term Solution&lt;/strong&gt;:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Check application insights to see how long this has been going on and also get a sense of daily increase in the database size&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Ask support to run the following query to get the size of the tables in the ecommerce database. If this is a serializable cart issue, the table will show up at the top of the list.&amp;nbsp;&lt;br /&gt;
&lt;pre class=&quot;language-markup&quot;&gt;&lt;code&gt;SELECT 
    t.NAME AS TableName,
    s.Name AS SchemaName,
    p.rows,
    SUM(a.total_pages) * 8 AS TotalSpaceKB, 
    CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS TotalSpaceMB,
    SUM(a.used_pages) * 8 AS UsedSpaceKB, 
    CAST(ROUND(((SUM(a.used_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS UsedSpaceMB, 
    (SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB,
    CAST(ROUND(((SUM(a.total_pages) - SUM(a.used_pages)) * 8) / 1024.00, 2) AS NUMERIC(36, 2)) AS UnusedSpaceMB
FROM 
    sys.tables t
INNER JOIN      
    sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN 
    sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN 
    sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN 
    sys.schemas s ON t.schema_id = s.schema_id
WHERE 
    t.NAME NOT LIKE &#39;dt%&#39; 
    AND t.is_ms_shipped = 0
    AND i.OBJECT_ID &amp;gt; 255 
GROUP BY 
    t.Name, s.Name, p.Rows
ORDER BY 
    TotalSpaceMB DESC, t.Name&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;Ask Optimizely support to run the following script to clean up the ecommerce database SerializableCarts table (PLEASE TEST THIS ON LOCAL AND OTHER LOWER ENVIRONMENTS FIRST)&lt;br /&gt;
&lt;pre class=&quot;language-markup&quot;&gt;&lt;code&gt;-- The -300 is the lookback window (depends on how long the issue has existed).
-- Support will need to run the script in increments: -300, -200, -100, down to -30.
DELETE from SerializableCart where Modified &amp;lt; DATEADD(day, -300, GETDATE()) and Name != &#39;WishList&#39;&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;Increase the frequency of the &quot;Remove Expired Carts&quot; scheduled job to run every hour or multiple times a day (this won&#39;t help unless you clean up the database first).&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Investigate the root cause with special attention to SaveCart(), LoadOrCreateCart() and LoadCart().&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Set up alerts with Optimizely to be notified when the database size grows more than 20% above the baseline.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Periodically verify that the scheduled job is running successfully to prevent the same problem from happening again.&amp;nbsp;&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Happy coding!&lt;/p&gt;
</id><updated>2023-03-01T04:45:05.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Azure Service Bus Messaging (Topic/Queues) for transferring data between external PIM/ERP to Optimizely</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/2/using-azure-topic-queues-for-transferring-data-between-external-pim-to-optimizely/" /><id>&lt;p&gt;Optimizely provides a PIM solution as a part of the DXP package. More information here: &lt;a href=&quot;https://www.optimizely.com/product-information-management/&quot;&gt;https://www.optimizely.com/product-information-management/&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;More often than not, clients have existing PIM and/or ERP systems that feed other systems in their organization. For example, their PIM/ERP system may be serving physical stores, running reports, and feeding invoicing details, acting as the SOURCE of TRUTH. There are numerous blog posts on importing a catalog one time into Optimizely using the out-of-the-box Optimizely APIs.&lt;/p&gt;
&lt;p&gt;Needless to say, as updates are made to pricing, inventory, assets, delivery charges, taxes, etc. in the ERP/PIM/DAM, we need to keep that data synchronized in the Optimizely catalog to ensure customers see the most up-to-date information on the website as quickly as possible.&lt;/p&gt;
&lt;p&gt;This requires a strategy for moving content between the two systems on a regular, fault-tolerant basis. A quick solution is an Optimizely scheduled job that fetches the data and updates the database, though scheduled jobs have some limitations: timeouts, low fault tolerance, logging, speed, resource constraints, alerting, etc.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Another alternative is to use &lt;strong&gt;&lt;a href=&quot;https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions&quot;&gt;Azure Service Bus Messaging&lt;/a&gt;&lt;/strong&gt; to queue up the product updates from the source system (the client&#39;s PIM/ERP) and synchronize them to the Optimizely catalog on a configurable schedule. Azure Service Bus has a lot of advantages, as described below.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Advantages&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Message Sessions&lt;/li&gt;
&lt;li&gt;Auto-forwarding&lt;/li&gt;
&lt;li&gt;Dead-lettering&lt;/li&gt;
&lt;li&gt;Scheduled Delivery&lt;/li&gt;
&lt;li&gt;Message deferral&lt;/li&gt;
&lt;li&gt;Transactions&lt;/li&gt;
&lt;li&gt;Auto-delete on idle&lt;/li&gt;
&lt;li&gt;Duplicate detection&lt;/li&gt;
&lt;li&gt;Geo Disaster recovery&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can use the Azure Service Bus .NET SDK for integration: &lt;a href=&quot;https://learn.microsoft.com/en-us/dotnet/api/overview/azure/service-bus?preserve-view=true&amp;amp;view=azure-dotnet&quot;&gt;https://learn.microsoft.com/en-us/dotnet/api/overview/azure/service-bus?preserve-view=true&amp;amp;view=azure-dotnet&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Strategy&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;We have used the following strategy for a huge B2C retail client, and it works really well.&amp;nbsp;&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Our custom C# function/console app (extract job), deployed on Azure, gets all products that have been updated in the last &#39;x&#39; mins/hours by pinging a custom endpoint provided by the client.&lt;/li&gt;
&lt;li&gt;This function app runs on a &#39;TimerTrigger&#39;, configurable in the Azure function app configuration. More info on function apps: &lt;a href=&quot;https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio?tabs=in-process&quot;&gt;https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio?tabs=in-process&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;This function app is responsible for getting the data from the endpoint, serializing each message as JSON, and sending it to the ASB topic (product extract topic).&lt;/li&gt;
&lt;li&gt;A second custom C# function app (transload job) subscribes to the above topic in ASB using a &#39;ServiceBusTrigger&#39; (it executes every time there&#39;s a new message).&lt;/li&gt;
&lt;li&gt;This function app reads the message from the topic, deserializes it, and updates the product item using the Optimizely Service API.&lt;/li&gt;
&lt;/ol&gt;
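&lt;p&gt;Stripped of the client-specific plumbing shown in the full samples below, the extract/transload pair boils down to a send and a receive. Here is a minimal sketch using the newer Azure.Messaging.ServiceBus SDK; the topic/subscription names and the ProductUpdate type are placeholders, not part of the real implementation:&lt;/p&gt;

```csharp
// Minimal send/receive sketch using the Azure.Messaging.ServiceBus SDK.
// "product-extract", "optimizely-import", and ProductUpdate are placeholders.
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public record ProductUpdate(string Sku, decimal Price);

public static class SyncSketch
{
    public static async Task RunAsync(string connectionString)
    {
        await using var client = new ServiceBusClient(connectionString);

        // Extract job: serialize the product update and publish it to the topic.
        ServiceBusSender sender = client.CreateSender("product-extract");
        var update = new ProductUpdate("SKU-123", 9.99m);
        await sender.SendMessageAsync(new ServiceBusMessage(JsonSerializer.Serialize(update)));

        // Transload job: process messages from the subscription and update Optimizely.
        ServiceBusProcessor processor = client.CreateProcessor("product-extract", "optimizely-import");
        processor.ProcessMessageAsync += async args =>
        {
            var received = JsonSerializer.Deserialize<ProductUpdate>(args.Message.Body.ToString());
            // ...push the update into the catalog via the Optimizely Service API...
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args => Task.CompletedTask; // log the error in real code

        await processor.StartProcessingAsync();
    }
}
```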
&lt;p&gt;&lt;strong&gt;Diagram&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/862c73d87f444d0fbb3d0e3304447413.aspx&quot; /&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Sample Code (Export Job)&lt;/strong&gt;:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientNamespace.Export.Features.CartCheckout.TaxSync
{
    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Azure.ServiceBus; // for Message, TopicClient, RetryPolicy
    using Microsoft.Azure.WebJobs;
    using Newtonsoft.Json; // for JsonConvert
    using ClientNamespace.Export.Core.Features.CartCheckout.TaxRateSync.Models;
    using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants;
    using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Services;
    using ClientNamespace.Export.Core.Features.Infrastructure.Logging;
    using ClientNamespace.Export.Features.Infrastructure.Azure.Constants;
    using ClientNamespace.Export.Features.Infrastructure.Azure.Extensions;
    using ClientNamespace.Export.Features.Infrastructure.Azure.Services;
    using ClientNamespace.Export.Features.Infrastructure.Rfapi.Clients;
    using Serilog;
    using Serilog.Core;
    using ConnectionStringNames = ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants.ConnectionStringNames;
    using ExecutionContext = Microsoft.Azure.WebJobs.ExecutionContext;

    public class TaxRatesExportFunction
    {
        private const int ShortCircuit = 100_000;

        private IHttpClientFactory _clientFactory;

        public TaxRatesExportFunction(IHttpClientFactory clientFactory)
        {
            _clientFactory = clientFactory;
        }

       #if !DEBUG // remove this line to run locally as a console app
        [FunctionName(&quot;TaxRatesExport&quot;)]
       #endif
        public async Task Run(
            [TimerTrigger(
                ScheduleExpressions.TaxRatesExport,
                RunOnStartup = false)]
            TimerInfo myTimer)
        {
            var log = LoglevelWrapper.WrapLogger(Log.Logger);

            try
            {
                log.Information(&quot;Starting TaxRatesExportFunction: {starttime}&quot;, DateTime.UtcNow);

                using (var topicMessageSender = new TopicMessageSender(ConnectionStringNames.ServiceBusTaxRates, TopicNames.TaxRates, log))
                {
                    var taxRates = await apiClient.TaxesAllAsync(); // &#39;apiClient&#39; is a client-specific API wrapper (construction omitted for brevity)
                    
		    var export = new TaxRateExport
                    {
                        TaxRates = taxRates
                            .Select(x =&amp;gt; new TaxRate
                            {
                               Percentage = x.TaxRate ?? 0.000,
                               PostalCode = x.PostalCode,
                               TaxCode = x.TaxCode,
                               TaxableDelivery = x.TaxableDelivery,
                               TaxablePlatinum = x.TaxablePlatinum,
                             })
                             .ToList(),
                        };
		
                   // Send the message to the topic to be consumed by the the transload function app
		    
                        try
                        {
                            var message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(export)))
                            {
                                MessageId = Guid.NewGuid().ToString(),
                                SessionId = &quot;sessionid&quot;,
                            };
                            string connectionString = Environment.GetEnvironmentVariable(&quot;connectionStringName&quot;);
                            if (string.IsNullOrEmpty(connectionString))
                            {
                                connectionString = Environment.GetEnvironmentVariable(&quot;CUSTOMCONNSTR_connectionStringName&quot;);
                            }

                            var topicClient = new TopicClient(connectionString, &quot;topicName&quot;, RetryPolicy.Default);
                            await topicClient.SendAsync(message);
                        }
                        catch (Exception ex)
                        {
                            // logging
                        }
	
                }
            }
            catch (Exception ex)
            {
                log.Error(ex, &quot;Unhandled exception in TaxRatesExportFunction {exception}&quot;, ex);
            }
            finally
            {
                log.Information(&quot;TaxRatesExportFunction Complete: {endtime}&quot;, DateTime.UtcNow);
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;strong&gt;Sample Code (Import Job)&lt;/strong&gt;:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientNamespace.Website.Import.Features.CartCheckout.TaxSync
{
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Infrastructure.Azure.Constants;
    using Microsoft.Azure.WebJobs;
    using Newtonsoft.Json;
    using ClientNamespace.Export.Core.Features.CartCheckout.TaxRateSync.Models;
    using ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants;
    using ClientNamespace.Export.Core.Features.Infrastructure.Logging;
    using ClientNamespace.Website.Core.Features.Infrastructure.Episerver.Clients;
    using Serilog;
    using Serilog.Context;
    using ConnectionStringNames = ClientNamespace.Export.Core.Features.Infrastructure.Azure.Constants.ConnectionStringNames;

    public class TaxRatesImportFunction
    {
        private readonly IHttpClientFactory _clientFactory;

        public TaxRatesImportFunction(IHttpClientFactory clientFactory)
        {
            _clientFactory = clientFactory;
        }

        #if !DEBUG // Remove this to run locally (will be triggered when it sees a message on the topic it&#39;s subscribed to)
        [FunctionName(FunctionNames.TaxRatesImport)]
        #endif
        public async Task Run(
            [ServiceBusTrigger(
                TopicNames.TaxRates,
                SubscriptionNames.TaxRates,
                Connection = ConnectionStringNames.ServiceBusTaxRates,
                IsSessionsEnabled = true)]
            string mySbMsg)
           {
              var log = LoglevelWrapper.WrapLogger(Log.Logger);

            try
            {
                log.Information(&quot;Starting TaxRatesImportFunction: {starttime}&quot;, DateTime.UtcNow);
                log.Debug(&quot;Tax Rates Import Message: {message}&quot;, mySbMsg);

                TaxRateExport export = null;

                try
                {
                    // Get taxes from topic queue
                    export = JsonConvert.DeserializeObject&amp;lt;TaxRateExport&amp;gt;(mySbMsg);
                }
                catch (Exception ex)
                {
                    log.Error(ex, &quot;Could not JSON deserialize tax rates message {message} with exception {exception}&quot;, mySbMsg, ex);
                }

                if (export?.TaxRates == null)
                {
                    log.Warning(&quot;Tax rates deserialized, but data was null&quot;);
                    return;
                }

                // Load taxes into Episerver
                var serviceApiClient = EpiserverApiClientFactory.Create(log, _clientFactory);
                foreach (var taxRate in export.TaxRates)
                {
                    try
                    {
                        using (LogContext.PushProperty(&quot;importtaxrate&quot;, taxRate.TaxCode))
                        {
			   // Update the taxes table (either custom endpoint or using Service API)
                            await serviceApiClient.SaveTaxRateAsync(taxRate);
                        }
                    }
                    catch (Exception ex)
                    {
                        // Don&#39;t fail the group
                        // Custom logic to handle the exception when updating the Optimizely database.
                    }
                }
            }
            catch (Exception ex)
            {
                log.Error(ex, &quot;Unhandled exception in TaxRatesImportFunction {exception}&quot;, ex);
            }
            finally
            {
                log.Information(&quot;TaxRatesImportFunction Complete: {endtime}&quot;, DateTime.UtcNow);
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;As you can see, with minimal code you can create a more fault-tolerant synchronization into the Optimizely database. You can also visualize this scaling to other areas of your website. For example, we have scaled this system to automate order processing: as orders come in, the serialized order object is placed on the Azure Service Bus for automated processing all the way through order completion. Yes, the client&#39;s IT team needs to write some code to automate it on their side, but it has saved them hundreds of thousands of dollars in the cost of manually keying each order.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Can you think of other ways to scale the Optimizely system to use Azure Service Bus Messaging?&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Happy coding!&lt;/p&gt;</id><updated>2023-02-27T03:49:41.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>ChatGPT/OpenAI integration for text generation using Prompt in Optimizely</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/2/gpt-open-ai-integration-text-generation-on-content-publish/" /><id>&lt;p&gt;Here&#39;s how you can use a simple publishing event to generate content using OpenAI.&lt;/p&gt;
&lt;p&gt;The code is pretty simple - I will avoid getting into too many details as Tomas has done a wonderful job of explaining it in his blog post here: &lt;br /&gt;&lt;a href=&quot;https://www.gulla.net/en/blog/integrating-generative-ai-in-optimizely-cms-a-quick-test-with-openai/&quot;&gt;https://www.gulla.net/en/blog/integrating-generative-ai-in-optimizely-cms-a-quick-test-with-openai/&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;...And from Allan here:&lt;br /&gt;&lt;a href=&quot;https://www.codeart.dk/blog/2022/11/ai-assisted-content-creation---in-optimizely-cms--commerce-ai-series---part-2/&quot;&gt;https://www.codeart.dk/blog/2022/11/ai-assisted-content-creation---in-optimizely-cms--commerce-ai-series---part-2/&lt;/a&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Here&#39;s sample code which has been requested by a few people.&amp;nbsp;&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientName.CMS.Features.Sample
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web;
    using EPiServer;
    using EPiServer.Core;
    using EPiServer.Framework;
    using EPiServer.Framework.Initialization;
    using EPiServer.ServiceLocation;
    using Newtonsoft.Json;
    using ClientName.CMS.Features.Basics.RichTextBlock.Models;
    using ClientName.Common.Features.Foundation.Threading.Utilities;

    [InitializableModule]
    [ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
    public class OpenAIBlockInitialization : IInitializableModule
    {
        private static readonly HttpClient _client = new HttpClient();
        private readonly string _apiKey = &quot;YOUR API KEY GOES HERE&quot;; // Generate one at https://platform.openai.com/account/api-keys and load it from configuration/secrets rather than hard-coding it

        public void Initialize(InitializationEngine context)
        {
            // Add initialization logic, this method is called once after CMS has been initialized
            var contentEvents = ServiceLocator.Current.GetInstance&amp;lt;IContentEvents&amp;gt;();
            contentEvents.PublishingContent += ContentEvents_PublishingContent;
        }

        public async Task&amp;lt;dynamic&amp;gt; SendRequestAsync(string model, string prompt, int maxTokens)
        {
            var requestUrl = &quot;https://api.openai.com/v1/engines/&quot; + model + &quot;/completions&quot;;
            var requestData = new
            {
                prompt = prompt,
                max_tokens = maxTokens
            };
            var jsonRequestData = JsonConvert.SerializeObject(requestData);
            var requestContent = new StringContent(jsonRequestData, System.Text.Encoding.UTF8, &quot;application/json&quot;);

            _client.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue(&quot;Bearer&quot;, _apiKey);

            var response = await _client.PostAsync(requestUrl, requestContent);
            response.EnsureSuccessStatusCode();

            var responseContent = await response.Content.ReadAsStringAsync();
            var responseData = JsonConvert.DeserializeObject&amp;lt;dynamic&amp;gt;(responseContent);

            return responseData;
        }

        private void ContentEvents_PublishingContent(object sender, EPiServer.ContentEventArgs e)
        {
            try
            {
                if (e.Content != null)
                {
                    if (e.Content is RichTextBlock richTextBlock)
                    {
                        string textToOpenAI = richTextBlock.OpenAIPrompt;

                        // Call OpenAI and get results
                        var response = AsyncHelper.RunSync(async () =&amp;gt; await SendRequestAsync(&quot;text-davinci-003&quot;, textToOpenAI, 3000));

                        richTextBlock.OpenAIGeneratedText = response?.choices[0]?.text;
                    }
                    }
                }
            }
            catch (Exception)
            {
                // Optional: log the exception here
            }
        }

        public void Uninitialize(InitializationEngine context)
        {
            // Add uninitialization logic
            var contentEvents = ServiceLocator.Current.GetInstance&amp;lt;IContentEvents&amp;gt;();
            contentEvents.PublishingContent -= ContentEvents_PublishingContent;
        }
    }
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;AsyncHelper class to run an async method synchronously (needed because the initialization module&#39;s event handlers are synchronous):&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientName.Common.Features.Foundation.Threading.Utilities
{
    using System;
    using System.Threading;
    using System.Threading.Tasks;

    public static class AsyncHelper
    {
        private static readonly TaskFactory TaskFactory =
            new TaskFactory(CancellationToken.None, TaskCreationOptions.None, TaskContinuationOptions.None, TaskScheduler.Default);

        /// &amp;lt;summary&amp;gt;
        /// Executes an async Task method which has a void return value synchronously
        /// USAGE: AsyncUtil.RunSync(() =&amp;gt; AsyncMethod());
        /// &amp;lt;/summary&amp;gt;
        /// &amp;lt;param name=&quot;task&quot;&amp;gt;Task method to execute&amp;lt;/param&amp;gt;
        public static void RunSync(Func&amp;lt;Task&amp;gt; task) =&amp;gt; TaskFactory.StartNew(task).Unwrap().GetAwaiter().GetResult();

        /// &amp;lt;summary&amp;gt;
        /// Executes an async Task&amp;lt;T&amp;gt; method which has a T return type synchronously
        /// USAGE: T result = AsyncUtil.RunSync(() =&amp;gt; AsyncMethod&amp;lt;T&amp;gt;());
        /// &amp;lt;/summary&amp;gt;
        /// &amp;lt;typeparam name=&quot;TResult&quot;&amp;gt;Return Type&amp;lt;/typeparam&amp;gt;
        /// &amp;lt;param name=&quot;task&quot;&amp;gt;Task&amp;lt;T&amp;gt; method to execute&amp;lt;/param&amp;gt;
        /// &amp;lt;returns&amp;gt;&amp;lt;/returns&amp;gt;
        public static TResult RunSync&amp;lt;TResult&amp;gt;(Func&amp;lt;Task&amp;lt;TResult&amp;gt;&amp;gt; task) =&amp;gt;
            TaskFactory.StartNew(task).Unwrap().GetAwaiter().GetResult();
    }
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Happy coding!&lt;/p&gt;</id><updated>2023-02-27T01:44:42.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>The beauty of Decorator pattern in Optimizely</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/2/the-beauty-of-decorator-pattern-in-optimizely/" /><id>&lt;p&gt;The decorator pattern is one of my favorite design patterns for backend code development.&lt;/p&gt;
&lt;p&gt;&lt;span&gt;From wikipedia:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;A decorator pattern is a design pattern that allows a behavior to be added to an individual object &lt;/span&gt;&lt;span&gt;dynamically, without affecting the behavior of other objects from the same classes.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img src=&quot;/link/61c8ea2655be4aa09b909b11ab10fcd4.aspx&quot; /&gt;&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span&gt;&lt;strong&gt;Advantages&lt;/strong&gt;:&amp;nbsp;&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Helps you extend the behavior of classes/services without modifying their existing implementation.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Helps enforce the single responsibility principle (one class, one responsibility) and the open/closed principle (classes can be extended but not modified).&amp;nbsp;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;More efficient than subclassing because an object&#39;s behavior can be augmented without defining an entirely new object.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Commonly used for caching, keeping the caching layer separate (cache keys can be made unique per functionality).&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Additional scenarios: logging, alerting, processing, etc.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Implementation&lt;/strong&gt;:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;A simple example: an alert needs to be sent every time an order is submitted, or when there&#39;s an unhandled exception in the order service after a user submits the order. It might be tempting to add an &#39;Alert Sender Email&#39; dependency directly to the main order service class. However, if we want to adhere to the single responsibility and open/closed principles, the order service should perform only one job (submitting the order).&amp;nbsp;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;In that case, one way to extend the behavior of the order service class is to create a new class that implements the same interface (IOrderSubmitService) and sends the email. This means you don&#39;t need to add a new interface (unlike subclassing), which keeps the interfaces slim and indirectly supports the interface segregation principle.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Sample code:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace RF.Website.CMS.Features.CartCheckout.OrderSubmit.Alerting
{
    using System;
    using System.Threading.Tasks;
    using RF.Website.CMS.Features.CartCheckout.OrderSubmit.Services;
    using RF.Website.CMS.Features.CartCheckout.OrderSubmit.ViewModels;
    using RF.Website.Common.Features.Foundation.Alerts.Services;

    public class AlertingOrderSubmitService : IOrderSubmitService
    {
        private readonly IOrderSubmitService _implementation;
        private readonly IAlertSender _alertSender;

        public AlertingOrderSubmitService(
            IOrderSubmitService orderSubmitService,
            IAlertSender alertSender)
        {
            _implementation = orderSubmitService ?? throw new ArgumentNullException(nameof(orderSubmitService));
            _alertSender = alertSender ?? throw new ArgumentNullException(nameof(alertSender));
        }

        public async Task&amp;lt;OrderSubmitViewModel&amp;gt; SubmitOrderAsync(string associateName, int cartVersion, string kountSessionId)
        {
            try
            {
                return await _implementation.SubmitOrderAsync(associateName, cartVersion, kountSessionId);
                // Potential to add code to send email after every successful submission. 
            }
            catch (Exception exception)
            {
                string subject = &quot;SubmitOrderAsync Error&quot;;
                string body = &quot;An error occurred while calling SubmitOrderAsync.&quot;;
                await _alertSender.SendAlertAsync(subject, body, exception);
                throw;
            }
        }
    }
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;The statement in the try block calls the wrapped implementation of the order submission.&lt;/p&gt;
&lt;p&gt;The IOrderSubmitService:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientName.CartCheckout.OrderSubmit.Services
{
    using System.Threading.Tasks;
    using ClientName.CartCheckout.OrderSubmit.ViewModels;

    public interface IOrderSubmitService
    {
        Task&amp;lt;OrderSubmitViewModel&amp;gt; SubmitOrderAsync(string associateName, int cartVersion, string kountSessionId);
    }
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Next you will need to ensure the above code wraps the main implementation by registering it as a decorator. Luckily, decorator support comes as part of StructureMap, so you can easily incorporate this into your code.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public void ConfigureContainer(ServiceConfigurationContext context)
{
    context.StructureMap().Configure(container =&amp;gt;
    {
        container.For&amp;lt;IOrderSubmitService&amp;gt;().Use&amp;lt;DefaultOrderSubmitService&amp;gt;();
        // Can be used for logging or extending other behaviors of the Submit Order service
        // container.For&amp;lt;IOrderSubmitService&amp;gt;().DecorateAllWith&amp;lt;LoggingOrderSubmitService&amp;gt;();
        container.For&amp;lt;IOrderSubmitService&amp;gt;().DecorateAllWith&amp;lt;AlertingOrderSubmitService&amp;gt;();
    });
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;That&#39;s it. Add a breakpoint in the AlertingOrderSubmitService to see it in action. Every call will hit the wrapper/decorator class first and then flow into your concrete implementation of the functionality.&lt;/p&gt;

&lt;p&gt;Happy coding!&lt;/p&gt;</id><updated>2023-02-26T23:21:43.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Sending Files to Amazon S3 storage</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2023/2/optimizely-sending-files-to-amazon-s3-storage/" /><id>&lt;p&gt;I am sure you have been asked by clients to create a scheduled job that generates a report and sends it to a third-party system for further processing.&lt;/p&gt;
&lt;p&gt;One of our clients asked us to generate a daily report of all the completed orders for further processing in Snowflake. We looked at multiple options, but one that stood out was creating a scheduled job and storing these report/CSV files in Amazon&#39;s S3 storage (considering it was heavily used by the client).&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The integration with Amazon&#39;s S3 was simpler than I thought.&lt;/p&gt;
&lt;h2&gt;Step 1: Keys/Credentials for S3 storage&lt;/h2&gt;
&lt;p&gt;Ask the client to generate the keys and the bucket, and to provide you with the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;User Name&lt;/li&gt;
&lt;li&gt;Access Key ID&lt;/li&gt;
&lt;li&gt;Secret Access Key&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Step 2: Install S3 Browser&lt;/h2&gt;
&lt;p&gt;This tool allows you to browse all the buckets within the client&#39;s S3 storage. You will use the above credentials to browse the reports generated and published to this bucket.&lt;/p&gt;
&lt;p&gt;URL: https://s3browser.com/download.aspx&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/344572b8fba648e082c39b624e20d11d.aspx&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;Step 3: Install the AWS S3 SDK&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;/link/b7d8a299774c4c73a3ad2d31ff05eea1.aspx&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;Step 4: Code&lt;/h2&gt;
&lt;p&gt;Add the required code to transfer CSV files to AWS S3 storage.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;namespace ClientName.Orders.DailyOrders.Services
{
    using System;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Transfer;

    public static class AmazonUploaderService
    {
        public static bool SendMyFileToS3(System.IO.Stream fileStream, string fileNameInS3)
        {
            try
            {
                // Credentials (the Access Key ID / Secret Access Key from Step 1) are resolved
                // by the SDK from app config or the SDK credential store
                var client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);

                // create a TransferUtility instance passing it the IAmazonS3 created in the first step
                TransferUtility utility = new TransferUtility(client);

                // making a TransferUtilityUploadRequest instance
                TransferUtilityUploadRequest request = new TransferUtilityUploadRequest();

                request.BucketName = System.Configuration.ConfigurationManager.AppSettings[&quot;AWSBucketName&quot;];
                request.Key = fileNameInS3;
                request.InputStream = fileStream;
                utility.Upload(request);
            }
            catch (Exception)
            {
                // Consider logging the exception before returning
                return false;
            }

            return true; // indicate that the file was sent
        }
    }
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Call the above function using the following code.&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;string _s3FileName = $&quot;{DateTime.UtcNow:yyyy-MM-dd-HH-mm-ss}-Snowflake-W2-orders.csv&quot;;
StringBuilder _report = new StringBuilder(&quot;Col1, Col2, Col3, Col4...&quot;); // List of all orders built with StringBuilder
bool uploaded = false;
if (!string.IsNullOrWhiteSpace(_report?.ToString()))
{
  byte[] byteArray = Encoding.ASCII.GetBytes(_report.ToString());
  using (MemoryStream memoryStream = new MemoryStream(byteArray))
  {
    uploaded = AmazonUploaderService.SendMyFileToS3(memoryStream, _s3FileName);
  }
}&lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;Step 5: Test using S3 Browser&lt;/h2&gt;
&lt;p&gt;Once the code runs successfully, you should be able to view the generated reports using the S3 Browser you installed in Step 2.&lt;/p&gt;
&lt;p&gt;That&#39;s it! Happy coding :)&lt;/p&gt;</id><updated>2023-02-25T00:40:35.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Alexa skill integration with Episerver - Part 1</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2019/3/alexa-skill-for-episerver/" /><id>&lt;p&gt;We all know Episerver is a very powerful Enterprise CMS. Content authors and marketers have complete control over content, personalization and analytics, as well as access to user data at their fingertips.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;With that said, the technological landscape is changing, and so is user interaction with new, innovative devices. In my opinion, websites (including responsive mobile websites) will always be the most popular way of presenting information, but as developers &amp;amp; marketers we should be prepared for the giant wave of new trends heading our way. Example: &quot;&lt;span&gt;&lt;em&gt;50% of all searches will be voice searches by 2020&lt;/em&gt;&quot;&lt;/span&gt;. Episerver is prepared with its Headless API to allow developers and marketers to ride the wave with ease. The Episerver headless API allows serving the same content to various devices, including voice devices, mobile apps and others.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;I recently implemented an &lt;span style=&quot;text-decoration:&amp;#32;underline;&quot;&gt;&lt;strong&gt;Alexa skill&lt;/strong&gt;&lt;/span&gt; that queries Episerver CMS and returns meaningful data to an Alexa device. Though this is a very basic version/proof-of-concept for an Alexa skill, it opens up a whole lot of possibilities depending on your users&#39; interaction with the website. The Alexa skill I implemented reads out the two latest &lt;span style=&quot;text-decoration:&amp;#32;underline;&quot;&gt;news&lt;/span&gt; or &lt;span style=&quot;text-decoration:&amp;#32;underline;&quot;&gt;events&lt;/span&gt; from a website. This can very well be extended to &quot;&lt;em&gt;Give me the store locations for XYZ near Boston&lt;/em&gt;&quot;, &quot;&lt;em&gt;Are there any new promotions for ABC?&lt;/em&gt;&quot;, &quot;&lt;em&gt;Company&#39;s profit summary for this quarter&lt;/em&gt;&quot;. You get the point :)&amp;nbsp;&lt;/p&gt;
&lt;p&gt;As an end user, if you can get quick information from your favorite brand without having to spend time searching for it on a website, it&#39;s a huge value add in terms of convenience and saved time. No more booting up a device, navigating to a website, typing in a search box, enduring frustrating UI and performance issues. Simply ask &quot;Alexa&quot; what you need and listen to it while getting ready for work or during commercials on TV. If you are a technology (or tech-savvy) organization, it&#39;s even more important to showcase innovation and let your users know that you always &quot;keep up&quot; with new trends in technology.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Alexa Skill&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;The core Alexa concept is&amp;nbsp;pretty simple.&amp;nbsp;It consists of:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Alexa skill kit&amp;nbsp;&lt;/strong&gt;(Front-end code) -&amp;nbsp;&lt;a href=&quot;https://developer.amazon.com/alexa/console/ask&quot;&gt;https://developer.amazon.com/alexa/console/ask&lt;/a&gt;&amp;nbsp;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Lambda function&lt;/strong&gt; (Back-end code) -&amp;nbsp;&lt;a href=&quot;https://aws.amazon.com/lambda/&quot;&gt;https://aws.amazon.com/lambda/&lt;/a&gt;&amp;nbsp;&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;You will need to setup 2 accounts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Amazon&lt;/strong&gt; &lt;strong&gt;Developer&lt;/strong&gt;&amp;nbsp;account (to configure front end interactions using Amazon provided UI)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AWS&lt;/strong&gt; account (to host the back end code called lambda function)&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;&lt;strong&gt;NOTE&lt;/strong&gt;: The Amazon Developer Account has a &lt;span style=&quot;text-decoration:&amp;#32;underline;&quot;&gt;&lt;strong&gt;BETA&amp;nbsp;&lt;/strong&gt;&lt;strong&gt;feature&lt;/strong&gt;&lt;/span&gt; that allows you to host the skill and the code in the same interface, which is super convenient and easy to understand. This is highly recommended if you want to get your Alexa skill up and running in minutes. When you create a new skill, select the option &quot;Alexa Hosted (Beta)&quot; and you should be able to host and update the code in the same Amazon developer account.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;span style=&quot;text-decoration:&amp;#32;underline;&quot;&gt;&lt;strong&gt;Definitions&lt;/strong&gt;&lt;/span&gt;:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Skill Name&lt;/strong&gt;: The name of the skill that will be used when you publish your skill to Amazon.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Invocation Name&lt;/strong&gt;: The term the user says to invoke/start interaction with your skill. For example, if the invocation name is &quot;Fun Demo&quot;, the user can say Alexa open &quot;Fun Demo&quot; or Alexa start &quot;Fun Demo&quot;.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Intent&lt;/strong&gt;:&amp;nbsp;&lt;span&gt;Intents allow you to specify what a user will say to invoke the skill. For example: Get me the latest news, or find me the closest stores in the Boston area. You can create a custom intent as well as update the out-of-the-box Amazon-provided intents (such as CancelIntent or HelpIntent).&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Slots&lt;/strong&gt;: Slots are simply parameters you can pass to an intent to allow dynamic terms. For example: Order me {number} {size} pizza. The terms number and size are two dynamic parameters passed to the intent.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Endpoint&lt;/strong&gt;: The endpoint connects your front-end code (invocation, intents) to the back-end code. If you are using the Alexa-hosted beta feature, no configuration is necessary. If the back-end code is self-hosted (or a REST endpoint), these values need to be configured. The endpoint can be a REST endpoint which returns valid data or a Lambda function that hosts your back-end code.&amp;nbsp;&lt;/p&gt;
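&lt;p&gt;To make the intent and slot concepts concrete, here is a rough (hypothetical) fragment of a skill&#39;s interaction model JSON - the intent name, slot names and the custom PizzaSize type are illustrative, not from an actual skill:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;{
  &quot;name&quot;: &quot;OrderPizzaIntent&quot;,
  &quot;slots&quot;: [
    { &quot;name&quot;: &quot;number&quot;, &quot;type&quot;: &quot;AMAZON.NUMBER&quot; },
    { &quot;name&quot;: &quot;size&quot;, &quot;type&quot;: &quot;PizzaSize&quot; }
  ],
  &quot;samples&quot;: [
    &quot;order me {number} {size} pizza&quot;
  ]
}&lt;/code&gt;&lt;/pre&gt;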
&lt;p&gt;To get started, as a first step I recommend setting up a simple Web API REST endpoint in Episerver that returns a JSON object. Ideally you would want to setup Episerver Headless API but for a quick demo a simple REST endpoint should be enough.&lt;/p&gt;
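&lt;p&gt;As a minimal sketch of such an endpoint (assuming Web API attribute routing is enabled; the NewsPage type, its Summary property and the route are illustrative assumptions, not from an actual implementation):&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public class LatestNewsController : ApiController
{
    private readonly IContentLoader _contentLoader;

    public LatestNewsController(IContentLoader contentLoader)
    {
        _contentLoader = contentLoader;
    }

    [HttpGet]
    [Route(&quot;api/latest-news&quot;)]
    public IHttpActionResult Get()
    {
        // Read the two most recent news pages and project them into a simple JSON shape
        var news = _contentLoader
            .GetChildren&amp;lt;NewsPage&amp;gt;(ContentReference.StartPage)
            .OrderByDescending(p =&amp;gt; p.StartPublish)
            .Take(2)
            .Select(p =&amp;gt; new { title = p.Name, summary = p.Summary });

        return Ok(news);
    }
}&lt;/code&gt;&lt;/pre&gt;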
&lt;p&gt;I will get into the details of Alexa skill implementation and integrating it with Episerver in my next blog post.&lt;/p&gt;
&lt;p&gt;&lt;br /&gt;Stay tuned!&lt;/p&gt;</id><updated>2019-03-05T20:38:25.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Max &amp; Min element validation for Content Area or Link Collection</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2018/10/max--min-validators-for-content-area-or-link-collection/" /><id>&lt;p&gt;Recently we had a business requirement for setting minimum &amp;amp; maximum limits for blocks in Content Areas. While we can educate the content authors to set the correct number of blocks in the content area, it&#39;s always recommended to add validation within the CMS to avoid human errors.&lt;/p&gt;
&lt;p&gt;Here&#39;s the code to set the maximum number of blocks in the Content Area and Link Item Collection.&amp;nbsp;&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;using System;
using System.ComponentModel.DataAnnotations;
using System.Web.Mvc;
using EPiServer.Core;
using EPiServer.SpecializedProperties;

    /// &amp;lt;summary&amp;gt;
    /// Sets the maximum element count in a link collection, a content area - or any other type of collection.
    /// &amp;lt;/summary&amp;gt;
    [AttributeUsage(AttributeTargets.Property, AllowMultiple = false)]
    public class MaxElementsAttribute : ValidationAttribute, IMetadataAware
    {
        public int MaxCount { get; set; }

        public void OnMetadataCreated(ModelMetadata metadata)
        {
            //TODO: Use to disable editor drag and drop at a certain point.
        }

        protected override ValidationResult IsValid(object value, ValidationContext validationContext)
        {
            if (value == null)
            {
                return null;
            }
            if (value is LinkItemCollection)
            {
                if ((value as LinkItemCollection).Count &amp;gt; MaxCount)
                {
                    return new ValidationResult(&quot;This field exceeds the maximum limit of &quot; + MaxCount + &quot; items&quot;);
                }
            }
            else if (value is ContentArea)
            {
                if ((value as ContentArea).Count &amp;gt; MaxCount)
                {
                    return new ValidationResult(&quot;This field exceeds the maximum limit of &quot; + MaxCount + &quot; items&quot;);
                }
            }

            return null;
        }

        public MaxElementsAttribute(int maxElementsInList)
        {
            this.MaxCount = maxElementsInList;
        }
    }&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;On the Content Area field (or Link Item Collection) set the MaxElements attribute as shown below.&amp;nbsp;&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;[Display(
    Name = &quot;Items&quot;,
    Description = &quot;Items&quot;,
    GroupName = SystemTabNames.Content,
    Order = 30)]
[MaxElements(25)]
public virtual ContentArea Items { get; set; }&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;You can use the same logic for setting the minimum number of elements as well.&amp;nbsp;&lt;/p&gt;
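&lt;p&gt;For completeness, here is a sketch of what the minimum-count counterpart could look like (an untested illustration following the same pattern as the MaxElements attribute above; note it treats a null/empty property as zero items):&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;[AttributeUsage(AttributeTargets.Property, AllowMultiple = false)]
public class MinElementsAttribute : ValidationAttribute
{
    public int MinCount { get; set; }

    public MinElementsAttribute(int minElementsInList)
    {
        this.MinCount = minElementsInList;
    }

    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        // A missing value counts as zero items
        int count = (value as LinkItemCollection)?.Count
            ?? (value as ContentArea)?.Count
            ?? 0;

        if (count &amp;lt; MinCount)
        {
            return new ValidationResult(&quot;This field requires a minimum of &quot; + MinCount + &quot; items&quot;);
        }

        return null;
    }
}&lt;/code&gt;&lt;/pre&gt;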
&lt;p&gt;Happy coding :)&lt;/p&gt;</id><updated>2018-10-07T04:35:21.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Boston&#39;s Episerver Developer/Editor meetup</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2018/5/boston-summer-2018-episerver-developereditor-meetup/" /><id>&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
&lt;/head&gt;
&lt;body&gt;
&lt;p&gt;Join us on &lt;strong&gt;Tuesday,&amp;nbsp;June&amp;nbsp;12th 2018&lt;/strong&gt;&amp;nbsp;from &lt;strong&gt;5:30 - 7:30 pm&lt;/strong&gt; for the first Episerver Developer/Editor Meetup in Boston hosted by Rightpoint.&lt;/p&gt;
&lt;p&gt;Speakers include &lt;a href=&quot;https://www.linkedin.com/in/robfolan/&quot;&gt;Rob Folan&lt;/a&gt;&amp;nbsp;from&amp;nbsp;&lt;a href=&quot;http://www.episerver.com/&quot;&gt;Episerver&lt;/a&gt;&amp;nbsp;&amp;amp; &lt;a href=&quot;https://www.linkedin.com/in/aniketdgadre/&quot;&gt;myself&lt;/a&gt;&amp;nbsp;from &lt;a href=&quot;https://www.rightpoint.com/&quot;&gt;Rightpoint&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Rob will talk about the&amp;nbsp;cool refreshing summer updates with the&amp;nbsp;Episerver&amp;nbsp;platform (Insight, Perform, Advance &amp;amp; Campaign)&amp;nbsp;and I will talk about the hot&amp;nbsp;Digital Experience Cloud.&lt;/p&gt;
&lt;p&gt;To continue with the theme, we will be serving HOT pizza and&amp;nbsp;COLD beverages.&lt;/p&gt;
&lt;p&gt;For more information and to RSVP, please&amp;nbsp;visit the&amp;nbsp;Eventbrite page &lt;a href=&quot;https://www.eventbrite.com/e/episerver-developer-editor-boston-summer-meetup-tickets-45780188777&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Hope to see you all there!&lt;/p&gt;
&lt;/body&gt;
&lt;/html&gt;</id><updated>2018-05-29T22:26:41.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Dynamic Drop down list editable by content authors using ISelectionFactory  &amp; Property List</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2018/4/dynamic-drop-down-list-editable-by-content-authors-using-iselectionfactory---property-list/" /><id>&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
&lt;/head&gt;
&lt;body&gt;
&lt;p&gt;On one of my recent projects, we wanted content authors to be able to&amp;nbsp;manage&amp;nbsp;the list of items in a dropdown through the CMS. Though it&#39;s simple to create a list of items that can be retrieved from a constants&amp;nbsp;class or an appsetting.config, that approach requires a developer to make the change. It also means a deployment to production for a simple name/value change in&amp;nbsp;a dropdown.&lt;/p&gt;
&lt;p&gt;To get around this standard implementation, we implemented a simple Property List on the Start Page and referenced the property in the ISelectionFactory implementation to get the key/value pairs.&lt;/p&gt;
&lt;p&gt;The dropdown list can now be managed by the content authors without relying on the developer to make a change.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Here&#39;s the code:&lt;/p&gt;
&lt;pre class=&quot;language-csharp&quot;&gt;&lt;code&gt;public class AccountPropertiesFactory : ISelectionFactory
    {
        public IEnumerable&amp;lt;ISelectItem&amp;gt; GetSelections(ExtendedMetadata metadata)
        {
            var contentLoader = ServiceLocator.Current.GetInstance&amp;lt;IContentLoader&amp;gt;();
            var startPage = contentLoader.Get&amp;lt;StartPage&amp;gt;(ContentReference.StartPage);

            var selectItems = new List&amp;lt;SelectItem&amp;gt;();

            foreach(var accountProperty in startPage.AccountTypePropertyList)
            {
                selectItems.Add( new SelectItem()
                    {
                        Text = accountProperty.Text,
                        Value = accountProperty.Text
                    }
                );
            }

            return selectItems.ToArray();
        }
    }&lt;/code&gt;&lt;/pre&gt;

&lt;/body&gt;
&lt;/html&gt;</id><updated>2018-04-27T16:00:51.0000000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Continuous Integration &amp; Deployment with VSTS &amp; DXC (Azure Integration environment)</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2017/4/continuous-builddeployment-with-vsts--dxc/" /><id>&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
&lt;/head&gt;
&lt;body&gt;
&lt;p&gt;I recently implemented continuous integration &amp;amp; deployment using Visual Studio Team Services in Epi&#39;s DXC&amp;nbsp;environment. Though I now feel it&#39;s pretty simple, it can be a little tricky if&amp;nbsp;you have no prior experience configuring it.&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;
&lt;h2&gt;What is CI/CD?&lt;/h2&gt;
&lt;p&gt;For folks who are new to build &amp;amp; deployment process, Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.&amp;nbsp;&lt;span&gt;Because you&amp;rsquo;re integrating so frequently, there is significantly less back-tracking to discover where things went wrong, so you can spend more time building features. Continuous Deployment (CD) is an extension of CI, where you can auto-deploy your integrated changes to the server (Integration environment in case of DXC). This significantly reduces the risks,&amp;nbsp;costs and effort to deploy&amp;nbsp;and allows you to&amp;nbsp;deliver features to the clients sooner.&lt;br /&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;Publish Profile:&lt;/h2&gt;
&lt;p&gt;One important thing you need to wrap your head around is the concept of a Publish Profile. If you are using the Epi DXC service, you have Azure portal access to the Integration environment (&lt;a href=&quot;https://portal.azure.com&quot;&gt;https://portal.azure.com&lt;/a&gt;). If you don&#39;t, you should ask Episerver managed services to give you the access you need. The publish profile allows you to publish your code to the Azure/Integration environment from Visual Studio (by setting up a publishing profile)&amp;nbsp;or VSTS (by adding it to the build definition). I prefer using VSTS since there are a lot more configuration options available to you.&lt;/p&gt;
&lt;p&gt;Here&#39;s how you download the publishing profile from Azure (Integration environment).&lt;/p&gt;
&lt;p&gt;App Services (left menu) &amp;gt; Select your App Service. You should see the &quot;Get publish profile&quot; button in your rightmost pane.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/8326df1d6cda47279d2412fc96f022e3.aspx&quot; alt=&quot;Image PublishProfile.GIF&quot; /&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;You will need this in order to integrate with VSTS for CI/CD.&amp;nbsp;&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;
&lt;h2&gt;Build Definition:&lt;/h2&gt;
&lt;p&gt;VSTS allows you to&amp;nbsp;set up multiple build definitions.&lt;/p&gt;
&lt;p&gt;You can have one build definition called &quot;Develop CI&quot; (that builds the develop branch)&amp;nbsp;&amp;amp; one for &quot;Integration CI/CD&quot; (that builds &amp;amp; also deploys the code from the master branch). By separating out these build definitions, you can make sure that only master branch code (that is final) gets auto-deployed to the Integration environment.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/427d0c62540b46cda6e6c84f53d0a7cd.aspx&quot; width=&quot;664&quot; alt=&quot;Image Build-Definition-3.GIF&quot; height=&quot;332&quot; /&gt;&lt;/p&gt;
&lt;p&gt;For the Integration deployment, under MSBuild Arguments you can provide details using the publishing profile you downloaded from the Azure Integration environment. This will publish your changes&amp;nbsp;after successfully building the &quot;master&quot; branch. Note that &quot;Get Sources&quot; points to the master branch rather than the develop branch, as shown above.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/link/52062711434a4493950a4312a1dfae35.aspx&quot; width=&quot;1362&quot; alt=&quot;Image Integration-Build-Definition.GIF&quot; height=&quot;472&quot; /&gt;&lt;/p&gt;
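&lt;p&gt;As an illustration, a typical set of MSBuild Arguments for an MSDeploy publish looks something like the following. The site name is a placeholder taken from your own publish profile, and the password should live in a secret build variable (here assumed to be named PublishPassword) rather than in the definition itself:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;/p:DeployOnBuild=true
/p:WebPublishMethod=MSDeploy
/p:MsDeployServiceUrl=mysite.scm.azurewebsites.net:443
/p:DeployIisAppPath=mysite
/p:UserName=$mysite
/p:Password=$(PublishPassword)
/p:AllowUntrustedCertificate=true
&lt;/code&gt;&lt;/pre&gt;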
&lt;p&gt;You can also set up triggers to run a build on every check-in (or on a schedule) for the master branch, so when someone (hopefully a team lead/architect) pushes to the master branch, VSTS builds the solution (including NuGet and Gulp processes) and publishes your changes to the Integration environment without any manual intervention.&lt;/p&gt;
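&lt;p&gt;The trigger itself is configured in the visual designer here, but for what it&#39;s worth, the same continuous integration trigger on the master branch can be expressed in a YAML pipeline in newer versions of VSTS/Azure DevOps, roughly like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;trigger:
  branches:
    include:
      - master
&lt;/code&gt;&lt;/pre&gt;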
&lt;p&gt;This is just a basic process to set up CI/CD in the DXC environment. You can get more creative with a lot of built-in features from VSTS.&lt;/p&gt;
&lt;p&gt;Hope this helps!&amp;nbsp;Comments are welcome...&lt;/p&gt;
&lt;p&gt;AG&amp;nbsp;&lt;/p&gt;
&lt;/body&gt;
&lt;/html&gt;</id><updated>2017-04-04T21:03:31.7770000Z</updated><summary type="html">Blog post</summary></entry> <entry><title>Episerver Certification - Tips &amp; Tricks!!!</title><link href="https://world.optimizely.com/blogs/aniket-gadre/dates/2017/3/episerver-certification-9-0/" /><id>&lt;!DOCTYPE html&gt;
 
&lt;html&gt;
&lt;head&gt;
&lt;meta http-equiv=&quot;Content-Type&quot; content=&quot;text/html; charset=UTF-8&quot; /&gt;
&lt;meta name=&quot;author&quot; content=&quot;Aniket Gadre&quot; /&gt;
&lt;meta name=&quot;description&quot; content=&quot;Episerver certification, exam tips, tips and tricks&quot; /&gt;
&lt;meta name=&quot;keywords&quot; content=&quot;Episerver Certification, exam tips, episerver cms, CMS, Episerver exam&quot; /&gt;
&lt;meta property=&quot;og:title&quot; content=&quot;Episerver Certification - Tips &amp; Tricks!&quot; /&gt;
&lt;meta property=&quot;og:image&quot; content=&quot;http://cdn.certmag.com/wp-content/uploads/2015/07/Passed-the-exam-300x202.jpg&quot; /&gt;
&lt;title&gt;Episerver Certification - Tips &amp; Tricks!&lt;/title&gt;
&lt;/head&gt;
&lt;body&gt;
&lt;p&gt;Yesterday, I passed the Episerver 9 Certification, so I&#39;m sharing my thoughts while it&#39;s fresh in my mind.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;http://cdn.certmag.com/wp-content/uploads/2015/07/Passed-the-exam-300x202.jpg&quot; alt=&quot;Image result for exam you passed&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;Preparation&lt;/span&gt;:&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;If you have attended &quot;&lt;strong&gt;Bootcamp&lt;/strong&gt;&quot;, make sure you know the materials covered in the class inside out. This is your bible for the exam. I would say at least 30-40% of the exam questions will be based on the material covered during the course. About 30-40% will be based on your understanding of these concepts. The rest of the questions can be based on the admin and editor guides, the SDK, or the sample Alloy website.&lt;/li&gt;
&lt;li&gt;Alloy, Alloy, Alloy. Did I mention the &lt;strong&gt;Alloy&lt;/strong&gt; website? Look at every single method and class (if it doesn&#39;t make sense, google is your friend).&lt;/li&gt;
&lt;li&gt;Have at least one or two projects under your belt. To give you an analogy - it&#39;s the difference between reading a book on swimming and actually swimming in the water.&lt;/li&gt;
&lt;li&gt;Understand all areas of CMS like editing interface, gadgets, mirroring, personalization, multi-site setup, languages, friendly urls, access rights, scheduled jobs etc. by clicking on each link and/or reading the admin/editor guides.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;Know all areas of development like CRUD operations on blocks/pages, templates, caching, modules, add-ons, filtering, all base classes &amp;amp; APIs, localization, attributes, custom editors, dynamic data etc.&lt;/li&gt;
&lt;li&gt;Focus on the concepts. Ask yourself two questions -&amp;nbsp;&quot;&lt;strong&gt;How&lt;/strong&gt; does it work&quot; &amp;amp; &quot;&lt;strong&gt;Why&lt;/strong&gt; does it work that way&quot;.&amp;nbsp;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;About the Exam&lt;/span&gt;:&lt;/h2&gt;
&lt;p&gt;Knowledge areas you will be tested on:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Production knowledge&lt;/li&gt;
&lt;li&gt;Installation&lt;/li&gt;
&lt;li&gt;Content Model&lt;/li&gt;
&lt;li&gt;Creating Websites&lt;/li&gt;
&lt;li&gt;Advanced Concepts&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It&#39;s a multiple choice exam with &lt;strong&gt;64 questions&lt;/strong&gt; and 120 mins, so roughly &lt;strong&gt;2 mins per question&lt;/strong&gt;. I don&#39;t think time is an issue because I completed mine 45 mins before time. One tip I would like to give: if you aren&#39;t confident about something, skip the question and come back to it later, because you can&#39;t go back once you have answered a question.&lt;/p&gt;
&lt;p&gt;There are 3 types of multiple choice questions:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Select &lt;strong&gt;ONE&lt;/strong&gt; of the following 4 options.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;TWO&lt;/strong&gt;&amp;nbsp;of the following 4 options.&lt;/li&gt;
&lt;li&gt;Select which of the 4 options is &lt;strong&gt;NOT&lt;/strong&gt; true.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;The last two are a little tricky: if you get one of the two selections wrong, the whole answer is marked as incorrect. Be careful, because the options can be misleading :)&lt;/p&gt;
&lt;h2&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;Before Exam (Remote)&lt;/span&gt;:&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;You will receive an email&amp;nbsp;from Episerver about the instructions 24-48 hours before the exam. Make sure you read these instructions carefully. This is a closed book (application) exam, so make sure you have nothing around you (including cell phones) and close all your applications. No multiple screens allowed during the exam.&lt;/li&gt;
&lt;li&gt;Book a quiet conference room with a good internet connection (preferably wired).&lt;/li&gt;
&lt;li&gt;For remote exam, make sure your computer/laptop is compatible&amp;nbsp;based on the instructions provided. This includes&amp;nbsp;web camera, microphone, operating system etc.&lt;/li&gt;
&lt;li&gt;I would recommend you take the practice test (included in the instructions), so you are familiar with the remote proctoring software and not figuring it out during the exam.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;During Exam&lt;/span&gt;:&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Keep your cool and don&#39;t rush. Some of the questions are difficult so you may have to wing it at the end.&amp;nbsp;But&amp;nbsp;you will get enough easy/medium level questions to pass the exam.&amp;nbsp;&lt;/li&gt;
&lt;li&gt;Use some educated guesses. As long as you know the concepts on &quot;How&quot; and &quot;Why&quot; it will be easy for you&amp;nbsp;to pick the right answer based on what you know.&lt;/li&gt;
&lt;li&gt;Don&#39;t leave the room or talk to anyone and avoid any distractions.&amp;nbsp;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style=&quot;text-decoration: underline;&quot;&gt;After Exam&lt;/span&gt;:&lt;/h2&gt;
&lt;p&gt;You will receive a PASS or FAIL right after you complete the exam. If you passed, you will receive an official email from Episerver within 5 working days, once they have reviewed your recording. If you fail, you can re-take the exam after 21 days.&lt;/p&gt;
&lt;p&gt;Hope this helps, if you have any questions&amp;nbsp;reach out to me on &lt;a href=&quot;mailto:agadre@rightpoint.com&quot;&gt;agadre@rightpoint.com&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Good luck!&lt;/p&gt;
&lt;p&gt;AG&lt;/p&gt;
&lt;/body&gt;
&lt;/html&gt;</id><updated>2017-03-24T15:48:49.6330000Z</updated><summary type="html">Blog post</summary></entry></feed>