Patrick Lam
May 5, 2025

Improving Query Performance in Optimizely Graph

As part of your onboarding with Optimizely Graph, we recommend adopting the following best practices to improve performance and reduce query latency. Applied consistently, they make your Optimizely Graph implementation more efficient, which translates into a better user experience and more reliable data retrieval.

Use Cached Templates

Cached templates allow you to store and reuse translated queries with variable placeholders. This minimizes processing overhead and helps deliver faster response times. Instead of translating the same query every time it's executed, the translated query is cached and reused. This is one of the key improvements you can make to your Graph implementation.

Example:

To enable cached templates, add a query string parameter to your request URL and a header to your request:

  1. Query string parameter - Add stored=true to your request URL.
    https://cg.optimizely.com/content/v2?auth=123456789&stored=true
  2. Request header - Include the following header:
    • Key - cg-stored-query
    • Value - template

Code example showing a curl request using cached templates (the endpoint and auth token match the placeholder values above):


curl -X POST "https://cg.optimizely.com/content/v2?auth=123456789&stored=true" \
-H "Content-Type: application/json" \
-H "cg-stored-query: template" \
-d '{
  "query": "query GetItem($id: ID!) { item(id: $id) { name, description } }",
  "variables": {
    "id": "12345"
  }
}'
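The same request can be assembled programmatically. Below is a minimal Python sketch (the helper name is illustrative, not part of any Optimizely SDK, and the endpoint and auth token are placeholders) that builds the URL, headers, and body for a cached-template request:

```python
import json
from urllib.parse import urlencode

def build_stored_query_request(base_url, auth_token, query, variables):
    """Assemble the URL, headers, and JSON body for a cached-template request.

    base_url and auth_token are placeholders; substitute your own
    endpoint and credentials.
    """
    # stored=true in the query string opts the request into cached templates
    url = f"{base_url}?{urlencode({'auth': auth_token, 'stored': 'true'})}"
    headers = {
        "Content-Type": "application/json",
        # This header tells Graph to cache the translated query as a template
        "cg-stored-query": "template",
    }
    # Variables stay outside the query text, so the cached translation
    # can be reused with different values
    body = json.dumps({"query": query, "variables": variables})
    return url, headers, body

url, headers, body = build_stored_query_request(
    "https://cg.optimizely.com/content/v2",
    "123456789",
    "query GetItem($id: ID!) { item(id: $id) { name, description } }",
    {"id": "12345"},
)
```

The pieces can be sent with any HTTP client; the key point is that the query text stays constant while only the variables change between calls, so the cached translation is reused.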

Benefits:

  • Reduced Latency: By reusing translated queries, you avoid the overhead of repeated translation.
  • Improved Throughput: Caching allows your application to handle more requests with the same resources.
  • Lower CPU Usage: Reduced processing leads to lower CPU utilization on your servers.

Use Item Queries for Single Entities

When retrieving a single item, use the item query instead of items. With the items query, Optimizely Graph may invalidate the entire cache for that content type, even if you're only retrieving one item. The item query ensures that only the cache for that specific item is invalidated when it's updated, which improves cache efficiency and yields noticeable performance gains.

Example:

Code example showing an item query that retrieves a single item based on its RelativePath:


query GetItem($relativePath: String) {  
  Content(where: { RelativePath: { eq: $relativePath } }) {  
    item { Name RelativePath }  
  }  
}
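To tie this to the cached-template advice above, keep the query text fixed and pass the path as a variable rather than interpolating it into the string. A minimal Python sketch (the function name and sample path are illustrative):

```python
import json

# The query text stays fixed; RelativePath is passed as a variable so the
# translated query can be cached and reused across different paths.
ITEM_QUERY = """\
query GetItem($relativePath: String) {
  Content(where: { RelativePath: { eq: $relativePath } }) {
    item { Name RelativePath }
  }
}"""

def item_query_payload(relative_path):
    """Build the JSON request body for a single-item lookup by RelativePath."""
    return json.dumps({
        "query": ITEM_QUERY,
        "variables": {"relativePath": relative_path},
    })

payload = item_query_payload("/en/articles/graph-performance/")
```

Because only the variables differ between requests, this payload also combines naturally with the cached-template header shown earlier.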

Benefits:

  • Better Cache Efficiency: Only the cache for the specific item is invalidated, preserving other cached data.
  • Reduced Cache Invalidation: Minimizes unnecessary cache invalidation, leading to more stable cache performance.
  • Faster Data Retrieval: More efficient cache usage results in faster data retrieval times.

Conclusion

Following these practices will help ensure a smoother experience and more consistent performance. These improvements are essential for maintaining a high-performing Optimizely Graph implementation. If you have questions or would like support reviewing your implementation, our team is happy to assist.
