Hi Ron
A scheduled job is a pull-based mechanism. If I understand your requirement correctly, you're after real-time listening and processing, and a scheduled job doesn't seem to fit your use case. Since you're using an Azure Service Bus queue, there are many other options out there; is there any specific reason why you intend to use a scheduled job?
Cheers
Hey Vincent
Thanks for replying.
Correct, I'm after real-time listening. Ideally, an Azure Function would fit this use case; however, as I understand it, Azure Functions are not currently supported in DXP.
A scheduled job seems like the only other logical choice; it also allows leveraging the dedicated web app to run these jobs separately from the primary web app.
I'd be keen to understand the other options for real-time listening that won't have any performance impact on the primary web app.
Thanks
Hi Ron
Isn't your customer using their own Azure Service Bus queue, rather than one from DXP? If that's correct, I believe your customer must already have their own Azure subscription. Here is the idea: write your own Azure Function, or use a Logic App, and deploy it under your client's Azure subscription; then create a custom API endpoint hosted in DXP and post the data to it in a secure way.
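Just as a sketch, the DXP side could be a plain API controller like the one below; the route, header name, and shared-secret check are illustrative assumptions only, and in practice you'd want a proper auth scheme:

```csharp
// Hypothetical receiving endpoint hosted in the Optimizely (DXP) site.
// Route, header name, and key handling are placeholders for illustration.
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/servicebus-inbox")]
public class ServiceBusInboxController : ControllerBase
{
    private const string ApiKeyHeader = "X-Api-Key";

    [HttpPost]
    public IActionResult Post([FromBody] object payload)
    {
        // Simple shared-secret check; replace with a real auth scheme in practice.
        if (Request.Headers[ApiKeyHeader] != "<shared-secret>")
            return Unauthorized();

        // Hand the payload off to your processing pipeline here.
        return Ok();
    }
}
```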
Cheers.
Hi Vincent
We are on the same page :). We did consider a message-forwarding approach from the client's Azure subscription; however, it did have its own challenges.
I found another approach (thanks to Nikhil) using WebJobs in DXP. I didn't realise WebJobs were supported in DXP; was this feature recently added?
Looks like we would be able to implement real-time listening and asynchronous processing of messages using a WebJob in DXP.
Thanks
Hi Ron
Glad you found an alternative. I'm not aware of WebJobs being supported in DXP; you might need to email support for further clarification.
The blog post describes quite an interesting idea; however, I would personally want a clearer understanding of how things work before pursuing this approach. For example: how easily can you access the job's logs if you need to troubleshoot? How would you deploy this to the higher environments (preproduction and production)? How is the WebJob affected during a deployment? How does it work across multiple instances?
Let me know once you find out these answers :)
Cheers.
Hi Vincent
Great questions.
DXP only recently started supporting WebJobs; I believe all native WebJob functionality is supported. The WebJob is deployed together with the web app, so there are no changes to the deployment flow.
We did raise a ticket with support; I'll forward it to you for your reference.
Thanks
Adding a link to the official documentation for WebJobs (see the URL below); it should be noted that only continuous WebJobs are supported in DXP.
> "How does it work in multiple instances?"
Usually the WebJobs runtime uses Azure Storage blob leases (locks) to coordinate runs across multiple instances.
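For illustration, here's a minimal sketch of such a continuous WebJob with the WebJobs SDK 3.x; the queue name is a placeholder, the connection strings are assumed to live in the standard AzureWebJobsStorage / AzureWebJobsServiceBus settings, and [Singleton] is what takes the blob lease mentioned above:

```csharp
// Sketch of a continuous WebJob host listening to a Service Bus queue.
// Queue name "customer-events" is a placeholder; connection strings are read
// from the AzureWebJobsStorage and AzureWebJobsServiceBus settings.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static async Task Main()
    {
        var host = new HostBuilder()
            .ConfigureWebJobs(b =>
            {
                b.AddAzureStorageCoreServices(); // enables blob-lease coordination
                b.AddServiceBus();
            })
            .Build();
        using (host)
        {
            await host.RunAsync();
        }
    }
}

public class Functions
{
    // [Singleton] takes an Azure Storage blob lease so that only one
    // instance processes at a time when the app is scaled out.
    [Singleton]
    public static void ProcessMessage(
        [ServiceBusTrigger("customer-events")] string message)
    {
        Console.WriteLine($"Received: {message}");
    }
}
```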
Be aware that WebJob support seems to have been removed in the recent .NET 5 update: https://world.optimizely.com/documentation/developer-guides/digital-experience-platform/webjobs-in-dxp-environments/
A few alternative options for you:
Option 1:
Build an Azure Function App that consumes the Service Bus and pushes the message to an API endpoint within your Optimizely build.
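As a rough sketch only (in-process function model; the queue name, connection setting name, and endpoint URL are placeholder assumptions, not a definitive implementation):

```csharp
// Hypothetical Azure Function that consumes the Service Bus queue and
// forwards each message to a custom API endpoint in the Optimizely build.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ForwardToOptimizely
{
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("ForwardToOptimizely")]
    public static async Task Run(
        [ServiceBusTrigger("customer-events", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        // POST the raw message body to the endpoint hosted in DXP.
        var response = await Http.PostAsync(
            "https://<your-site>/api/servicebus-inbox",
            new StringContent(message, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        log.LogInformation("Forwarded Service Bus message to the site.");
    }
}
```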
Option 2:
Use an Azure Event Grid Topic instead of a Service Bus. It works a lot like Azure Service Bus, but instead of you having to pull messages from the Service Bus, the Event Grid Topic pushes messages to an API of your choosing, at a rate your application can handle.
Be aware that in the case of Event Grid you will receive event metadata, not the actual content of the message. To get the content, you might need to do a round trip back to the source system that emitted the event and fetch more data.
Valdis, this depends on what you send to Event Grid in the first place. You can choose to include the actual content of the message in the Event Grid message; the data property of the Event Grid message structure is any object you want it to be:
```
{
  "topic": string,
  "subject": string,
  "id": string,
  "eventType": string,
  "eventTime": string,
  "data": {
    some-object-containing-data-you-want
  },
  "dataVersion": string,
  "metadataVersion": string
}
```
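For instance, here's a hedged sketch of publishing a custom event where data carries the full record, using the Azure.Messaging.EventGrid client; the topic endpoint, key, and Order type are invented for illustration:

```csharp
// Sketch: publish a custom Event Grid event whose "data" payload carries the
// full record, so subscribers don't need a round trip to the source system.
using System;
using Azure;
using Azure.Messaging.EventGrid;

public class Order // placeholder payload type
{
    public string Id { get; set; }
    public decimal Total { get; set; }
}

public static class OrderEventPublisher
{
    public static void Publish()
    {
        var client = new EventGridPublisherClient(
            new Uri("https://<your-topic>.westeurope-1.eventgrid.azure.net/api/events"),
            new AzureKeyCredential("<topic-access-key>"));

        var evt = new EventGridEvent(
            subject: "orders/1234",
            eventType: "Example.OrderUpdated",
            dataVersion: "1.0",
            data: new Order { Id = "1234", Total = 42.50m }); // full record in "data"

        client.SendEvent(evt);
    }
}
```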
Yup, it depends (forgot to add that). If you use Azure's built-in events (like a blob storage update), you have to fetch the actual content on your own.
In my last implementation, a data-change event in a separate application would push an event to an Event Grid Topic where the data property contained the new version of the record. But yes, you're right, it does depend on whether they are built-in events or something more custom :)
Yeah, when you are on your own you can add whatever you need, and the kitchen sink... ;)
I can only agree with Stotty. Create an Azure Function. It's super simple to create, and simple to consume the Service Bus from. It's not included in the DXP offering, so you would need to set one up yourself, but the cost of hosting that function will be close to zero.
If you don't need real-time updates, then go ahead and do a scheduled job that runs every x seconds. Just say no to WebJobs...😉
There is an even simpler solution that you can hack inside the Optimizely site: you can utilize .NET hosted services. A hosted service runs in the background, always alive (as long as your site is alive), and can launch a message receiver on the Service Bus.
The initialization sequence might be out of sync with the CMS (background services are initialized at the very beginning of the app), but you should be able to request everything you need from the IoC container.
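Something along these lines, as a sketch only, assuming the Azure.Messaging.ServiceBus client; the queue name and connection string are placeholders:

```csharp
// Sketch: a .NET hosted service that starts a Service Bus processor when the
// site starts and stops it on shutdown. Queue name and connection string are
// placeholders; uses the Azure.Messaging.ServiceBus client library.
using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.Extensions.Hosting;

public class QueueListenerService : BackgroundService
{
    private readonly ServiceBusClient _client;
    private readonly ServiceBusProcessor _processor;

    public QueueListenerService()
    {
        _client = new ServiceBusClient("<service-bus-connection-string>");
        _processor = _client.CreateProcessor("customer-events",
            new ServiceBusProcessorOptions { AutoCompleteMessages = false });
        _processor.ProcessMessageAsync += OnMessageAsync;
        _processor.ProcessErrorAsync += OnErrorAsync;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await _processor.StartProcessingAsync(stoppingToken);
        try
        {
            // Keep the background service alive until the host shuts down.
            await Task.Delay(Timeout.Infinite, stoppingToken);
        }
        catch (TaskCanceledException) { /* normal shutdown */ }
    }

    private async Task OnMessageAsync(ProcessMessageEventArgs args)
    {
        Console.WriteLine(args.Message.Body.ToString()); // process the message
        await args.CompleteMessageAsync(args.Message);   // then settle it
    }

    private Task OnErrorAsync(ProcessErrorEventArgs args)
    {
        Console.WriteLine(args.Exception);
        return Task.CompletedTask;
    }

    public override async Task StopAsync(CancellationToken cancellationToken)
    {
        await _processor.StopProcessingAsync(cancellationToken);
        await base.StopAsync(cancellationToken);
    }
}
```

Register it in Startup with services.AddHostedService<QueueListenerService>().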
Hello,
I have a scenario where I need to receive messages from a customer's own Azure Service Bus queues.
I'm intending to use scheduled jobs for this purpose, to listen to the queues and process the messages accordingly. As this will be a long-running job, I'm looking to leverage the dedicated scheduled-job web app set up as described in this blog.
However, I'm not sure how to set up listening to the queue as a scheduled job. Typically, scheduled jobs are configured to run at specified intervals, where the job starts and then ends after processing is completed. In my case, I need the scheduled job to start automatically and continually listen to the queue to receive messages. Is this possible to achieve with the scheduled job functionality? Do I have to use an infinite delay timer after starting the queue listener in the job's Execute method, to ensure the job is always running? Hoping there is a better way to achieve this; I'd appreciate any ideas.
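For reference, here's a rough sketch of the infinite-delay idea as an Optimizely scheduled job; the display name, GUID, and the listener wiring are placeholders:

```csharp
// Sketch only: a "never-ending" scheduled job that starts a queue listener and
// then blocks until the job is stopped. Names and GUID are placeholders.
using System.Threading;
using EPiServer.PlugIn;
using EPiServer.Scheduler;

[ScheduledPlugIn(DisplayName = "Service Bus queue listener",
    GUID = "00000000-0000-0000-0000-000000000001")]
public class QueueListenerJob : ScheduledJobBase
{
    private readonly ManualResetEventSlim _stopSignal = new ManualResetEventSlim();

    public QueueListenerJob()
    {
        IsStoppable = true; // allow the job to be stopped from admin mode
    }

    public override string Execute()
    {
        // Start the queue listener here (e.g. a ServiceBusProcessor), then
        // block so the job never "completes" and keeps listening.
        _stopSignal.Wait();
        return "Queue listener stopped.";
    }

    public override void Stop()
    {
        _stopSignal.Set();
    }
}
```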
Thanks