I have a scenario where I need to receive messages from a customer's own Azure Service Bus queues.
I'm intending to use scheduled jobs for this purpose, to listen to the queues and process the messages accordingly. As this will be a long-running job, I'm looking to leverage the dedicated scheduled job web app set up as described in this blog.
However, I'm not sure how to set up listening to the queue as a scheduled job. Typically, scheduled jobs are configured to run at specified intervals, where the job starts and ends once processing is complete. In my case, I need the scheduled job to start automatically and continually listen to the queue to receive messages. Is this possible to achieve with the scheduled job functionality? Do I have to use an infinite delay timer after starting the queue listener in the job's Execute method, to ensure the job is always running? Hoping there is a better way to achieve this; appreciate any ideas.
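To illustrate the workaround I'm describing, here's a minimal sketch (assuming Optimizely's ScheduledJobBase and the Azure.Messaging.ServiceBus SDK; the connection string, queue name, and GUID are placeholders):

```csharp
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using EPiServer.PlugIn;
using EPiServer.Scheduler;

// Placeholder GUID -- replace with your own.
[ScheduledPlugIn(DisplayName = "Service Bus Queue Listener",
    GUID = "11111111-1111-1111-1111-111111111111")]
public class QueueListenerJob : ScheduledJobBase
{
    public override string Execute()
    {
        var client = new ServiceBusClient("<customer-connection-string>");
        ServiceBusProcessor processor = client.CreateProcessor("<queue-name>");

        processor.ProcessMessageAsync += async args =>
        {
            // Handle the message, then settle it.
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args => Task.CompletedTask; // log in practice

        processor.StartProcessingAsync().GetAwaiter().GetResult();

        // The "infinite delay" that stops the job from ever completing.
        Task.Delay(Timeout.Infinite).GetAwaiter().GetResult();
        return "Unreachable";
    }
}
```

This works, but it feels like a misuse of the scheduled job lifecycle, hence the question.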
A scheduled job is a pull mechanism. If I understand your requirement correctly, you're after real-time listening and processing, and a scheduled job doesn't seem to fit your use case. Since you're using an Azure Service Bus queue, there are many other options out there; is there any specific reason you intend to use a scheduled job?
Thanks for replying.
Correct, I'm after real-time listening. Ideally an Azure Function would fit this use case; however, as I understand it, Azure Functions are not currently supported in DXP.
A scheduled job seems the only other logical choice, which also allows leveraging the dedicated web app to run these jobs separately from the primary web app.
I'd be keen to understand the other options for real-time listening that won't have any performance impact on the primary web app.
Isn't your customer using their own Azure Service Bus queue, not one from DXP? If that's correct, I believe your customer must already have their own Azure subscription. Here is the idea: write your own Azure Function (or use a Logic App) and deploy it under your client's Azure subscription, then create a custom API endpoint hosted in DXP and post the data to it in a secured way.
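As a sketch of that idea, using the in-process Azure Functions model with a Service Bus trigger (the DXP endpoint URL, header name, and setting names are illustrative placeholders you'd replace with your own):

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public class ForwardToDxp
{
    private static readonly HttpClient Http = new HttpClient();

    // Fires for each message on the customer's queue and forwards it to DXP.
    [FunctionName("ForwardToDxp")]
    public async Task Run(
        [ServiceBusTrigger("<queue-name>", Connection = "ServiceBusConnection")]
        string message)
    {
        using var request = new HttpRequestMessage(
            HttpMethod.Post, "https://<your-dxp-site>/api/inbound-messages")
        {
            Content = new StringContent(message, Encoding.UTF8, "application/json")
        };
        // Secure the endpoint, e.g. with an API key stored in app settings.
        request.Headers.Add("X-Api-Key", Environment.GetEnvironmentVariable("DxpApiKey"));
        (await Http.SendAsync(request)).EnsureSuccessStatusCode();
    }
}
```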
We are on the same page :). We did consider a message-forwarding approach from the client's Azure subscription; however, it had its own challenges.
I found another approach (thanks to Nikhil) using WebJobs in DXP. I didn't realise WebJobs were supported in DXP; was this feature recently added?
Looks like we would be able to implement real-time listening and asynchronous processing of messages using a WebJob in DXP.
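A continuous WebJob would boil down to something like this (again using the Azure.Messaging.ServiceBus SDK; the connection string and queue name are placeholders that would come from configuration in practice):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class Program
{
    public static async Task Main()
    {
        await using var client = new ServiceBusClient("<customer-connection-string>");
        await using ServiceBusProcessor processor = client.CreateProcessor(
            "<queue-name>",
            new ServiceBusProcessorOptions { MaxConcurrentCalls = 4 });

        processor.ProcessMessageAsync += async args =>
        {
            // Process the message asynchronously, then settle it.
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args =>
        {
            Console.Error.WriteLine(args.Exception);
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
        await Task.Delay(Timeout.Infinite); // keep the continuous WebJob alive
    }
}
```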
Glad you found an alternative. I'm not aware of WebJobs being supported in DXP; you might need to email support for further clarification.
The blog post is an interesting idea; however, I would personally want a clearer understanding of how things work before pursuing this approach. For example: how easily can you access the job's logs if you need to troubleshoot? How would you deploy this to the higher environments (pre-prod and prod)? How is the WebJob affected during a deployment? How does it work across multiple instances?
Let me know once you find out these answers :)
DXP only recently started supporting WebJobs; I believe all native WebJobs functionality is supported. The WebJob is deployed together with the web app, so there are no changes to the deployment flow.
We did raise a ticket with support; I'll forward it to you for your reference.
Adding a link to the official documentation for WebJobs. It should be noted that only continuous WebJobs are supported in DXP:
> "How does it work in multiple instances?"
Usually the WebJobs runtime uses Azure Storage blob leases to coordinate runs across multiple instances.
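If you need the listener to run on only one instance, a continuous WebJob can also be marked as a singleton via its settings.job file (worth confirming with DXP support that this behaves the same there):

```json
{
  "is_singleton": true
}
```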