Tag: Azure Service Bus

Introduction to the MultiProducerConcurrentConsumer for Azure Service Bus message completion

In the last post in the series about Azure Service Bus message completion, I briefly talked about how message batch completion could benefit from a more reactive approach. Before I dive into the inner workings of the reactive structure that I’m going to outline in the next few posts, I need to briefly recap the functional and non-functional requirements of the component. The component shall: make sure messages are only completed on the receiver they came from, reduce the number of...
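To make those requirements a little more tangible before the deep dive, here is a purely hypothetical sketch of the shape such a component could take; the interface name, its members, and the slot idea are my own assumptions for illustration, not the actual design derived in the posts:

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical sketch only; the real component is developed over the next posts.
// TItem would carry a lock token plus enough context (a slot per receiver)
// to complete each message on the receiver it came from.
interface IMultiProducerConcurrentConsumer<TItem>
{
    // Producers (the receive callbacks) push items without blocking each other.
    void Push(TItem item, int slotNumber);

    // Start pumping accumulated items in batches, one batch per slot,
    // so items never cross receiver boundaries.
    void Start(Func<TItem[], int, Task> pumpPerSlot);

    // Drain whatever is left before shutting down.
    Task Complete();
}
```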

Complete messages where they came from with Azure Service Bus

In the last post, we made a simplistic attempt to speed up batch completion of messages by having multiple dedicated background completion tasks. Unfortunately, this introduced an interesting side effect. That code uses a single receive client to complete message lock tokens in batches. Those lock tokens could come from multiple receive clients, but they would all end up being completed on the same instance. Depending on the transport type you use, it might work or it might not. When you...
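One possible way to express the "complete on the receiver it came from" requirement is sketched below, assuming we track lock tokens per receiver; the class and member names are mine, only CompleteBatchAsync comes from the SDK:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// Sketch: keep one lock token stack per MessageReceiver so that tokens
// are only ever completed on the receiver they were received from.
class PerReceiverCompletion
{
    readonly MessageReceiver[] receivers;
    readonly ConcurrentStack<Guid>[] lockTokens;

    public PerReceiverCompletion(MessageReceiver[] receivers)
    {
        this.receivers = receivers;
        lockTokens = receivers.Select(_ => new ConcurrentStack<Guid>()).ToArray();
    }

    // Called from the receive callback of the receiver at `receiverIndex`.
    public void Push(int receiverIndex, Guid lockToken)
        => lockTokens[receiverIndex].Push(lockToken);

    // Complete each receiver's tokens on that same receiver instance.
    public async Task CompleteOutstandingAsync()
    {
        for (var i = 0; i < receivers.Length; i++)
        {
            var buffer = new Guid[100];
            var popped = lockTokens[i].TryPopRange(buffer);
            if (popped > 0)
            {
                await receivers[i].CompleteBatchAsync(buffer.Take(popped)).ConfigureAwait(false);
            }
        }
    }
}
```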

Batch completion with multiple receivers on Azure Service Bus

In the last post, we created multiple receivers with multiple factories to speed up our message processing. Unfortunately, this has some consequences for our background completion loop. Since the message processing logic is shared between multiple receivers, all the receivers will try to push lock tokens into the same concurrent stack. The following code will therefore be executed by numberOfReceivers * concurrencyPerReceiver callers concurrently. For example, when we use 10 receivers, each with a...
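As a rough sketch of that situation (the connection string, queue name, and numbers are placeholders of mine), every concurrent callback of every receiver funnels its lock token into the same shared stack:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static class ContendedCompletionSetup
{
    // Shared by all receivers: with 10 receivers and a concurrency of 32 each,
    // up to 10 * 32 callbacks push into this single stack concurrently.
    static readonly ConcurrentStack<Guid> lockTokensToComplete = new ConcurrentStack<Guid>();

    public static void Start(string connectionString, string queueName,
        int numberOfReceivers = 10, int concurrencyPerReceiver = 32)
    {
        for (var i = 0; i < numberOfReceivers; i++)
        {
            var factory = MessagingFactory.CreateFromConnectionString(connectionString);
            var client = factory.CreateQueueClient(queueName);
            client.OnMessageAsync(async message =>
            {
                await Task.Delay(10); // stand-in for the actual message processing
                lockTokensToComplete.Push(message.LockToken); // contended by every callback
            },
            new OnMessageOptions { AutoComplete = false, MaxConcurrentCalls = concurrencyPerReceiver });
        }
    }
}
```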

Multiple message receivers with Azure Service Bus

In the first post of this series, we looked at a simple message receive loop. We tried to optimize the receive loop by moving the message completion out into a dedicated background operation. It turns out we can do more to boost the message throughput. Here is how our simple loop looked. We created a queue client with QueueClient.CreateFromConnectionString. If we carefully read the best practices for performance improvements using Service Bus Messaging, we can see that there are multiple ways of creating queue...
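A minimal sketch of the direction this takes, under the assumption that we follow the guidance about dedicated factories (the helper name and parameters are mine): each MessagingFactory maintains its own connection, so creating one factory per client spreads the receiving work instead of multiplexing everything over a single QueueClient.

```csharp
using System.Collections.Generic;
using Microsoft.ServiceBus.Messaging;

static class ClientSetup
{
    // Sketch: one MessagingFactory (and thus one connection) per queue client,
    // instead of a single QueueClient.CreateFromConnectionString call.
    public static IReadOnlyList<QueueClient> CreateClients(
        string connectionString, string queueName, int numberOfReceivers)
    {
        var clients = new List<QueueClient>();
        for (var i = 0; i < numberOfReceivers; i++)
        {
            var factory = MessagingFactory.CreateFromConnectionString(connectionString);
            clients.Add(factory.CreateQueueClient(queueName));
        }
        return clients;
    }
}
```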

Improved batch completion loop for Azure Service Bus

In the last post, we created a dedicated completion loop inside a worker thread to complete messages in batches. There are a few obvious or not so obvious things we can improve in our code. But first, how did it look again? We could make a simple assumption: if the numberOfItems returned by TryPopRange is equal to the maximum lock token range we want to complete in batches (here one hundred), then we potentially have more things to complete, and we can try to avoid the delay. So if we were...
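Sketched under the assumptions of the series so far (a shared ConcurrentStack<Guid> of lock tokens and a MessageReceiver to complete them on), the idea reads roughly like this: only delay when the last TryPopRange did not return a full batch.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static class ImprovedCompletionLoop
{
    // Sketch: skip the delay when a full batch was popped, because more
    // lock tokens are probably already waiting. Exception handling omitted.
    public static async Task RunAsync(ConcurrentStack<Guid> lockTokensToComplete,
        MessageReceiver receiver, CancellationToken token)
    {
        var buffer = new Guid[100]; // maximum lock token range per batch
        while (!token.IsCancellationRequested)
        {
            var numberOfItems = lockTokensToComplete.TryPopRange(buffer);
            if (numberOfItems > 0)
            {
                await receiver.CompleteBatchAsync(buffer.Take(numberOfItems)).ConfigureAwait(false);
            }
            if (numberOfItems < buffer.Length)
            {
                // Not a full batch: the stack is probably drained, so back off.
                await Task.Delay(TimeSpan.FromSeconds(1)).ConfigureAwait(false);
            }
        }
    }
}
```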

Another attempt to batch complete with Azure Service Bus

In the last post, we tried to be smart with batch completion and completed every one-hundredth message. Unfortunately, this way of doing completion might not work well with only a few messages in the queue. We decided to go back to the drawing board and see if we could come up with a better approach that copes with high as well as low message loads. The heart of the code we wrote in the last post (omitting the exception handling part) looks like this. Instead of completing messages in batches of...
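To recall why the counting approach falls over, here is a rough reconstruction of its shape (not the exact code from the post; names are mine): with only a handful of messages in the queue the counter rarely reaches one hundred, so lock tokens sit around until their locks expire.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static class CountingCompletion
{
    static readonly ConcurrentStack<Guid> lockTokensToComplete = new ConcurrentStack<Guid>();
    static int numberOfOutstandingTokens;

    // Rough reconstruction of the "complete every one-hundredth message" idea.
    public static async Task OnMessage(BrokeredMessage message, MessageReceiver receiver)
    {
        // ... message processing ...
        lockTokensToComplete.Push(message.LockToken);

        // Only every hundredth callback triggers a batch completion. With a
        // low message volume this branch is rarely (or never) reached.
        if (Interlocked.Increment(ref numberOfOutstandingTokens) % 100 == 0)
        {
            var buffer = new Guid[100];
            var popped = lockTokensToComplete.TryPopRange(buffer);
            if (popped > 0)
            {
                await receiver.CompleteBatchAsync(buffer.Take(popped)).ConfigureAwait(false);
            }
        }
    }
}
```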

Let’s try batch completion of messages on Azure Service Bus

In the last post, we deferred the message completion task into the background to remove the additional latency from the message receive callback. This time we want to save even more latency by batching multiple message completions together into a single Azure Service Bus broker call. It turns out this is possible with the Azure SDK. The MessageReceiver has a method called CompleteBatchAsync which accepts an IEnumerable<Guid> parameter called lockTokens. But where do we get these lock tokens from...
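The lock token is exposed on each received BrokeredMessage, so a minimal sketch of collecting them for a later CompleteBatchAsync call could look like this (the collection choice and names are my assumptions):

```csharp
using System;
using System.Collections.Concurrent;
using Microsoft.ServiceBus.Messaging;

static class LockTokenCollection
{
    // Every received message exposes its lock token; we stash it away so a
    // later CompleteBatchAsync call can complete many messages at once.
    static readonly ConcurrentStack<Guid> lockTokensToComplete = new ConcurrentStack<Guid>();

    public static void Register(BrokeredMessage message)
    {
        lockTokensToComplete.Push(message.LockToken);
    }
}
```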

Defer message completion on Azure Service Bus

In the last post, we started sketching code around the idea of deferring the actual message completion out of the message receive callback. A simple way to achieve our goals would be to kick off the completion task without awaiting it, as the following code shows. This is inelegant in multiple ways. Since the code is contained in a delegate which is marked as async, the compiler warns us with CS4014 that the call is not awaited. Of course, we could work around that by suppressing the compiler...
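For reference, the fire-and-forget variant discussed here looks roughly like the following sketch (the processing placeholder is mine); the unawaited CompleteAsync call is exactly what triggers CS4014:

```csharp
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static class FireAndForgetCompletion
{
    public static void Register(QueueClient client)
    {
        client.OnMessageAsync(async message =>
        {
            await Task.Delay(10); // stand-in for the actual message processing

            // Kicks off completion without awaiting it. The compiler warns with
            // CS4014 because the returned task is dropped; any failure during
            // completion goes unobserved.
            message.CompleteAsync();
        },
        new OnMessageOptions { AutoComplete = false });
    }
}
```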

Latency to Azure Service Bus matters

In the last post, we optimized the throughput on the client by increasing the concurrency of the message receive loop. What else could we try to improve? Let’s have a closer look at the lambda body of the OnMessageAsync declaration. As we can see, there are technically two operations inside the callback delegate. The first operation is the code we want to execute when a message is pushed to us (here represented with a comment). The second operation is the completion or abandonment of the message...
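Reconstructed roughly from that description (the processing comment is a placeholder), the callback has this shape; note that both the processing and the completion round trip to the broker happen before the callback returns:

```csharp
using Microsoft.ServiceBus.Messaging;

static class ReceiveLoop
{
    public static void Register(QueueClient client)
    {
        client.OnMessageAsync(async message =>
        {
            // 1) the code we actually want to execute for the pushed message

            // 2) the completion (or abandon) of the message: an additional
            //    broker round trip inside every single callback
            await message.CompleteAsync();
        },
        new OnMessageOptions { AutoComplete = false });
    }
}
```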

Let’s save our Thanksgiving sales with Azure Service Bus

As we saw in the last post, naive receiving of messages with Azure Service Bus destroys our Thanksgiving sales. But how can we do better to satisfy our bosses and, first and foremost, our precious customers? As Sean Feldman rightfully pointed out in the comments section, our sample was using MaxConcurrentCalls set to one. How silly! Who wrote that… Oh, it was me 😉 So how about we correct my mistake and set it to 100, like the following. With this tiny change, we can suddenly receive 100...
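In sketch form (the connection string, queue name, and processing placeholder are mine), the change amounts to a single option:

```csharp
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static class ConcurrentReceive
{
    public static void Start(string connectionString, string queueName)
    {
        var client = QueueClient.CreateFromConnectionString(connectionString, queueName);
        client.OnMessageAsync(async message =>
        {
            await Task.Delay(10); // stand-in for the actual message processing
            await message.CompleteAsync();
        },
        new OnMessageOptions
        {
            AutoComplete = false,
            MaxConcurrentCalls = 100 // was 1: now up to 100 callbacks run concurrently
        });
    }
}
```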
