
Batch completion with multiple receivers on Azure Service Bus

In the last post, we created multiple receivers with multiple factories to speed up our message processing. Unfortunately, this has consequences for our background completion loop. Since the message processing logic is shared between multiple receivers, all the receivers will try to push lock tokens onto the concurrent stack. The following code will therefore be called concurrently by numberOfReceivers * concurrencyPerReceiver. So for example when we’d use 10 receivers with each a...
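The sharing described above can be sketched as follows. This is a minimal sketch, assuming the series' shared ConcurrentStack<Guid> is called lockTokens and the receivers use the OnMessageAsync pump from Microsoft.ServiceBus.Messaging; the names and concurrency values are illustrative.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// One stack shared by all receivers; the background completion loop drains it.
var lockTokens = new ConcurrentStack<Guid>();

// Every receiver registers the same callback, so with 10 receivers and a
// concurrency of 32 per receiver, up to 320 invocations push concurrently.
receiver.OnMessageAsync(message =>
{
    // process the message here, then hand the lock token to the
    // background completion loop instead of completing inline
    lockTokens.Push(message.LockToken);
    return Task.CompletedTask;
}, new OnMessageOptions { AutoComplete = false, MaxConcurrentCalls = 32 });
```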

Multiple message receivers with Azure Service Bus

In the first post of this series, we looked at a simple message receive loop. We tried to optimize the receive loop by moving the message completion out into a dedicated background operation. It turns out we can do more to boost the message throughput. Here is what our simple loop looked like We created a queue client with QueueClient.CreateFromConnection. If we carefully read the best practices for improvements using Service Bus Messaging we can see that there are multiple ways of creating queue...

Async method without cancellation support, do it my way.

In the last post, I talked about Dennis Doomen’s LiquidProjections project and the challenge they faced with asynchronous APIs that were not cancelable. Dennis came up with the following solution to the infinitely running Task.Delay operations. The code above creates a linked token source and an infinitely delayed task which observes the token referenced by the linked token source. When the outer token source is canceled, the linked token source will also cancel the token it owns. When the task...
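The pattern described above can be sketched like this. It is a sketch of the general technique, not Dennis’s exact code; the method name and generic shape are illustrative assumptions.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static async Task<T> WithCancellation<T>(Task<T> nonCancelable, CancellationToken token)
{
    using (var linked = CancellationTokenSource.CreateLinkedTokenSource(token))
    {
        // An infinite Task.Delay observes the linked token; when the outer
        // token cancels, the linked source cancels and the delay completes.
        var delay = Task.Delay(Timeout.Infinite, linked.Token);
        var finished = await Task.WhenAny(nonCancelable, delay).ConfigureAwait(false);
        if (finished == delay)
        {
            throw new OperationCanceledException(token);
        }
        linked.Cancel(); // stop the infinite delay so it doesn't linger
        return await nonCancelable.ConfigureAwait(false);
    }
}
```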

Improved batch completion loop for Azure Service Bus

In the last post, we created a dedicated completion loop inside a worker thread to complete messages in batches. There are a few obvious or not so obvious things we can improve in our code. But first, how did it look again? We could make a simple assumption: if the numberOfItems returned by TryPopRange is equal to the maximum lock token range we want to complete in batches (here one hundred), then we potentially have more things to complete, and we can try to avoid the delay. So if we were...

Another attempt to batch complete with Azure Service Bus

In the last post, we tried to be smart with batch completion and completed every one-hundredth message. Unfortunately, this way of doing completion might not work well when there are only a few messages in the queue. We decided to go back to the drawing board and see if we could come up with a better approach which can cope with both high and low loads of messages. The heart of the code we wrote in the last post (omitting the exception handling part) looks like Instead of completing messages in batches of...

Async method without cancellation support, oh my!

Sometimes you have to interact with asynchronous APIs that you don’t control. Those APIs might be executing for a very long time but have no way to cancel the request. Dennis Doomen was exactly in such a situation while building his opinionated Event Sourcing projections library for .NET called LiquidProjections. The library is using NEventStore under the covers. The actual access to the NEventStore is adapted in a class called NEventStoreAdapter. The adapter takes care of loading pages...

Let’s try batch completion of messages on Azure Service Bus

In the last post, we deferred the message completion task into the background to remove the additional latency from the message receive callback. This time we want to save even more latency by batching multiple message completions together into a single Azure Service Bus broker call. It turns out this is possible with the Azure SDK. The MessageReceiver has a method called CompleteBatchAsync which accepts an IEnumerable<Guid> called lockTokens. But where do we get these lock tokens from...
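The batching call mentioned above can be sketched as follows. CompleteBatchAsync(IEnumerable<Guid>) is the MessageReceiver method the post names; treating the collected lock tokens as a ConcurrentStack<Guid> drained via TryPopRange is an assumption carried over from the rest of the series.

```csharp
using System;
using System.Linq;
using System.Collections.Concurrent;

// Drain up to 100 collected lock tokens and complete them
// with a single Azure Service Bus broker call.
var buffer = new Guid[100];
int count = lockTokens.TryPopRange(buffer);
if (count > 0)
{
    await receiver.CompleteBatchAsync(buffer.Take(count));
}
```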

Defer message completion on Azure Service Bus

In the last post, we started sketching code around the idea of deferring the actual message completion out of the message receive callback. A simple way to achieve our goals would be to kick off the completion task without awaiting it, like the following code shows This is inelegant in multiple ways. Since the code is contained in a delegate which is marked as async, the compiler would warn us with CS4014 that the call is not awaited. Of course, we could work around that by suppressing the compiler...

Latency to Azure Service Bus matters

In the last post, we optimized the throughput on the client by increasing the concurrency of the message receive loop. What else could we try to improve? Let’s have a closer look at the lambda body of the OnMessageAsync declaration. As we can see, we have technically two operations inside the callback delegate. The first operation is the code we want to execute when a message is pushed to us (here represented with comment). The second operation is the completion or abandon of the message...

Let’s save our Thanksgiving sales with Azure Service Bus

As we saw in the last post, naive receiving of messages with Azure Service Bus destroys our Thanksgiving sales. But how can we do better to satisfy our bosses and, first and foremost, our precious customers? As Sean Feldman rightfully pointed out in the comments section, our sample was using MaxConcurrentCalls set to one. How silly! Who wrote that… Oh, it was me 😉 So how about we correct my mistake and set it to 100, like the following With this tiny change, we can suddenly receive 100...
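The tweak described above can be sketched like this, assuming the OnMessageAsync pump from Microsoft.ServiceBus.Messaging used throughout the series; everything apart from MaxConcurrentCalls is illustrative.

```csharp
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

var options = new OnMessageOptions
{
    // was 1; now up to 100 messages are pushed to the callback concurrently
    MaxConcurrentCalls = 100
};

queueClient.OnMessageAsync(message =>
{
    // message processing here
    return Task.CompletedTask;
}, options);
```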
