This post shows how to batch multiple messages together into one single operation against Azure Service Bus. For more posts in this series, go to Contents.
In the article about atomic sends, we talked about the fact that each send operation executed against Azure Service Bus can fail and potentially has to be retried. Another important aspect that wasn’t covered in the previous article is that every send operation has latency costs associated with it. Let’s assume we want to send ten messages to a destination like the following:
for (var i = 0; i < 10; i++)
{
    var message = new ServiceBusMessage($"Deep Dive{i}");
    await sender.SendMessageAsync(message);
}
From my crappy home DSL over Wi-Fi, I get the following latencies to the closest Azure datacenters. Of course, these figures are not realistic, because you would not run your code over Wi-Fi, and hopefully your code would run as PaaS directly in Azure, so your latency would be much lower.

But for the sake of this example, let’s take the average of all those latencies, which is a nicely rounded 50 ms (keep in mind that averages are fairly meaningless in real-world scenarios, but I don’t want to complicate things here unnecessarily). Sending the above messages will therefore take roughly 500 ms, because we are “charged” the average latency cost on every send operation. But I have good news for you: in cases where we know we are going to send multiple messages to the same destination, we can collect them and send them in one single operation to Azure Service Bus.
var messages = new List<ServiceBusMessage>();
for (var i = 0; i < 10; i++)
{
    var message = new ServiceBusMessage($"Deep Dive{i}");
    messages.Add(message);
}
await sender.SendMessagesAsync(messages);
Now all ten messages are batched together into a single operation that might need to be retried, but if it is successful we only pay the latency cost once, which in our example is 50 ms, a saving of 450 ms in total. You might be thinking: great, let’s throw as many messages as we can into a single batch. I have some news for you, though: there are limits. The limits are the following:
- You can batch up to 256 KB of messages into a single operation on the standard tier and up to 1 MB of messages into a single operation on the premium tier
- For transactional sends (atomic operations within a TransactionScope) you can batch up to 100 messages into a single operation
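Before filling a batch, you can ask the library what size limit applies to the connection. The following sketch assumes an existing `ServiceBusSender` named `sender`; the exact byte values you see will depend on your namespace’s tier:

```csharp
// Sketch: inspecting the size limit the service reports for a batch.
// Assumes an existing ServiceBusSender named "sender".
using ServiceBusMessageBatch batch = await sender.CreateMessageBatchAsync();

// MaxSizeInBytes reflects the limit negotiated with the service
// (roughly 256 KB on the standard tier, 1 MB on premium).
Console.WriteLine($"Maximum batch size: {batch.MaxSizeInBytes} bytes");
Console.WriteLine($"Current batch size: {batch.SizeInBytes} bytes");
```

This can be handy for logging or for sanity checks before you start adding messages.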
So on the standard tier, the following code would throw a ServiceBusException with reason MessageSizeExceeded, because the total bytes we are sending to Azure Service Bus exceed 256 KB.
var messages = new List<ServiceBusMessage>();
for (var i = 0; i < 6500; i++)
{
    var message = new ServiceBusMessage($"Deep Dive{i}");
    messages.Add(message);
}

// will throw a ServiceBusException with reason MessageSizeExceeded
await sender.SendMessagesAsync(messages);
And for transactional sends, the following code will throw a ServiceBusException with reason QuotaExceeded, regardless of the tier being used.
var messages = new List<ServiceBusMessage>();
for (var i = 0; i < 101; i++)
{
    var message = new ServiceBusMessage($"Deep Dive{i}");
    messages.Add(message);
}

// will throw a ServiceBusException with reason QuotaExceeded
using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    await sender.SendMessagesAsync(messages);
    scope.Complete();
}
In the video below you can see the code in action against a standard tier.
There is also a convenience API that allows you to create a message batch. The batch automatically tracks its size and doesn’t accept more messages once the maximum size is reached. The size limit of the Azure Service Bus tier in use is automatically taken into account by the library.
var messagesToSend = new Queue<ServiceBusMessage>();
for (var i = 0; i < 4500; i++)
{
    var message = new ServiceBusMessage($"Deep Dive{i}. Deep Dive{i}. Deep Dive{i}.");
    messagesToSend.Enqueue(message);
}

var messageCount = messagesToSend.Count;
while (messagesToSend.Count > 0)
{
    using ServiceBusMessageBatch messageBatch = await sender.CreateMessageBatchAsync();

    // Try to add the first message; if even a single message doesn't fit,
    // it is too large to ever be sent.
    if (messageBatch.TryAddMessage(messagesToSend.Peek()))
    {
        messagesToSend.Dequeue();
    }
    else
    {
        throw new Exception($"Message {messageCount - messagesToSend.Count} is too large and cannot be sent.");
    }

    // Keep filling the batch until it is full or no messages are left.
    while (messagesToSend.Count > 0 && messageBatch.TryAddMessage(messagesToSend.Peek()))
    {
        messagesToSend.Dequeue();
    }

    await sender.SendMessagesAsync(messageBatch);
}
Unfortunately, the batch does not yet take into account the maximum number of messages you can send within a single batch. For example, on the standard tier it is only possible to send a maximum of 4500 messages per batch, yet TryAddMessage will only return false when the size limit is reached, not when the number-of-items limit is reached. This might be improved in the future once the SDK has the means to discover those tier-specific settings on the established connection link.
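Until the SDK enforces this itself, you can guard the count manually in the filling loop. The sketch below reuses `sender` and `messagesToSend` from the example above; `MaxMessagesPerBatch` is a hypothetical constant set to the standard-tier limit and would need to be adjusted for your namespace:

```csharp
// Sketch: additionally capping the number of messages per batch.
// Assumes "sender" and "messagesToSend" from the example above.
// MaxMessagesPerBatch is a hypothetical constant for the tier in use.
const int MaxMessagesPerBatch = 4500;

while (messagesToSend.Count > 0)
{
    using ServiceBusMessageBatch messageBatch = await sender.CreateMessageBatchAsync();

    // Stop filling the batch when either the size limit (TryAddMessage
    // returning false) or our own count limit is reached.
    while (messagesToSend.Count > 0
        && messageBatch.Count < MaxMessagesPerBatch
        && messageBatch.TryAddMessage(messagesToSend.Peek()))
    {
        messagesToSend.Dequeue();
    }

    await sender.SendMessagesAsync(messageBatch);
}
```

The `Count` property on the batch makes this check cheap; the trade-off is that the limit is hard-coded rather than discovered from the service.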
By default, the underlying AMQP client implementation also does some batching, even if you are not using the API shown in this blog post. The difference is that the batching over AMQP happens implicitly, based on predefined parameters, while here we are explicitly instructing the client to batch multiple well-known operations together.
Updated: 2021-03-23 to use the new SDK