Latency to Azure Service Bus matters

In the last post, we optimized the throughput on the client by increasing the concurrency of the message receive loop. What else could we try to improve? Let’s take a closer look at the lambda body we pass to OnMessageAsync.

receiveClient.OnMessageAsync(async message =>
{
   // do something with the message
   await message.CompleteAsync().ConfigureAwait(false);
},..)

As we can see, there are technically two operations inside the callback delegate. The first is the code we want to execute when a message is pushed to us (represented here by the comment). The second is the completion or abandonment of the received message, depending on the outcome of the business logic.

For the business logic, there is an important aspect we need to take into account: if it calls into I/O-bound resources, the whole call stack of the business logic on that I/O-bound path needs to be asynchronous and return a Task or Task<T>.

receiveClient.OnMessageAsync(async message =>
{
   await DoSomethingWithTheMessageAsync().ConfigureAwait(false);
   await message.CompleteAsync().ConfigureAwait(false);
},..)

This makes the callback delegate fully asynchronous and therefore allows us to use the resources provided by the Azure SDK more efficiently, since we never block on the I/O-bound path. Naturally, we would also make sure that the implementation of DoSomethingWithTheMessageAsync is as efficient as possible (for example by following the best practices for async ADO.NET calls when we call into an Azure SQL Database, and so on).
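
To illustrate, here is a minimal sketch of what such an implementation could look like, assuming the message is passed in, the payload deserializes into a hypothetical OrderData type, and the business logic writes it to an Azure SQL Database via System.Data.SqlClient. The connection string, table and column names are made up for the example.

// requires Microsoft.ServiceBus.Messaging and System.Data.SqlClient
static async Task DoSomethingWithTheMessageAsync(BrokeredMessage message)
{
   // deserialize the payload carried by the BrokeredMessage
   var order = message.GetBody<OrderData>();

   // connectionString would come from configuration (assumption for this sketch)
   using (var connection = new SqlConnection(connectionString))
   using (var command = new SqlCommand(
      "INSERT INTO Orders (Id, Payload) VALUES (@Id, @Payload)", connection))
   {
      command.Parameters.AddWithValue("@Id", order.Id);
      command.Parameters.AddWithValue("@Payload", order.Payload);

      // async all the way down, no blocking calls on the I/O-bound path
      await connection.OpenAsync().ConfigureAwait(false);
      await command.ExecuteNonQueryAsync().ConfigureAwait(false);
   }
}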

Could we save more? How about the second operation? Can we just leave it out? Not really: if we left it out, the message would never be completed and would therefore reappear every time the PeekLock duration expires. We’d be processing the same Thanksgiving order over and over again. So at some point, we need to complete the message. But what if we could complete it at a later stage? It turns out we can, by making a few trade-offs:

receiveClient.OnMessageAsync(async message =>
{
   try {
      await DoSomethingWithTheMessageAsync().ConfigureAwait(false);
      // omit the completion of the message
   }
   catch(Exception) {
      // in case of an exception make the message available again immediately
      await message.AbandonAsync().ConfigureAwait(false);
   }
},..)

Assuming exceptions are, as the name suggests, “exceptional”, we abandon the message immediately when an exception occurs in our message processing logic, making it available again right away without having to wait for the PeekLock duration to expire. In the good case, we omit the completion of the BrokeredMessage and delay it slightly.

Why is that useful? The CompleteAsync operation on the brokered message calls into Azure Service Bus. It is essentially a remote call, which can take a few dozen milliseconds depending on the latency to the data center you’re connecting to. While I’m writing this post, the latency from my computer to Azure is roughly 40-50 ms (usually it averages 20-30 ms)*. By omitting the complete call, we save about 40 ms per message receive in my example. Saving 40 ms per message over a load of 120K messages means we save 4.8 million milliseconds (4,800 seconds, or 80 minutes) on the entire batch of messages.
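
If you want to get a feeling for that cost on your own connection, a quick and dirty way is to wrap the completion call in a Stopwatch. This is purely for illustration, not something you would keep in production code.

// Stopwatch lives in System.Diagnostics
var stopwatch = Stopwatch.StartNew();
await message.CompleteAsync().ConfigureAwait(false);
stopwatch.Stop();
Console.WriteLine($"CompleteAsync took {stopwatch.ElapsedMilliseconds} ms");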

By not hammering Azure Service Bus for each message received, we gain more throughput because the overall execution time per received message is reduced. So instead of bobby car speed, we are probably now at rusty old car speed.

But hey, I’m totally cheating now! At some point, we need to complete the messages so that they don’t reappear, and we should also make sure we complete them within the PeekLock duration of a received message. But that is a topic for another post!
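
Without spoiling too much, a minimal sketch of the idea could look roughly like this, assuming receiveClient exposes CompleteBatchAsync (as QueueClient and MessageReceiver do) and that something else, for example a timer, flushes the collected lock tokens well within the PeekLock duration. The names are purely illustrative.

// lock tokens of successfully processed messages; a concurrent collection
// (System.Collections.Concurrent) because the receive loop runs with high concurrency
var lockTokens = new ConcurrentQueue<Guid>();

receiveClient.OnMessageAsync(async message =>
{
   try {
      await DoSomethingWithTheMessageAsync().ConfigureAwait(false);
      // remember the lock token instead of completing right away
      lockTokens.Enqueue(message.LockToken);
   }
   catch(Exception) {
      await message.AbandonAsync().ConfigureAwait(false);
   }
},..)

// elsewhere, triggered well within the PeekLock duration
var tokens = new List<Guid>();
Guid token;
while (lockTokens.TryDequeue(out token))
{
   tokens.Add(token);
}
if (tokens.Count > 0)
{
   await receiveClient.CompleteBatchAsync(tokens).ConfigureAwait(false);
}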

* Of course, placing your application into the same data center would decrease the latency. But there will always be latency. Remember, the speed of light is a constant ;)
