MultiProducerConcurrentConsumer – Push it

In the last post, I introduced the start procedure for the MultiProducerConcurrentConsumer, including a rough outline of the timer loop responsible for pushing items in batches after the push interval has elapsed.

To understand how the batch size affects the push, let me first explain the push method. The push method's primary responsibility is to insert the item of type TItem into the right slot. To achieve that, the passed-in slot number is range-checked against the available number of slots (of course, a lower-bound check would not hurt either 😉 ). The slot number is then used to pick the concurrent queue that represents the slot, and the item is enqueued into it.

public void Push(TItem item, int slotNumber)
{
    if (slotNumber >= numberOfSlots)
    {
        throw new ArgumentOutOfRangeException(nameof(slotNumber),
          $"Slot number must be between 0 and {numberOfSlots - 1}.");
    }

    queues[slotNumber].Enqueue(item);

    // Single counter across all slots; multiple producers may push concurrently
    var incrementedCounter = Interlocked.Increment(ref numberOfPushedItems);

    if (incrementedCounter > batchSize)
    {
        // Wake the timer loop early instead of waiting for the push interval
        syncEvent.Set();
    }
}

The second responsibility of the push method is to count the number of items that were inserted into the MPCC. In my version of the MPCC, I made a simple trade-off: instead of having a counter per slot, I've chosen a single global counter called numberOfPushedItems which counts all pushed items. I think it is a reasonable trade-off; it makes the implementation much simpler but still achieves its goal (what do you think?). Since multiple threads could insert data concurrently, I'm using Interlocked.Increment to increment the counter. Whenever the incremented counter is greater than the batch size, the push method signals a synchronization primitive* to unleash the push process. But how does the MPCC know when the batch size is reached?
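To illustrate the counting scheme in isolation, here is a minimal, self-contained sketch. The names (SimulatedPush) and the SemaphoreSlim standing in for the synchronization primitive are my own illustration, not the component's API:

```csharp
using System.Threading;
using System.Threading.Tasks;

// Sketch of the counting scheme only: multiple producers increment a
// shared counter with Interlocked.Increment, and whichever producer
// crosses the batch size signals the event. SemaphoreSlim stands in
// for the synchronization primitive used by the component.
int numberOfPushedItems = 0;
const int batchSize = 100;
var syncEvent = new SemaphoreSlim(0);

void SimulatedPush()
{
    var incremented = Interlocked.Increment(ref numberOfPushedItems);
    if (incremented > batchSize)
    {
        syncEvent.Release(); // wake the consumer loop early
    }
}

// 150 concurrent "pushes" cross the batch size of 100,
// so the event gets signaled.
Parallel.For(0, 150, _ => SimulatedPush());
```

Because the increment is atomic, exactly one producer observes each counter value, and no push is lost even under heavy contention.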

It does so by observing the synchronization primitive. Let's expand the previously shown TimerLoop slightly:

async Task TimerLoop()
{
    var token = tokenSource.Token;
    while (!tokenSource.IsCancellationRequested)
    {
        try
        {
            await Task.WhenAny(Task.Delay(pushInterval, token),
               syncEvent.WaitAsync(token)).ConfigureAwait(false);
            await PushInBatches().ConfigureAwait(false);
        }
        catch (Exception)
        {
            // intentionally ignored
        }
    }
}

So instead of only awaiting a Task.Delay with the push interval, we combine the Task.Delay and syncEvent.WaitAsync with Task.WhenAny. Task.WhenAny will return when any of the tasks passed into the method completes (successfully, faulted, or canceled). So the await will continue either when the push interval has elapsed (or been canceled), or when the synchronization primitive was set (or the token canceled). This code has a slight deficiency: every time we reach the batch size before the push interval has elapsed, the task representing the push interval will “leak” until the interval elapses. But we will ignore this for now and potentially talk about it in another post.
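For completeness, one common way to close that gap, sketched here under my own assumptions rather than taken from the post, is to link a cancellation source to the delay and cancel it as soon as Task.WhenAny returns (SemaphoreSlim again stands in for the synchronization primitive):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Sketch: cancel the Task.Delay once Task.WhenAny returns, so the timer
// behind it is released immediately instead of living on until the full
// push interval has elapsed.
var pushInterval = TimeSpan.FromSeconds(30);
var syncEvent = new SemaphoreSlim(0);
using var tokenSource = new CancellationTokenSource();
var token = tokenSource.Token;

syncEvent.Release(); // simulate a producer reaching the batch size

using var delayCts = CancellationTokenSource.CreateLinkedTokenSource(token);
var delayTask = Task.Delay(pushInterval, delayCts.Token);
await Task.WhenAny(delayTask, syncEvent.WaitAsync(token)).ConfigureAwait(false);
delayCts.Cancel(); // completes the delay task as canceled instead of leaking it
```

The trade-off is the allocation of a linked CancellationTokenSource per loop iteration, which may or may not be acceptable depending on how hot this path is.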

The code shown so far is capable of accepting items into the right slot, and when either the push interval elapses or the batch size is reached, the component pushes items in batches to the pump delegate. In the next post we will cover the actual PushInBatches method.

* I’m using an AsyncAutoResetEvent which I will explain in a later post.
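Until then, here is a minimal sketch of the idea behind such a primitive, based on the widely known TaskCompletionSource pattern. This is my own illustration, not necessarily the implementation the later post will show, and cancellation support is omitted for brevity:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

var evt = new AsyncAutoResetEventSketch();
evt.Set();                    // signal before anyone waits
var first = evt.WaitAsync();  // completes immediately, consuming the signal
var second = evt.WaitAsync(); // pends until the next Set
evt.Set();
await second;

// Minimal async auto-reset event: Set releases exactly one waiter,
// or remembers a single signal for the next WaitAsync call.
class AsyncAutoResetEventSketch
{
    readonly object gate = new object();
    readonly Queue<TaskCompletionSource<bool>> waiters = new Queue<TaskCompletionSource<bool>>();
    bool signaled;

    public Task WaitAsync()
    {
        lock (gate)
        {
            if (signaled)
            {
                signaled = false; // auto-reset: consume the stored signal
                return Task.CompletedTask;
            }
            var tcs = new TaskCompletionSource<bool>(
                TaskCreationOptions.RunContinuationsAsynchronously);
            waiters.Enqueue(tcs);
            return tcs.Task;
        }
    }

    public void Set()
    {
        TaskCompletionSource<bool> toRelease = null;
        lock (gate)
        {
            if (waiters.Count > 0)
            {
                toRelease = waiters.Dequeue();
            }
            else
            {
                signaled = true; // no waiter yet; remember one signal
            }
        }
        toRelease?.SetResult(true); // complete outside the lock
    }
}
```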

About the author

Daniel Marbach
