
Job chaining and batching in C#/.NET with Hangfire (Pro and Free)

  • C#
  • .NET
  • Hangfire

Taylor Otwell, the creator of the Laravel framework, recently tweeted the following example of a new feature coming to Laravel jobs:

Bus::chain([
  new InitializeAccount,
  Bus::batch([
    new ImportAccountData,
    new ConnectExternalServices
  ]),
  new NotifyUserThatAccountIsReady,
])->dispatch();

First it runs the InitializeAccount job; once that's finished, it executes both ImportAccountData and ConnectExternalServices in parallel; and finally, when those two complete, it triggers the NotifyUserThatAccountIsReady job. It's a pretty neat feature that allows for some complex job-processing workflows, and honestly I was a little surprised it wasn't supported in the framework already!

I really enjoy using Hangfire for all my background job processing needs in C#, so it got me thinking: what would be the easiest and cleanest way to implement a similar workflow?

Hangfire supports three features that we would need:

  • Job Continuations

    Executed when another (provided) job completes

  • Batches

    Allow executing jobs in parallel

  • Batch Continuations

    Allow executing based on completion state of a job batch

Continuations are a feature that comes with the standard, free version of Hangfire, while Batches and Batch Continuations are part of the paid Pro upgrade (starting at $500/year).
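To make the building blocks concrete, here's what a plain job continuation looks like in the free version. This is a minimal sketch that assumes a configured Hangfire server and storage; the console-writing jobs are just placeholders:

```csharp
using System;
using Hangfire;

// Enqueue a job and capture its ID so a continuation can be attached to it.
var parentId = BackgroundJob.Enqueue(() => Console.WriteLine("Parent job done"));

// The continuation stays in the Awaiting state until the parent succeeds,
// then Hangfire enqueues it automatically.
BackgroundJob.ContinueJobWith(parentId, () => Console.WriteLine("Child job running"));
```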

Implementation using Hangfire Pro

Let's start with the simpler example, since the Pro version supports all required features. We'll assume that our jobs are service classes that implement the appropriate contracts:

  • IInitializeAccount
  • IImportAccountData
  • IConnectExternalServices
  • INotifyUserThatAccountIsReady

Each of these contains a Run method that executes the required logic. This way each job class can use .NET's dependency injection to pull in any required dependencies.
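As a sketch of what one of these contracts and its implementation might look like — only the names come from the workflow above; the interface shape, the logger dependency, and the transient lifetime are my own assumptions:

```csharp
using Microsoft.Extensions.Logging;

public interface IInitializeAccount
{
    void Run();
}

public class InitializeAccount : IInitializeAccount
{
    private readonly ILogger<InitializeAccount> _logger;

    // Constructor dependencies are resolved by the DI container
    // when Hangfire activates the job.
    public InitializeAccount(ILogger<InitializeAccount> logger)
    {
        _logger = logger;
    }

    public void Run()
    {
        _logger.LogInformation("Initializing account...");
        // ...account initialization logic goes here...
    }
}

// Registration, e.g. in Program.cs:
// builder.Services.AddTransient<IInitializeAccount, InitializeAccount>();
```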

In the examples below _jobClient is an injected instance of IBackgroundJobClient interface that's part of the Hangfire NuGet package.

Here's how we would schedule the same workflow using Hangfire Pro:

// This job will run first
var initializeAccountJobId = _jobClient.Enqueue<IInitializeAccount>(j => j.Run());

var batchId = BatchJob.StartNew(x =>
{
  // These two jobs will run in parallel
  x.Enqueue<IImportAccountData>(j => j.Run());
  x.Enqueue<IConnectExternalServices>(j => j.Run());
}, initializeAccountJobId);

// This job will run last in a sequence
BatchJob.ContinueBatchWith(
  batchId,
  x => x.Enqueue<INotifyUserThatAccountIsReady>(j => j.Run())
);

It's as simple as that. The Hangfire documentation has examples of even more complex workflows.

Implementation using Hangfire Open (Free)

If you don't have Hangfire Pro, the workflow is still possible; however, you'll need to coordinate the final NotifyUserThatAccountIsReady continuation manually after ImportAccountData and ConnectExternalServices have completed.

Schedule the jobs like this:

// Enqueue the first job and get its ID
var initializeAccountJobId = _jobClient.Enqueue<IInitializeAccount>(j => j.Run());

// Continue in parallel with IImportAccountData and IConnectExternalServices
var importAccountDataJobId = _jobClient.ContinueJobWith<IImportAccountData>(
  initializeAccountJobId,
  j => j.Run()
);
var connectExternalServicesJobId = _jobClient.ContinueJobWith<IConnectExternalServices>(
  initializeAccountJobId,
  j => j.Run()
);

// INotifyUserThatAccountIsReady should only run after both
// IImportAccountData and IConnectExternalServices have completed.
//
// To ensure this, we can set continuations for both IImportAccountData
// and IConnectExternalServices that will trigger a check.
//
// The check will enqueue INotifyUserThatAccountIsReady only if both
// IImportAccountData and IConnectExternalServices have completed.
_jobClient.ContinueJobWith(
  importAccountDataJobId,
  () => CheckAndRunNotifyUserThatAccountIsReady(importAccountDataJobId, connectExternalServicesJobId)
);
_jobClient.ContinueJobWith(
  connectExternalServicesJobId,
  () => CheckAndRunNotifyUserThatAccountIsReady(importAccountDataJobId, connectExternalServicesJobId)
);

And the CheckAndRunNotifyUserThatAccountIsReady method could look something like this:

public void CheckAndRunNotifyUserThatAccountIsReady(string job1Id, string job2Id)
{
    var client = new BackgroundJobClient();
    var monitor = JobStorage.Current.GetMonitoringApi();

    // History[0] holds the job's most recent state
    var job1State = monitor.JobDetails(job1Id).History[0].StateName;
    var job2State = monitor.JobDetails(job2Id).History[0].StateName;

    if (job1State == "Succeeded" && job2State == "Succeeded")
    {
        // Both job dependencies have completed successfully, so enqueue the final job
        client.Enqueue<INotifyUserThatAccountIsReady>(j => j.Run());
    }
}

This is quite a rudimentary way to coordinate the jobs without Hangfire Pro, and you could potentially run into race conditions: if the timing is very close, both continuations could observe the two jobs as succeeded and each enqueue the NotifyUserThatAccountIsReady job.
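One way to narrow that window is to guard the check with Hangfire's distributed lock and record a marker in job storage so the final job is enqueued at most once. This is a sketch, not a hardened implementation; the lock resource and hash key names are arbitrary choices of mine:

```csharp
using System;
using System.Collections.Generic;
using Hangfire;
using Hangfire.Storage;

public void CheckAndRunNotifyUserThatAccountIsReady(string job1Id, string job2Id)
{
    var client = new BackgroundJobClient();
    var monitor = JobStorage.Current.GetMonitoringApi();

    var job1State = monitor.JobDetails(job1Id).History[0].StateName;
    var job2State = monitor.JobDetails(job2Id).History[0].StateName;

    if (job1State != "Succeeded" || job2State != "Succeeded")
    {
        return;
    }

    using (var connection = JobStorage.Current.GetConnection())
    // Only one continuation at a time can hold this lock; the other
    // blocks here until it is released (or the timeout expires).
    using (connection.AcquireDistributedLock(
        $"notify-user:{job1Id}:{job2Id}", TimeSpan.FromSeconds(10)))
    {
        var hashKey = $"notify-user-enqueued:{job1Id}:{job2Id}";

        // If a previous continuation already enqueued the final job, stop here.
        if (connection.GetAllEntriesFromHash(hashKey) != null)
        {
            return;
        }

        client.Enqueue<INotifyUserThatAccountIsReady>(j => j.Run());

        // Record that the final job has been enqueued.
        connection.SetRangeInHash(hashKey, new[]
        {
            new KeyValuePair<string, string>("enqueued", "true")
        });
    }
}
```

Note that the marker hash is never expired in this sketch; in a real system you'd want to clean it up once the workflow finishes.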

Finally, remember to ensure that any methods used within Hangfire jobs are idempotent and safe to call multiple times, because Hangfire may retry failed jobs depending on your configuration.


PS. If you liked this article, please share to spread the word.


Looking for a handy server monitoring tool?

Check out StackScout, a project I've been working on for the past few years. It's a user-friendly web app that keeps an eye on your servers across different clouds and data centers. With StackScout, you can set custom alerts, access informative reports, and use some neat analytics tools. Feel free to give it a spin and let me know what you think!

Learn more about StackScout

StackScout server monitoring tool screenshot