v0.56.3 - Benchmarking Service


Benchmarking Hatchet

Today, we're open-sourcing our new benchmarking container, which lets anyone test the performance of their own Hatchet setup. The load-testing container can run in nearly any environment. For example:

docker run -e HATCHET_CLIENT_TOKEN=your-token ghcr.io/hatchet-dev/hatchet/hatchet-loadtest -e "100" -d "60s" --level "warn" --slots "100"
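Assuming -e sets the target events per second and -d the test duration (as the values in the example suggest), a simple sweep over -e can show where a particular database stops scaling. Here's a minimal sketch of such a sweep, reusing the same flags as the command above and assuming HATCHET_CLIENT_TOKEN is already exported in your shell:

# Hypothetical sweep over target event rates; adjust the list to match your hardware.
for rate in 100 500 1000 2000; do
  echo "=== load test at ${rate} events/s ==="
  docker run --rm \
    -e HATCHET_CLIENT_TOKEN="${HATCHET_CLIENT_TOKEN}" \
    ghcr.io/hatchet-dev/hatchet/hatchet-loadtest \
    -e "${rate}" -d "60s" --level "warn" --slots "100"
done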

Example Results

With our latest v1 Hatchet engine, we ran a series of internal benchmarks on an 8 CPU database (Amazon RDS m7g.2xlarge), achieving a stable throughput of up to 2000 events/second. Beyond that, we've also tested higher throughput on larger DB instances (for example, up to 10k events/second on an m7g.8xlarge).

Here's a brief summary of our results:

  • Throughput: Scales smoothly up to 2000 events/s on m7g.2xlarge, leveling out at about 83% CPU utilization on the database (see the monitoring sketch after this list).
  • Latency: For lower throughput (100-500 events/s), average execution time remains below 50ms.
  • Setup: Benchmarks were run against a Kubernetes cluster on AWS with 2 Hatchet engine replicas (c7i.4xlarge). The RDS database instance (m7g.2xlarge) was chosen to avoid disk/CPU contention under typical load.
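To see where your own database levels out, you can watch the RDS CPUUtilization metric while a test is running. The sketch below is one way to do that with the AWS CLI; the instance identifier hatchet-benchmark-db is a placeholder for your own RDS instance, and the GNU date invocation assumes a Linux shell:

# Hypothetical check of average RDS CPU over the last 5 minutes of a load test.
aws cloudwatch get-metric-statistics \
  --namespace AWS/RDS \
  --metric-name CPUUtilization \
  --dimensions Name=DBInstanceIdentifier,Value=hatchet-benchmark-db \
  --start-time "$(date -u -d '-5 minutes' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 60 \
  --statistics Average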

The new engine design allows Hatchet to efficiently handle high volumes of background tasks with stable performance, making it well-suited for AI-driven workloads or large-scale async event processing.

Want to see more details or run your own benchmarks? Read our benchmarking guide for more information.
