Today we're releasing the second major version of the Hono Node.js adapter 🎉 🎉 🎉
The Hono Node.js adapter is now up to 2.3x faster
v2 of the Hono Node.js adapter reaches up to 2.3x the throughput of v1 — that's the peak number, measured on the body-parsing scenario of bun-http-framework-benchmark. The other scenarios (Ping, Query) get a smaller but real boost too.
Install or upgrade with:
npm i @hono/node-server@latest
v2
The Node.js adapter is going through a major version bump to v2. That said, the public API stays the same — the headline of this release is the large performance improvement described above.
What does the Node.js adapter do?
A quick refresher on what the Node.js adapter actually does — it exists so that Hono applications can run on Node.js. Hono is built on the Web Standards APIs, but you cannot serve those directly from Node.js. The adapter bridges the Web Standards APIs and the Node.js APIs, which is what lets a Hono app — and more generally a Web-Standards-style app — run on top of Node.js.
If you write the following code and run node ./index.js, a server starts up on localhost:3000. And it really is plain Node.js underneath.
import { Hono } from 'hono'
import { serve } from '@hono/node-server'
const app = new Hono()
app.get('/', (c) => c.text('Hello World!'))
serve(app)
The early performance story
The very first implementation of the Node.js adapter looked roughly like this in pseudocode:
import type { IncomingMessage, ServerResponse } from 'node:http'

type FetchCallback = (request: Request) => Promise<Response> | Response

export const getRequestListener = (fetchCallback: FetchCallback) => {
  return async (incoming: IncomingMessage, outgoing: ServerResponse) => {
    const method = incoming.method || 'GET'
    const url = `http://${incoming.headers.host}${incoming.url}`
    // Convert the Node.js headers into a plain record
    const headerRecord: Record<string, string> = {}
    for (const [key, value] of Object.entries(incoming.headers)) {
      if (typeof value === 'string') headerRecord[key] = value
    }
    const init = {
      method: method,
      headers: headerRecord,
    }
    // fetchCallback is the Hono application's fetch handler
    const res = await fetchCallback(new Request(url, init))
    // Buffer the entire Response body, then write it back out
    const buffer = await res.arrayBuffer()
    outgoing.writeHead(res.status, Object.fromEntries(res.headers))
    outgoing.end(new Uint8Array(buffer))
  }
}
So the flow was:
- a request comes in as an `IncomingMessage`
- it gets converted into a `Request` object and handed to the app
- the `Response` returned by the app is written back to the outgoing `ServerResponse`
In diagram form:
IncomingMessage => Request => app => Response => ServerResponse
This is, frankly, inefficient. So whenever Hono went head-to-head with other Node.js frameworks we kept losing — all we could do was shrug and say "well, it's slow on Node.js."
Introducing LightweightRequest / LightweightResponse
The huge step forward that fixed this was a legendary PR from @usualoma:
It made things up to 2.7x faster.
I previously wrote about this in detail in this post:
https://zenn.dev/yusukebe/articles/7ac501716ae1f7?locale=en
In short, the trick is wonderfully simple. It just follows the golden rule of performance tuning: don't do work you don't have to do. Lightweight versions of Request and Response are constructed and used first — and that path is fast. Only when something actually needs the contents of the Request, e.g. when you call req.json(), does a real new Request() get instantiated under the hood and used from then on. The result is fast, and behavior stays correct.
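The idea can be sketched in a few lines. Note that this is an illustration of the lazy-construction technique, not the adapter's actual code: `newLightweightRequest` is a hypothetical name, and the real implementation also wires up the body stream and many more properties.

```typescript
import type { IncomingMessage } from 'node:http'

// Simplified sketch of the lazy-Request idea (illustration only).
// Cheap questions (method, url) are answered straight from the raw
// IncomingMessage; the expensive `new Request()` only happens on the
// first access to anything else, e.g. a body method like json().
export const newLightweightRequest = (incoming: IncomingMessage): Request => {
  let real: Request | undefined
  const url = `http://${incoming.headers.host}${incoming.url}`
  const getReal = (): Request => {
    // Slow path: construct the real Request once, on first use.
    // (Simplified: the real adapter also handles string[] header values
    // and attaches the body stream.)
    real ??= new Request(url, {
      method: incoming.method ?? 'GET',
      headers: incoming.headers as Record<string, string>,
    })
    return real
  }
  return new Proxy({} as Request, {
    get(_, prop) {
      if (prop === 'method') return incoming.method ?? 'GET' // fast path
      if (prop === 'url') return url // fast path
      const value = Reflect.get(getReal(), prop)
      return typeof value === 'function' ? value.bind(real) : value
    },
  })
}
```

As long as a handler only touches the fast-path properties, no `Request` is ever constructed, which is exactly why the Hello World case got so much faster.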
…but body parsing was still slow
"Fast" here was for a very simple "Hello World" benchmark — a GET that just returns text.
There are many ways to benchmark, but the one we tend to reach for is this:
https://github.com/SaltyAom/bun-http-framework-benchmark
It tests three scenarios: Ping, Query, and Body. Let's pit Hono against the major Node.js frameworks:
As you can see, the Body case is very slow. The handler being measured is essentially this:
import { Hono } from 'hono'
import { serve } from '@hono/node-server'
const app = new Hono()
app.post('/json', async (c) => {
const data = await c.req.json()
return c.json(data)
})
serve(app)
`c.req.json()` is the slow part. The reason is well understood: inside the Node.js adapter, when `json()` is called the LightweightRequest path can't be used, so a real `new Request()` ends up being constructed.
perf: optimize request body reading
The 2.3x figure above comes from one PR specifically — PR #301 by @mgcrea:
The PR bundles a few changes, but the key one is "optimize request body reading". Quoting from the PR description:
The fix overrides `text()`, `json()`, `arrayBuffer()`, and `blob()` on the request prototype to read directly from the Node.js `IncomingMessage` using event-based I/O.
In other words, in the json() case above, we no longer convert into a Request at all — we read the body straight off the Node.js APIs. A classic fast path. That alone gives a large jump in body-parsing throughput.
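The core of that fast path can be sketched like this. This is a hedged illustration of the event-based-I/O idea rather than #301's actual code, which overrides the methods on the request prototype; the helper names here are hypothetical.

```typescript
import type { IncomingMessage } from 'node:http'

// Illustrative sketch: collect the raw body chunks straight from the
// IncomingMessage with event-based I/O, bypassing Request construction.
const readBody = (incoming: IncomingMessage): Promise<Buffer> =>
  new Promise((resolve, reject) => {
    const chunks: Buffer[] = []
    incoming.on('data', (chunk: Buffer) => chunks.push(chunk))
    incoming.on('end', () => resolve(Buffer.concat(chunks)))
    incoming.on('error', reject)
  })

// json() expressed in terms of the fast path: no Request, no web stream,
// just Node.js buffers and JSON.parse
export const readJson = async (incoming: IncomingMessage): Promise<unknown> =>
  JSON.parse((await readBody(incoming)).toString('utf-8'))
```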
The same PR also includes two other tuning improvements:
- URL construction fast-path — skip building a `URL` object except in edge cases
- `buildOutgoingHttpHeaders` optimization — skip the `set-cookie` header comparison when there are no cookies
v2 ships several other performance PRs as well — newHeadersFromIncoming and signal fast-paths, Response fast-paths and responseViaCache improvements, method-key caching, a regex-based buildUrl rewrite, and more (see the full list below). They all add up, but #301 is by far the largest single contributor, which is why it gets the spotlight here.
v2 performance
Now let's measure the final v2 build.
First, comparing against the v1 Node.js adapter. dev here is v2. Body improves by 2.3x, and the other scenarios get faster too:
Next, the same comparison against other frameworks. With the Body score jumping, Hono passes Koa and Fastify and takes first place:
There's also an article titled "Hono on Node.js: the fastest response showdown" that benchmarks the Hono Node.js adapter against other Node.js adapters that bridge Web-Standard-API apps to Node.js:
https://zenn.dev/chot/articles/hono-node-the-fastest-adapter
In that article's POST benchmark — the one that exercises body parsing — Hono came in last. Re-running it on v2 lands Hono in first place:
- `@hono/node-server` — 77,237.74
- `srvx` — 62,000.29
- `@whatwg-node/server` — 57,919.26
- `@remix-run/node-fetch-server` — 42,486.39
Breaking changes
There are two breaking changes in v2.
Dropped support for Node.js v18
Node.js v18 reached end-of-life, so v2 requires Node.js v20 or later.
Removed the Vercel adapter
The Vercel adapter (@hono/node-server/vercel) has been removed. It is no longer needed for Vercel's modern runtimes, so the recommendation is to deploy without it.
If you still need the previous behavior, the old adapter was a one-liner on top of getRequestListener and you can write the same thing in your own project:
import type { Hono } from 'hono'
import { getRequestListener } from '@hono/node-server'
export const handle = (app: Hono) => {
return getRequestListener(app.fetch)
}
Then use it the same way you used `handle` from `@hono/node-server/vercel` before.
All changes
A full list of what landed in PR #316.
Performance
- perf: optimize request body reading and URL construction (#301) by @mgcrea
- perf: optimize `buildOutgoingHttpHeaders` for the common case (#301) by @mgcrea
- perf(request): cache method key (#319) by @yusukebe
- perf(url): mark host with port `:` as safe host (#320) by @yusukebe
- perf(request): optimize `newHeadersFromIncoming` and signal fast-path (#332) by @GavinMeierSonos
- perf(response, listener): `Response` fast-paths and `responseViaCache` improvements (#333) by @GavinMeierSonos
- perf: replace `Uint8Array` lookup tables with regex in `buildUrl` (#345) by @usualoma
Features
- feat: first-class support for WebSockets (#328) by @BlankParticle
Breaking changes
- feat: end support for Node.js v18 (#317) by @yusukebe
- feat!: obsolete the Vercel adapter (#335) by @yusukebe
Fixes & refactors
- fix: more strictly determine when `new URL()` should be used (#310) by @usualoma
- refactor: improved compatibility with the Web Standard `Request` object (#311) by @usualoma
- fix(request): return an error object instead of throwing (#318) by @usualoma
- refactor: improve handling of null body in response (#341) by @usualoma
- fix: ensure close handler is attached for `Blob`/`ReadableStream` cacheable responses (#342) by @usualoma
- fix: improve `Response.json()` and `Response.redirect()` spec compliance and efficiency (#343) by @usualoma
Build & tooling
- build: migrate from tsup to tsdown (#302) by @tommy-ish
- feat(test): migrate Jest to Vitest (#303) by @koralle
- fix: only build exported entrypoints (#323) by @BlankParticle
- chore: add `type: module` to `package.json` (#336) by @yusukebe
Wrap-up
So that's v2 of the Node.js adapter — significantly faster, with the same API. Just upgrading should give you a real performance boost. No more "Hono is slow on Node.js" excuses. Please use Hono — fast not only on Cloudflare, Bun, and Deno, but now also on Node.js.