The performance gap between Node.js 16 and Node.js 22 is quite significant because of improvements introduced over several major releases. Here are the main reasons why Node 22 generally outperforms Node 16:
This means:

- Better JIT optimizations → faster execution of JavaScript, especially for heavy loops and async operations.
- Improved garbage collection → reduced memory overhead and GC pauses.
- New language features and micro-optimizations in the engine.
Node 22 benefits from better async context handling, which reduces overhead when using async/await or Promises.
Node 18+ and Node 22 include native implementations → lower overhead, better performance in HTTP-heavy applications.
Node 22 includes stability improvements and faster networking stack, especially for streaming and large payloads.
Lower memory fragmentation and improved heap management lead to better performance for long-running apps.
Enhanced diagnostics and profiling tools help optimize performance in production. Native fetch and improved streams alone make a noticeable difference in API servers.

Bottom line: if your workload involves high concurrency, heavy async operations, or modern JavaScript features, then upgrading from Node 16 → Node 22 is a big win for performance and maintainability. Node 16 also reached end-of-life in September 2023, so upgrading gets you security patches and long-term stability.

Recommendation: upgrade to Node 22 (or the latest LTS) unless you have a specific dependency requiring Node 16. The performance, security, and DX improvements are worth it.
Your benchmarking setup is the problem, not Node 22.
Try autocannon instead of ab:

```
npx autocannon -c 1000 -d 30 http://localhost:30255/
npx autocannon -c 1000 -d 30 http://localhost:30256/
```

Two other things going on here:

V8 warmup. Node 22 ships with V8 12.x and the Maglev tiered JIT compiler, which needs warmup time before reaching peak performance. With short benchmarks you're measuring compilation time, not steady-state throughput. Run a warmup phase of 10-15 seconds before you start collecting numbers.

Docker networking is adding noise. You're running both containers with port mapping, which routes every request through an extra userspace hop. There's also the fact that your Alpine base images are significantly different: Alpine 3.12 (Node 16) and Alpine 3.20 (Node 22) ship different musl libc versions and system libraries, which alone can affect I/O performance.

Worth noting: Node 22 does have a stricter HTTP parser, but honestly, hello-world HTTP benchmarks amplify per-request overhead because there's zero application logic. In a real app with database calls, serialization, and business logic, that overhead becomes negligible. V8 12.x's GC and memory-management improvements actually tend to improve performance under sustained real-world load.

Re-run with a warmup phase and without the Docker port mapping, then compare again.
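A sketch of what that re-run could look like, using the port and durations from the commands above (the image name is illustrative; `--network host` is the standard Docker flag that bypasses the userspace port proxy):

```shell
# Run the Node 22 container on the host network so no docker-proxy hop
# sits between the load generator and the server (image name is hypothetical)
docker run --rm --network host my-node22-image &

# Warmup run: give the tiered JIT time to reach steady state, discard results
npx autocannon -c 1000 -d 15 http://localhost:30256/ > /dev/null

# Measured run
npx autocannon -c 1000 -d 30 http://localhost:30256/
```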
I want to upgrade my project from Node v16.19.1 to Node v22.13.1. Before upgrading, I ran a benchmark to measure the performance change (an HTTP server).
But I got a bad result: Node v22's performance declined significantly compared to Node v16 (specifically in HTTP scenarios).
Below are my test environment and results. I hope someone can explain them. Thank you.
Docker running node v22.13.1 on alpine 3.20, host CentOS 7 (8 CPU, 16 GB)
Docker running node v16.19.1 on alpine 3.12, host CentOS 7 (8 CPU, 16 GB)
Docker run script (the contents of the two Docker images are basically the same)
Create app.js:
Run it:
Throughput testing using ab (ApacheBench).
```
# n = 1000 or 2000
ab -n 100000 -c n http://localhost:30256/
ab -n 100000 -c n http://localhost:30255/
```

Test result:
I've tested it many times and the results are basically the same.
Can anyone tell me the reason, or suggest a more standard test method?
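On the "more standard test method" point, one methodological aside (not from the original thread): ab opens a new TCP connection for every request unless `-k` enables HTTP keep-alive, so at `-c 1000` a large share of what is measured is connection setup rather than the Node server itself. A variant worth trying:

```shell
# -k enables HTTP keep-alive; without it, per-request TCP handshakes
# dominate hello-world timings at high concurrency
ab -k -n 100000 -c 1000 http://localhost:30256/
```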
In terms of resource consumption, Node v22 uses less memory, but its CPU peaked at 100% during the test.
Thanks, everyone.