A look at native TypeScript performance

Sebastian Staffa

Published on Feb 24, 2025, 3:32 PM

Header image generated by DALL-E 3, showing cars crossing the finish line at a NASCAR race.

The beginning of this year brought with it the release of Node.js v23.6.0. An unassuming minor release at first glance, it includes a noteworthy change: the removal of the experimental flag from the native TypeScript support, which previously needed to be enabled with the --experimental-strip-types flag. With Node.js 23.6 you can now run some TypeScript code without additional tooling. You still cannot run code that requires any transformation beyond stripping the types; this includes, for example, enums and constructor parameter properties.
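
To make this concrete, a file like the following (hypothetical) one runs out of the box, while anything that needs actual code generation still does not:

// hello.ts -- runs directly with `node hello.ts` on Node.js 23.6, because
// removing the type annotation is all the runtime has to do.
const greeting: string = "Hello, World!";
console.log(greeting);

// An enum, by contrast, has a runtime representation of its own and is rejected
// in the default strip-only mode; it needs --experimental-transform-types or a
// separate compile step:
// enum LogLevel { Debug, Info, Warn }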

Nevertheless, I wanted to use this opportunity to see where the capabilities of Node's new native TypeScript support are heading, especially when compared against the existing tools as well as the "new" challengers that have risen to fame in the last few years.

The contenders are:

  • Node.js v23.6.0
  • ts-node v10.9.2, both with and without SWC
  • tsx v4.19.2
  • Deno v13.0.245.12-rusty
  • Bun v1.1.43
  • Node.js v23.6.0 again, but this time we'll compile the TypeScript code to JavaScript first

It is worth pointing out that none of the tools I am testing actually perform any type checks on the code that is run. At least when using the default settings, all the runtimes require you to run the TypeScript compiler separately to check your code for type errors.

There are, however, two notable exceptions: both Deno and ts-node allow you to enable type checking when running the code. For Deno you can use the --check flag, while ts-node has a --typeCheck flag. The latter is supposed to be activated by default, but I was not able to get it to work. For all comparisons in this article, type checking is turned off, unless otherwise noted, to allow for a level playing field.

Hello World

The first round of tests uses a simple Hello World script to get a feeling for the startup time of the different tools:

function main() {
  console.log("Hello, World!")
}
main()
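
Each runtime is benchmarked by launching it repeatedly and timing the full process. The following is an illustrative sketch of how such a harness could look; the command strings, run count, and statistics here are simplified stand-ins, not my exact setup:

// bench.ts -- a rough sketch of collecting per-invocation timings.
// The commands and the run count are illustrative assumptions.
import { execSync } from "node:child_process";

function benchmark(command: string, runs = 100) {
  const times: number[] = [];
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    execSync(command, { stdio: "ignore" }); // launch the runtime, discard its output
    times.push(performance.now() - start);
  }
  times.sort((a, b) => a - b);
  const mean = times.reduce((sum, t) => sum + t, 0) / runs;
  const median = times[Math.floor(runs / 2)];
  const std = Math.sqrt(times.reduce((sum, t) => sum + (t - mean) ** 2, 0) / runs);
  return { mean, median, std };
}

for (const command of ["node hello.ts", "deno run hello.ts", "bun hello.ts"]) {
  console.log(command, benchmark(command));
}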

All measurements are averaged over 100 runs and were taken on my Framework 16 with an AMD Ryzen 9 7940HS @ 5.2 GHz:

A comparison of native TypeScript runtimes when executing a hello world application, averaged over 100 executions.
                     Mean        Median      Std
Node.js 23.6         48.39 ms    48.00 ms    1.326 ms
Node.js 23.6 (js)    24.21 ms    24.00 ms    1.032 ms
ts-node              463.28 ms   457.50 ms   16.168 ms
ts-node + swc        470.50 ms   470.00 ms   3.915 ms
tsx                  319.80 ms   319.50 ms   3.240 ms
Deno                 22.21 ms    22.00 ms    1.003 ms
Bun                  10.56 ms    10.00 ms    0.863 ms

When looking at these results, the performance difference between the two Node.js execution modes is quite striking: the process of stripping the types from the TypeScript files almost doubles the time it takes to print out Hello World. The two tools that are themselves written in JavaScript, ts-node and tsx, are slower by roughly an order of magnitude. Deno (written in Rust) and Bun (written in Zig) are leading the pack, with Deno being twice as fast as Node's TypeScript implementation, while Bun cuts this time in half again, making it the clear winner in this test.

Extensive Scripts

The previous Hello World example is of course not representative of a real-world application. To get a better idea of the real-world performance of the different tools, I wanted to craft a more extensive example. My initial idea was to simply check out one of the bigger open-source TypeScript repositories like Zod or Drizzle, but I quickly realized that running one of these projects natively was not that easy. The problem lies, once again, with JavaScript's module system.

When running TypeScript natively, Node.js only supports CommonJS and ES Modules, and it requires all imports to spell out the .ts file extension. That is, of course, not a format that any of the big projects use (yet). I could have gone ahead and written a script to fix up all the imports, but I went a different route instead. Since I wanted to test the raw interpreter performance of the runtimes anyway, I thought it would be a good idea to create one huge TypeScript "bundle" to make sure that all the tools had to interpret all of a project's code and could not skip over unused imports.
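
To illustrate the extension requirement (the file names here are made up):

// Accepted by Node's native TypeScript support: the extension is spelled out.
import { clamp } from "./utils.ts";

// What most published projects contain today, and what Node rejects:
// import { clamp } from "./utils";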

The bundling idea led to the next problem: I couldn't really find a bundler that would let me output TypeScript; every tool assumed that I wanted to transform the code to JavaScript while bundling. Annoyed, I wrote a small "bundler" myself, which takes all of a project's source files and concatenates them into one big file.
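
My actual script isn't reproduced here, but the approach roughly looks like the following sketch; the flat directory layout, the .ts-only filter, and the regex-based import stripping are all simplifying assumptions:

// concat-bundle.ts -- a rough sketch of a concatenating "bundler".
// Assumes flat source directories, no name collisions between files, and imports
// that can simply be dropped because everything ends up in a single file.
import { readdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const sourceDirs = process.argv.slice(2);
const chunks: string[] = [];

for (const dir of sourceDirs) {
  for (const file of readdirSync(dir).filter((name) => name.endsWith(".ts"))) {
    const code = readFileSync(join(dir, file), "utf8")
      .replace(/^import[\s\S]*?;\s*$/gm, "") // drop import statements
      .replace(/^export\s+/gm, ""); // drop export keywords
    chunks.push(`// ---- ${file} ----\n${code}`);
  }
}

writeFileSync("bundle.ts", chunks.join("\n"));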

The capabilities of my frankly quite basic bundler, along with the goal of testing the interpreters' performance, resulted in two requirements for the project that I wanted to bundle:

Firstly, it should have a simple project structure, so as not to break my trivial bundler. Secondly, it should not call any non-TypeScript code. This is why I chose to bundle the Mathigon suite of tools. Mathigon is a math toolset that is used in a pedagogical context and is written purely in TypeScript.

From this toolset I chose the core, fermat, euclid and hilbert libraries to create my bundle. The result is a file with about 6k lines of pure TypeScript code (including whitespace).

But even though I made sure to use only plain and simple TypeScript code for this test, Node.js was not able to run it without additional flags. The Mathigon libraries make heavy use of constructor parameter properties, which are not supported in Node.js' current default strip-only mode:

  x TypeScript parameter property is not supported in strip-only mode
      ,-[5518:1]
 5515 |
 5516 | export class ExprTerm extends ExprElement {
 5517 |
 5518 |   constructor(readonly items: ExprElement[]) {
      :                        ^^^^^^^^^^^^^^^^^^^^
 5519 |     super();
 5520 |   }
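
To make the error concrete: a parameter property is shorthand that the compiler has to expand into a field declaration plus an assignment in the constructor, which is exactly the kind of code generation that strip-only mode refuses to do. A minimal example (not taken from the Mathigon sources):

// A parameter property: needs --experimental-transform-types or a compile step.
class Point {
  constructor(readonly x: number, readonly y: number) {}
}

// Roughly what it expands to, and what plain type stripping can handle:
class PointExpanded {
  readonly x: number;
  readonly y: number;
  constructor(x: number, y: number) {
    this.x = x;
    this.y = y;
  }
}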

Because of this I had to turn on --experimental-transform-types to get the code to run natively under Node. The other tools executed the bundle without problems, leading to the following results:

A comparison of native TypeScript runtimes when executing a 6k LOC file, averaged over 100 executions.
                     Mean        Median      Std
Node.js 23.6         168.97 ms   169.00 ms   3.315 ms
Node.js 23.6 (js)    28.89 ms    29.00 ms    1.303 ms
ts-node              609.32 ms   608.50 ms   16.128 ms
ts-node + swc        632.88 ms   633.00 ms   7.220 ms
tsx                  322.75 ms   322.00 ms   3.650 ms
Deno                 26.66 ms    26.00 ms    6.142 ms
Bun                  16.40 ms    16.00 ms    1.760 ms

These results, once again, make it easy to crown a winner of this benchmark: Bun leaves the competition in the dust with a mean execution time of 16.4 ms. The Node.js implementation takes about 10 times as long to execute the same code, while the next fastest tool, Deno, is still about 60% slower than Bun.

Conclusion

To be honest, I was more than a bit underwhelmed by the performance of Node.js. And even if we ignore the numbers for a minute, you still need to enable experimental features to run what I would consider a very basic TypeScript project, and you still do not get any type checking.

The missing type checking on almost all runtimes is a bit of a letdown as well. Almost all the tools recommend running your code through tsc before shipping it to production. With their default settings, the following code always executed without error:

function main(param: string = "World") {
  console.log(`Hello, ${param}!`)
}
main(5) // tsc would reject this: Argument of type 'number' is not assignable to parameter of type 'string'.

Looking at this from a purely performance-focused perspective, this makes sense. When viewing the tools as a TypeScript interpreter or REPL, as I initially understood them, it does not. I don't think that this is a problem per se, but I would like to see the tools be more upfront about this behavior.

The biggest takeaway from these tests, however, is the dominance of Bun when it comes to the execution performance. I'll admit that I might have slept on this tool for too long, but after its performance in this test I will definitely try to incorporate it into new projects of mine.
