Bun 1.0 just launched and I'm genuinely impressed and intrigued. How long can this madness keep going? I've never built anything substantial with Bun. Just various scripts to get a feel for it.
At work, I recently launched a micro-service that uses Node + Fastify + TypeScript. I'm not going to rewrite it in Bun, but I'm going to get a feel for the difference.
Basic version in Bun
No need for a `package.json` at this point. And that's neat. Create a `src/index.ts` and put this in:
```ts
const PORT = parseInt(process.env.PORT || "3000");

Bun.serve({
  port: PORT,
  fetch(req) {
    const url = new URL(req.url);
    if (url.pathname === "/") return new Response(`Home page!`);
    if (url.pathname === "/json") return Response.json({ hello: "world" });
    return new Response(`404!`);
  },
});
console.log(`Listening on port ${PORT}`);
```
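A small aside: `new Response("404!")` replies with a 200 status code; only the body says "404!". That's fine for this comparison, but if you want the status code itself to be a 404, the standard `Response` constructor takes an options object:

```ts
// Inside the fetch() handler: same fallback, but with an actual
// 404 status code instead of a 200 whose body happens to say "404!".
return new Response("404!", { status: 404 });
```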
What's so cool about the convenience-oriented developer experience of Bun is that it comes with a native way of restarting the server as you're editing the server code:
❯ bun --hot src/index.ts
Listening on port 3000
Let's test it:
❯ xh http://localhost:3000/
HTTP/1.1 200 OK
Content-Length: 10
Content-Type: text/plain;charset=utf-8
Date: Sat, 09 Sep 2023 02:34:29 GMT
Home page!
❯ xh http://localhost:3000/json
HTTP/1.1 200 OK
Content-Length: 17
Content-Type: application/json;charset=utf-8
Date: Sat, 09 Sep 2023 02:34:35 GMT
{
"hello": "world"
}
Basic version with Node + Fastify + TypeScript
First of all, you'll need to create a `package.json` to install the dependencies, all of which, at this gentle point, are built into Bun:
❯ npm i -D ts-node typescript @types/node nodemon
❯ npm i fastify
And edit the `package.json` with some scripts:
"scripts": {
"dev": "nodemon src/index.ts",
"start": "ts-node src/index.ts"
},
And of course, the code itself (`src/index.ts`):
```ts
import fastify from "fastify";

const PORT = parseInt(process.env.PORT || "3000");
const server = fastify();

server.get("/", async () => {
  return "Home page!";
});
server.get("/json", (request, reply) => {
  reply.send({ hello: "world" });
});

server.listen({ port: PORT }, (err, address) => {
  if (err) {
    console.error(err);
    process.exit(1);
  }
  console.log(`Server listening at ${address}`);
});
```
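Two handler styles are mixed in there, mostly because both work: an async handler that returns a value, and a callback-style handler that calls `reply.send()`. As far as I know, Fastify treats them interchangeably and picks the Content-Type from what you give it (a returned string becomes text/plain, an object becomes JSON), so the /json route could just as well be written like this:

```ts
// Equivalent /json route in the async-return style;
// Fastify serializes the returned object to JSON automatically.
server.get("/json", async () => {
  return { hello: "world" };
});
```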
Now run it:
❯ npm run dev
> fastify-hello-world@1.0.0 dev
> nodemon src/index.ts
[nodemon] 3.0.1
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: ts,json
[nodemon] starting `ts-node src/index.ts`
Server listening at http://[::1]:3000
Let's test it:
❯ xh http://localhost:3000/
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 10
Content-Type: text/plain; charset=utf-8
Date: Sat, 09 Sep 2023 02:42:46 GMT
Keep-Alive: timeout=72
Home page!
❯ xh http://localhost:3000/json
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 17
Content-Type: application/json; charset=utf-8
Date: Sat, 09 Sep 2023 02:43:08 GMT
Keep-Alive: timeout=72
{
"hello": "world"
}
For the record, I quite like this little setup. `nodemon` can automatically understand TypeScript. It's a neat minimum if sticking with Node is what you want.
Quick benchmark
Bun
Note that this server has no logging or any I/O.
❯ bun src/index.ts
Listening on port 3000
Using `hey` to test 10,000 requests across 100 concurrent clients:
❯ hey -n 10000 -c 100 http://localhost:3000/
Summary:
Total: 0.2746 secs
Slowest: 0.0167 secs
Fastest: 0.0002 secs
Average: 0.0026 secs
Requests/sec: 36418.8132
Total data: 100000 bytes
Size/request: 10 bytes
Node + Fastify
❯ npm run start
Using `hey` again:
❯ hey -n 10000 -c 100 http://localhost:3000/
Summary:
Total: 0.6606 secs
Slowest: 0.0483 secs
Fastest: 0.0001 secs
Average: 0.0065 secs
Requests/sec: 15138.5719
Total data: 100000 bytes
Size/request: 10 bytes
Roughly a 2.4x advantage to Bun (about 36,400 vs. 15,100 requests per second).
Serving an HTML file with Bun
```diff
 Bun.serve({
   port: PORT,
   fetch(req) {
     const url = new URL(req.url);
     if (url.pathname === "/") return new Response(`Home page!`);
     if (url.pathname === "/json") return Response.json({ hello: "world" });
+    if (url.pathname === "/index.html")
+      return new Response(Bun.file("src/index.html"));
     return new Response(`404!`);
   },
 });
```
Serves the `src/index.html` file just right:
❯ xh --headers http://localhost:3000/index.html
HTTP/1.1 200 OK
Content-Length: 889
Content-Type: text/html;charset=utf-8
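`Bun.file()` returns a lazy file reference and, as the headers above show, the Content-Type gets figured out from the file extension. If you wanted to serve anything under src/ instead of hard-coding one path, a rough sketch (my own variation, with no protection against "../" style paths) could look like this:

```ts
const PORT = parseInt(process.env.PORT || "3000");

Bun.serve({
  port: PORT,
  async fetch(req) {
    const url = new URL(req.url);
    if (url.pathname === "/") return new Response(`Home page!`);
    if (url.pathname === "/json") return Response.json({ hello: "world" });

    // Map the URL path onto the src/ directory, e.g. /index.html -> src/index.html.
    // Note: a real server would need to guard against path traversal.
    const file = Bun.file(`src${url.pathname}`);
    if (await file.exists()) return new Response(file);

    return new Response(`404!`);
  },
});
console.log(`Listening on port ${PORT}`);
```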
Serving an HTML file with Node + Fastify
First, install the plugin:
❯ npm i @fastify/static
And make this change:
```diff
+import path from "node:path";
+
 import fastify from "fastify";
+import fastifyStatic from "@fastify/static";
 const PORT = parseInt(process.env.PORT || "3000");
 const server = fastify();
+server.register(fastifyStatic, {
+  root: path.resolve("src"),
+});
+
 server.get("/", async () => {
   return "Home page!";
 });
 server.get("/json", (request, reply) => {
   reply.send({ hello: "world" });
 });
+server.get("/index.html", (request, reply) => {
+  reply.sendFile("index.html");
+});
+
 server.listen({ port: PORT }, (err, address) => {
   if (err) {
     console.error(err);
```
And it works great:
❯ xh --headers http://localhost:3000/index.html
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: public, max-age=0
Connection: keep-alive
Content-Length: 889
Content-Type: text/html; charset=UTF-8
Date: Sat, 09 Sep 2023 03:04:15 GMT
Etag: W/"379-18a77e4e346"
Keep-Alive: timeout=72
Last-Modified: Sat, 09 Sep 2023 03:03:23 GMT
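As far as I can tell from the @fastify/static docs, registering the plugin with a `root` also serves everything under that directory on its own wildcard route, so the explicit /index.html handler above is mostly there to mirror the Bun version. If you only want the `reply.sendFile()` decorator and not the automatic serving, the plugin appears to have a `serve` option for that:

```ts
// Variation: keep reply.sendFile() but don't let the plugin
// register its own wildcard route for files under src/.
server.register(fastifyStatic, {
  root: path.resolve("src"),
  serve: false,
});
```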
Quick benchmark of serving the HTML file
Bun
❯ hey -n 10000 -c 100 http://localhost:3000/index.html
Summary:
Total: 0.6408 secs
Slowest: 0.0160 secs
Fastest: 0.0001 secs
Average: 0.0063 secs
Requests/sec: 15605.9735
Total data: 8890000 bytes
Size/request: 889 bytes
Node + Fastify
❯ hey -n 10000 -c 100 http://localhost:3000/index.html
Summary:
Total: 1.5473 secs
Slowest: 0.0272 secs
Fastest: 0.0078 secs
Average: 0.0154 secs
Requests/sec: 6462.9597
Total data: 8890000 bytes
Size/request: 889 bytes
Again, roughly a 2.4x performance win for Bun (about 15,600 vs. 6,500 requests per second).
Conclusion
There isn't much to conclude here. Just an intro to the beauty of how quick Bun is, both in terms of developer experience and raw performance.
What I admire about Bun being such a convenient bundle is that Python-esque feeling of simplicity and minimalism. (For example, `python3.11 -m http.server -d src 3000` will make http://localhost:3000/index.html work.)
The basic boilerplate of Node with Fastify + TypeScript + `nodemon` + `ts-node` is a great one if you're not ready to make the leap to Bun. I would certainly use it again. Fastify might not be the fastest server in the Node ecosystem, but it's good enough.
What's not shown in this little intro blog post, and is perhaps a silly thing to focus on, is the speed with which you type `bun --hot src/index.ts` and the server is ready to go. As far as human perception goes, it's instant. The `npm run dev`, on the other hand, has this ~3 second "lag". Not everyone cares about that, but I do. It's more of an ethos. It's that wonderful feeling that you don't pause your thinking.
It's hard to see when I press the Enter key, but compare that to Bun.
UPDATE (Sep 11, 2023)
I found this: github.com/SaltyAom/bun-http-framework-benchmark
It's a much better benchmark than mine here. Mind you, as long as you're not using something horribly slow, and your server is doing some actual I/O, the HTTP framework's performance doesn't matter much.
Comments
You can easily speed up our TypeScript environment with Fastify + TypeScript + Node.js using SWC. SWC is a TypeScript compiler (plus more...) written in Rust. And it's very, very fast. In my case it takes something like 200ms to start.
Of course, you still need tsc to do a TypeScript build with all the bells & whistles that check the types. But for fast development, SWC can be very useful. I think the biggest downside is that with Bun you can't get rid of Bun. While using Fastify, that can be run on various runtimes. I hope you get my point.
"our TypeScript environment"?
I've never actually understood how SWC works. There was talk about it being enabled by default in Next.js but Next.js is slow.
"using Fastify, that can be run on various runtimes"
Yeah, that's awesome!
Actually it's quite easy. You first need to add SWC (core + CLI) to your project: `npm i -D @swc/cli @swc/core`
Add a new `.swcrc` to your root folder, with this content:
```json
{
  "jsc": {
    "parser": {
      "syntax": "typescript",
      "tsx": false,
      "decorators": false,
      "dynamicImport": false
    },
    "target": "esnext"
  },
  "module": {
    "type": "es6"
  },
  "sourceMaps": true
}
```
Changing your package.json to use swc could be as simple as:
```json
"main": "dist/index.js",
"scripts": {
  "start": "swc ./src -d dist --strip-leading-paths && node .",
  ....
  ....
}
```
In the example above I use the `swc` command to compile the src directory, outputting into the dist directory. I also don't want the leading path (e.g. /src), but you can remove that flag if you wish.
Then I just run node, which will execute the "main" from package.json (i.e. the dist/index.js file created by SWC...).
In your `tsconfig.json` file you can also configure ts-node, if you wish to use SWC with ts-node and nodemon or something like that:
```
"ts-node": {
  "files": true,
  "swc": true,
  "esm": true,
},
```
More info: https://swc.rs/docs/getting-started. And yes you're right NextJS also has support for it: https://docs.nestjs.com/recipes/swc
> And yes you're right NextJS also has support for it: https://docs.nestjs.com/recipes/
That's Nest, not Next :)
Not confusing at all.