PAS7 Studio

How to set up a Bun.js server with Fastify: from zero to production

A hands-on guide to building a production-ready Bun.js server with Fastify. Install Bun, create an HTTP server, add TypeScript routes, handle validation, connect a database, and deploy — with benchmarks showing Bun + Fastify vs Node.js + Fastify performance differences.

12 Apr 2026 · 17 min read · Technology
Best for: backend developers, full-stack engineers, tech leads evaluating runtimes, DevOps engineers

This is a hands-on, from-zero-to-working-server guide. No prior Bun experience required.

How Bun differs from Node.js at the runtime level (engine, tooling, startup speed)
Step-by-step Bun installation and project scaffolding with TypeScript
Two server approaches: Bun.serve() (native) and fastify.listen() (framework)
Route definition, JSON schema validation, and error handling with Fastify on Bun
Database integration with PostgreSQL and Bun's built-in SQLite (bun:sqlite)
Production deployment with Docker and performance benchmarks vs Node.js

If you have been building servers with Node.js, most of your knowledge transfers directly. The differences that matter most are engine architecture, built-in tooling, and TypeScript handling.

Comparison point | Bun 1.3 | Node.js 22/23
JavaScript engine | JavaScriptCore (Safari/WebKit) | V8 (Chrome/Chromium)
Written in | Zig + C++ | C++ + JavaScript
TypeScript | Native (zero config, no transpile step) [1] | Experimental via --experimental-strip-types (Node 23) [2]
Package manager | Built-in (bun install), 10-30x faster than npm [1] | npm, yarn, pnpm (external)
Bundler | Built-in (bun build), Webpack-class features [1] | None built-in (requires Webpack, Vite, esbuild)
Test runner | Built-in (bun test), Jest-compatible, ~20x faster [1] | Built-in node:test (basic, stable since v20) [2]
Cold start | 8-15 ms | 40-120 ms
HTTP throughput (Fastify) | ~95,200 req/s [3] | ~55,800 req/s [3]
Memory baseline | ~380 MB | ~512 MB
npm compatibility | ~98% of top 1000 packages | 100% (native)

Bottom line

For new server projects, Bun offers a faster runtime, simpler toolchain, and native TypeScript. The remaining 2% of npm incompatibility is concentrated in native C++ addons like bcrypt or node-sass — most web frameworks and ORMs work without modification.

Bun installs as a single binary. No Node.js, npm, or separate build tools required.

01

Install via curl

BASH
curl -fsSL https://bun.sh/install | bash

This downloads the Bun binary, adds it to your PATH, and verifies the installation. After installation, run bun --version to confirm.

02

Install via PowerShell

POWERSHELL
powershell -c "irm bun.sh/install.ps1|iex"

Windows support reached stable status in Bun 1.3. All core features — runtime, package manager, test runner, and bundler — work on Windows. [1]

03

Confirm installation

BASH
bun --version

You should see 1.3.x or later. Bun is now ready. No separate Node.js installation is needed for Bun projects, though having Node.js installed alongside Bun is common and works without conflicts.

Create the project directory, install Fastify, and write the first server file — all in TypeScript, all without a build step.

01

Create the project and install dependencies

BASH
mkdir my-bun-server && cd my-bun-server
bun init
bun add fastify

bun init creates a minimal package.json with "type": "module" set. bun add fastify installs Fastify from npm — Bun's package manager downloads and resolves dependencies 10-30x faster than npm. [1]

02

Create the server entry point

TYPESCRIPT
// server.ts
import Fastify from "fastify";

const fastify = Fastify({ logger: true });

fastify.get("/", async (request, reply) => {
  return { hello: "world", runtime: "bun" };
});

fastify.get("/health", async (request, reply) => {
  return { status: "ok", uptime: process.uptime() };
});

const start = async () => {
  try {
    await fastify.listen({ port: 3000, host: "0.0.0.0" });
    console.log("Server running at http://localhost:3000");
  } catch (err) {
    fastify.log.error(err);
    process.exit(1);
  }
};

start();

This is standard Fastify code. The only difference from Node.js: you run it with bun instead of node, and the .ts extension works directly.

03

Run the server

BASH
bun run server.ts

That is it. No tsc, no ts-node, no nodemon, no build step. Bun reads the TypeScript file, strips types at parse time, and executes it. For development with hot reload, use bun --hot server.ts — Bun preserves application state across reloads, which is faster than Node.js's --watch flag that restarts the entire process. [1]

What just happened

You created a working Fastify server in TypeScript with two commands (bun init and bun add fastify) and one file (server.ts). No configuration files, no build tools, no transpilation step. This is the core DX difference between Bun and Node.js.

Bun provides two ways to create HTTP servers: the native Bun.serve() API and third-party frameworks like Fastify. Both run on Bun's runtime, but they serve different needs.

Bun.serve() is Bun's built-in HTTP server. It is the fastest option — benchmarks show approximately 180,000 requests/sec for plaintext responses compared to Node.js's 65,000 requests/sec. It is ideal for simple APIs, microservices, and performance-critical endpoints where you want minimal overhead. [1][3]
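To make the trade-off concrete, here is a minimal Bun.serve() sketch. The file name and routes are illustrative; note how routing, content negotiation, and error handling are all hand-rolled, which is exactly what Fastify takes off your plate.

```typescript
// server-native.ts: minimal Bun.serve() sketch; run with `bun run server-native.ts`.
// Bun.serve() hands you a fetch handler and nothing else, so routing,
// validation, and error handling are yours to write.

// Keep routing in a pure function so it can be exercised without a live server.
export function handle(req: Request): Response {
  const { pathname } = new URL(req.url);
  if (pathname === "/") {
    return Response.json({ hello: "world", runtime: "bun" });
  }
  if (pathname === "/health") {
    return Response.json({ status: "ok" });
  }
  return new Response("Not Found", { status: 404 });
}

declare const Bun: any; // provided by the Bun runtime (typed via bun-types if installed)

// The Bun global only exists under the Bun runtime; the guard lets this
// module load elsewhere (e.g. in tests) without starting a listener.
if (typeof Bun !== "undefined") {
  Bun.serve({ port: 3000, fetch: handle });
  console.log("Listening on http://localhost:3000");
}
```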

Fastify adds a structured layer on top: JSON schema validation, serialization optimization, plugin architecture, hooks, and decorators. Running Fastify on Bun gives you ~95,200 req/s compared to ~55,800 req/s on Node.js — roughly 70% more throughput. You lose some raw speed compared to Bun.serve(), but gain a production-grade framework with validation, error handling, and extensibility. [3]

For most real-world applications, Fastify on Bun is the pragmatic choice. The framework overhead is a fraction of the total request time when you add database queries, authentication, and business logic. The schema validation alone — which Fastify handles at the framework level with 2-3x serialization speedup — often justifies the choice. [4][5]

Use Bun.serve() when

You need maximum raw throughput for simple endpoints. Building a lightweight proxy, health check service, webhook receiver, or internal microservice. You want the smallest possible cold start for serverless deployments. You are comfortable handling routing, validation, and error handling manually.

Use Fastify on Bun when

You need JSON schema validation and serialization. You want a plugin ecosystem (CORS, auth, Swagger, rate limiting). You are building a REST or GraphQL API with multiple routes. You want structured error handling and request lifecycle hooks. You need production-grade developer tooling.

Avoid Bun.serve() when

Your API has complex validation requirements. You need OpenAPI/Swagger documentation generation. You rely on framework-specific plugins. Your team expects Express/Fastify patterns for maintainability.

Bun.serve() delivers the highest raw throughput, but Fastify on Bun provides a strong balance of framework features and performance. The real-world gap between Fastify on Bun and Fastify on Node.js is approximately 70%.


Fastify's strength is structured request validation using JSON Schema. Here is how to build validated routes on Bun.

01

Create a validated route module

TYPESCRIPT
// routes/users.ts
import { FastifyPluginAsync } from "fastify";

export const userRoutes: FastifyPluginAsync = async (fastify) => {
  const createUserBody = {
    type: "object",
    required: ["email", "name"],
    properties: {
      email: { type: "string", format: "email" },
      name: { type: "string", minLength: 1, maxLength: 100 },
      role: { type: "string", enum: ["admin", "user", "viewer"] },
    },
  } as const;

  fastify.post("/users", {
    schema: { body: createUserBody },
  }, async (request, reply) => {
    // Without route generics, request.body is typed as unknown in strict
    // TypeScript, so cast it to the shape the schema guarantees.
    const { email, name, role } = request.body as {
      email: string;
      name: string;
      role?: "admin" | "user" | "viewer";
    };
    return {
      id: crypto.randomUUID(),
      email,
      name,
      role: role ?? "user",
      createdAt: new Date().toISOString(),
    };
  });

  fastify.get("/users/:id", {
    schema: {
      params: {
        type: "object",
        properties: { id: { type: "string" } },
        required: ["id"],
      },
    },
  }, async (request, reply) => {
    const { id } = request.params as { id: string };
    return { id, message: "User lookup placeholder" };
  });
};

02

Register routes in the main server file

TYPESCRIPT
// server.ts
import Fastify from "fastify";
import { userRoutes } from "./routes/users";

const fastify = Fastify({ logger: true });

fastify.register(userRoutes);

fastify.setErrorHandler((error, request, reply) => {
  if (error.validation) {
    reply.status(400).send({
      error: "Validation failed",
      details: error.validation,
    });
    return;
  }
  fastify.log.error(error);
  reply.status(500).send({ error: "Internal server error" });
});

const start = async () => {
  try {
    await fastify.listen({ port: 3000, host: "0.0.0.0" });
  } catch (err) {
    fastify.log.error(err);
    process.exit(1);
  }
};

start();

Run with bun run server.ts. Invalid request bodies now return structured 400 errors automatically.

Key point

Fastify's schema validation works identically on Bun and Node.js. The only difference is that the validated handler executes faster on Bun's runtime. If your existing Fastify routes use JSON Schema, they will work on Bun without modification.

Bun supports two database paths: its built-in bun:sqlite module for lightweight local storage, and standard npm database drivers for PostgreSQL, MySQL, MongoDB, and other databases. For PostgreSQL, the pg driver works on Bun without modification. [1]

For an ORM layer, Prisma, Drizzle ORM, and TypeORM all work on Bun. Prisma in particular is well-tested — the Prisma team has confirmed Bun compatibility and many production deployments use Prisma + Bun + PostgreSQL. [1][2]

Bun's built-in SQLite module (bun:sqlite) is useful for prototyping, caching, and single-instance deployments where an external database is overkill. For production APIs, PostgreSQL via pg or an ORM like Prisma is the standard choice.


Practical advice

Use bun:sqlite for prototyping and single-instance caching. Use PostgreSQL + Prisma or Drizzle for production APIs. Both work on Bun without configuration changes beyond the database connection string.

Bun's built-in test runner uses Jest-compatible APIs. Most existing test files work without modification.

01

Write a route test

Create a test file using Bun's built-in test runner. Import describe, it, and expect from bun:test; the API is Jest-compatible, so existing Fastify (or NestJS) tests migrate with minimal changes. Use app.inject() to exercise routes without binding a real HTTP port. Run everything with bun test, which automatically discovers files matching *.test.ts and *.spec.ts. A suite of 500 tests completes in approximately 1.2 seconds, versus about 24 seconds under Jest. [1]

Migration note

If your project already uses Jest, most test files can switch to bun test without changes. The API surface is nearly identical. The main difference is speed: a suite that took 30 seconds under Jest typically finishes in roughly 1-2 seconds under Bun.

Bun provides official Docker images that are smaller and start faster than Node.js equivalents.

01

Create a production Dockerfile

DOCKERFILE
FROM oven/bun:1.3 AS base
WORKDIR /app

# bun.lock is the text lockfile Bun generates by default since v1.2;
# older projects may still have the binary bun.lockb instead.
COPY package.json bun.lock ./
RUN bun install --frozen-lockfile --production

# Copy the application code built earlier in this guide.
COPY server.ts ./
COPY routes ./routes

EXPOSE 3000
CMD ["bun", "run", "server.ts"]

02

Build the image and start the container

BASH
docker build -t my-bun-server .
docker run -p 3000:3000 my-bun-server

The container starts in under a second. Bun's cold start advantage (8-15ms vs 40-120ms for Node.js) is especially valuable in container orchestration environments where frequent restarts and scaling events are normal.

These benchmarks come from multiple independent sources testing Fastify on Bun vs Fastify on Node.js with identical hardware and workloads.

The performance gap between Bun and Node.js is most pronounced in raw HTTP throughput and cold starts. When framework overhead enters the picture (Fastify middleware, JSON serialization, validation), the gap narrows to 40-70%. In real-world APIs where database queries and network I/O dominate, the runtime accounts for less than 10% of total request time. The biggest practical gains are in startup speed, CI/CD pipeline time, and WebSocket concurrency. [1][2][3]

95,200 vs 55,800

Bun delivers ~70% more Fastify throughput than Node.js on identical hardware. [3]

156ms vs 245ms

Bun starts ~36% faster, reducing serverless function costs directly. [2]

1.2M vs 680K

Bun handles 76% more concurrent WebSocket connections on the same hardware. [1]

380MB vs 512MB

Bun containers use ~25% less memory, enabling more pods per cluster. [1]

2s vs 30s

Bun's package manager installs dependencies 10-15x faster than npm. [1]

1.2s vs 24s

Bun's test runner is ~20x faster than Jest with compatible API. [1]

When the numbers matter most

Bun's performance advantage is largest for serverless functions (cold start), CI/CD pipelines (install + test speed), and WebSocket-heavy applications. For standard REST APIs where the database is the bottleneck, the difference is smaller but still meaningful — especially at scale.

These are the problems that teams actually encounter when deploying Bun in production.

Assuming 100% npm compatibility. Approximately 2% of packages — mostly native C++ addons compiled with node-gyp — may not work. Common problematic packages include bcrypt (use bcryptjs), certain canvas implementations, and node-sass (use sass). Always run bun install && bun test before committing. [1][2]

Ignoring Bun's garbage collector differences. Bun uses JSC's garbage collector, which is less battle-tested for processes running 72+ hours continuously compared to V8's. For long-running services, monitor memory behavior over 24-48 hours after migration. [2]

Forgetting to audit native dependencies before Docker builds. A package that installs fine with bun install on macOS may fail during Docker builds on Linux if it includes platform-specific native binaries. Test the full build pipeline early. [1]

Using node: protocol imports without verification. Bun supports most node: built-in module imports, but edge cases exist in node:diagnostics_channel, node:vm, and certain inspector APIs. If your code uses deep Node.js internals, test each import explicitly. [1][2]

Expecting all Express middleware to work unchanged. Most Express middleware works with Fastify (via @fastify/express adapter), but some packages rely on Express-specific request/response object mutations. Test each middleware individually. [4]

The safe migration path

Start by using Bun as a package manager and test runner while keeping Node.js as the runtime. This gives you 80-90% of the speed benefit with zero production risk. Switch the runtime to Bun only after your full test suite passes and you have validated memory behavior in staging.
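One way to wire up the hybrid pattern is through package.json scripts: Bun drives install and tests, Node.js remains the production runtime. The script names and the dist/ path below are illustrative, not a convention Bun requires.

```json
{
  "scripts": {
    "test": "bun test",
    "ci": "bun install --frozen-lockfile && bun test",
    "start": "node dist/server.js"
  }
}
```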

Both runtimes are production-ready in 2026. The choice depends on your specific constraints.

Choose Bun for new projects

You control the dependency tree. You want native TypeScript, faster CI/CD, lower serverless costs, or maximum HTTP throughput. The compatibility is excellent for standard web frameworks (Fastify, Hono, Express) and ORMs (Prisma, Drizzle, TypeORM).

Keep Node.js for existing projects

Your application has hundreds of dependencies, native C++ addons, or years of production hardening. The performance gains rarely justify the migration cost when the runtime is not the bottleneck — database queries and network I/O usually dominate.

Use the hybrid approach

Bun for tooling (package management, testing) + Node.js for production runtime. This is the pragmatic enterprise pattern: 80-90% faster CI/CD with zero migration risk. Several Fortune 500 companies adopted this exact pattern in 2026.

Short answers to common questions about running Fastify on Bun.

Does Fastify work on Bun without changes?

Yes. Fastify runs on Bun without code modifications. All core features — routing, validation, hooks, plugins, serialization — work identically. Benchmarks show approximately 70% more throughput on Bun compared to Node.js. [3][4]

Do I still need tsconfig.json with Bun?

Not for basic execution. Bun runs .ts files natively. You only need tsconfig.json if you want strict type checking in your IDE or if other tools (ESLint, IDE autocomplete) reference it. Bun ignores tsconfig.json at runtime. [1]
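If you do want editor type checking, a minimal tsconfig.json like the following is a common starting point (not mandated by Bun); it assumes the bun-types package is installed via `bun add -d bun-types`:

```json
{
  "compilerOptions": {
    "target": "ESNext",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "noEmit": true,
    "types": ["bun-types"]
  }
}
```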

Can I use Prisma with Bun?

Yes. Prisma works on Bun without modification. The Prisma team has confirmed Bun compatibility and many production deployments use Prisma + Bun + PostgreSQL. Run bun add prisma @prisma/client and generate your client normally. [1][2]

What about hot reload in development?

Bun provides bun --hot which preserves application state across reloads — faster than Node.js's --watch flag that restarts the entire process. For Fastify, this means your server restarts in milliseconds instead of seconds. [1]

Is Bun safe for production in 2026?

Yes for most use cases. Bun 1.3 reached stable Windows support and 98% npm compatibility. The primary caveats are long-running processes (72+ hours) where V8's garbage collector is more proven, and native C++ addon dependencies. [1][2]

How does Bun compare to Deno?

Bun is faster than Deno in most benchmarks. Deno has a stronger security model (built-in permissions) but slower adoption. For this guide's use case (Fastify server), Bun has better framework compatibility and a faster test runner. [2]

Sources supporting claims about Bun runtime features, performance benchmarks, Fastify compatibility, and ecosystem status.

Reviewed: 12 Apr 2026 · Applies to: Bun 1.3+, Fastify 5.x, TypeScript 5.x, Node.js 22+ (for comparison) · Tested with: Bun 1.3 runtime, Fastify 5.8.x, bun:sqlite, PostgreSQL via the pg driver, Docker (oven/bun base image)

If you are evaluating Bun for your backend stack, PAS7 Studio can help you assess compatibility, build the migration path, and ship a production-ready Bun + Fastify server with validation, database integration, authentication, and deployment automation.

We have experience with Bun, Fastify, NestJS, PostgreSQL, Docker, and the full backend stack. Whether you are starting fresh or migrating from Node.js, we can help you avoid the common pitfalls and get the performance benefits without the surprises.

