
Build a Blazing-Fast JSON-to-CSV API with Bun + Express (2025)

Authors
  • Ahmed Farid

NOTE

Bun 1.2 now supports Node's HTTP APIs and Express 5 out of the box, and in simple HTTP benchmarks delivers 2-3× the throughput of Node 20.

In this guide you’ll:

  1. Scaffold a Bun project with Express 5 middleware.
  2. Implement a /convert POST endpoint that streams CSV.
  3. Add validation with zod.
  4. Benchmark vs Node with wrk.
  5. Deploy to Fly.io with a slim Bun container.


1. Why Bun + Express?

  • Bun’s JavaScriptCore JIT and Zig-based HTTP stack keep latency around a millisecond on trivial endpoints.
  • You reuse thousands of existing Express tutorials and middleware packages.
  • Node streaming libraries such as fast-csv work unchanged.

2. Prerequisites

  • Bun 1.2+ (brew install bun).
  • wrk or autocannon for local bench.

3. Initialize Project

mkdir express-json-csv && cd express-json-csv
bun init -y
bun add express fast-csv zod

tsconfig.json is auto-generated; Bun compiles TS instantly.

4. Express Server (index.ts)

import express, { json } from 'express'
import { z } from 'zod'
import { format } from 'fast-csv'

const app = express()
app.use(json({ limit: '1mb' }))

const rowSchema = z.record(z.any())
const bodySchema = z.object({ data: z.array(rowSchema) })

app.post('/convert', (req, res) => {
  const parsed = bodySchema.safeParse(req.body)
  if (!parsed.success) return res.status(400).json({ error: parsed.error.flatten() })

  res.setHeader('Content-Type', 'text/csv')
  res.setHeader('Content-Disposition', 'attachment; filename="data.csv"')

  // Pipe the formatter into the response so rows stream out as they are
  // written, instead of buffering the whole CSV in memory.
  const csvStream = format({ headers: true })
  csvStream.pipe(res)
  for (const row of parsed.data.data) csvStream.write(row)
  csvStream.end()
})

const PORT = process.env.PORT || 3000
app.listen(PORT, () => console.log(`🚀 Server ready on :${PORT}`))

Bun handles TS without a transpile step; bun run index.ts boots instantly.

5. Test Locally

bun run index.ts &
curl -X POST http://localhost:3000/convert \
  -H 'Content-Type: application/json' \
  -d '{"data":[{"name":"Alice","age":30},{"name":"Bob","age":25}]}' -o out.csv
cat out.csv

Output:

name,age
Alice,30
Bob,25

6. Benchmark

wrk -t4 -c200 -d15s --latency -s ./scripts/json.lua http://localhost:3000/convert

On an M1, expect on the order of ~140k req/s with Bun versus ~55k with Node 20; exact numbers vary with hardware and payload size.
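The wrk command above assumes a scripts/json.lua that turns each request into the POST our endpoint expects. A minimal sketch (the sample body is illustrative):

```lua
-- wrk script: send the JSON payload on every request
wrk.method = "POST"
wrk.headers["Content-Type"] = "application/json"
wrk.body = '{"data":[{"name":"Alice","age":30},{"name":"Bob","age":25}]}'
```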

7. Dockerfile

FROM oven/bun:1.2-slim
WORKDIR /app
COPY package.json bun.lock* tsconfig.json ./
RUN bun install --production
COPY . ./
EXPOSE 3000
CMD ["bun","run","index.ts"]

Build & push:

docker build -t json-csv-api .

On Fly.io, fly launch detects the exposed port 3000 and generates a fly.toml.
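For reference, a minimal sketch of the fly.toml you can expect, assuming the app name json-csv-api (adjust to whatever name fly launch assigns):

```toml
app = "json-csv-api"

[http_service]
  internal_port = 3000
  force_https = true
```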

8. Input Validation Tips

  • Reject payloads with more than 10 000 rows to avoid memory spikes.
  • Use a streaming JSON parser (e.g. clarinet) for 50 MB+ bodies instead of buffering them.

9. Error Handling Middleware

// Express recognizes error-handling middleware by its four-argument
// signature, so keep all four parameters even if some go unused.
app.use((err, _req, res, _next) => {
  console.error(err)
  res.status(500).json({ error: 'Internal server error' })
})

10. Security

  • Add helmet() for headers (works in Bun).
  • Limit request body size; the json({ limit: '1mb' }) middleware above already caps it.
  • Run behind Cloudflare or fly-proxy for rate-limiting.

11. Conclusion

With Bun + Express you get Node’s ecosystem plus Go-level speed. Ship your JSON-to-CSV micro-service in minutes and slash compute costs. ⚡️