Securing Vercel API Routes with IP Whitelist & Rate Limiting (2025 Guide)

Author: Ahmed Farid
CAUTION
Exposing unauthenticated API routes can get you DoS-ed or abused quickly. Follow this playbook before shipping to production.
Vercel’s Serverless & Edge runtimes make it trivial to spin up endpoints, yet also easy to forget basic protections. This article walks you through a defence-in-depth approach combining:
- IP Whitelisting (simple but effective for internal hooks / cron jobs).
- Rate Limiting (a shared Redis counter; Upstash offers both sliding-window and token-bucket limiters).
- Centralised Logging & Alerts.
We will implement it with Next.js 14 App Router, Edge Middleware and Upstash Redis—all on the free tier.
Table of Contents
- 1. Prerequisites & Terminology
- 2. Project Setup
- 3. Extracting the Client IP on Vercel
- 4. Implementing the IP Whitelist (Edge Middleware)
- 5. Adding Rate Limiting with Upstash Redis
- 6. Combining Both Guards
- 7. Testing Locally with Vercel CLI
- 8. Logging & Alerting
- 9. Automating IP Rotation (Optional)
- 10. Production Checklist
- 11. Further Reading & Resources
- 12. Conclusion
1. Prerequisites & Terminology
- Next.js 14 project deployed on Vercel.
- Node 18+ locally.
- A free Upstash Redis database (or Vercel KV).
| Term | Meaning |
| --- | --- |
| Edge Middleware | Code that runs on Vercel's edge network before the request hits your route. |
| Token Bucket | A rate-limit algorithm that allows bursts, then refills capacity at a steady rate. |
| Forwarded-For | The `X-Forwarded-For` header containing the chain of client and proxy IPs. |
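To make the Token Bucket entry above concrete, here is a toy illustration in TypeScript. It is not the Upstash implementation used later, just a sketch of the refill behaviour the table describes:

```ts
// Toy token bucket: up to `capacity` tokens, refilled at `refillPerSec` tokens per second.
// Each request consumes one token; when the bucket is empty the request is rejected.
class TokenBucket {
  private tokens: number
  private lastRefill = Date.now()

  constructor(private capacity: number, private refillPerSec: number) {
    this.tokens = capacity
  }

  tryRemove(): boolean {
    const now = Date.now()
    const elapsedSeconds = (now - this.lastRefill) / 1000
    // Refill proportionally to elapsed time, never exceeding capacity.
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSec)
    this.lastRefill = now
    if (this.tokens < 1) return false
    this.tokens -= 1
    return true
  }
}

// Allow bursts of up to 10 requests, refilling roughly one token per second.
const bucket = new TokenBucket(10, 1)
console.log(bucket.tryRemove()) // true while tokens remain, false once the burst is spent
```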
2. Project Setup
```bash
npx create-next-app@latest secure-api
cd secure-api
npm i @upstash/ratelimit @upstash/redis
```

Create `.env.local`:

```bash
UPSTASH_REDIS_REST_URL=...
UPSTASH_REDIS_REST_TOKEN=...
ALLOWED_IPS=192.0.2.10,203.0.113.5
```
3. Extracting the Client IP on Vercel
Vercel places the real client IP in `request.headers.get('x-forwarded-for')` (Edge) or `req.headers['x-forwarded-for']` (Serverless). The header may contain a comma-separated chain of proxy hops, so always take the first element:

```ts
// lib/ip.ts
export function getClientIp(header?: string | null) {
  if (!header) return '0.0.0.0'
  return header.split(',')[0].trim()
}
```
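A quick usage check (the addresses below are only illustrative documentation/private IPs):

```ts
import { getClientIp } from '@/lib/ip'

// A request that passed through two proxies: the left-most entry is the original client.
const header = '198.51.100.7, 10.0.0.1, 10.0.0.2'
console.log(getClientIp(header)) // "198.51.100.7"
console.log(getClientIp(null)) // "0.0.0.0" fallback when the header is missing
```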
4. Implementing the IP Whitelist (Edge Middleware)
Create `middleware.ts` at the project root:

```ts
// middleware.ts
import { NextResponse, NextRequest } from 'next/server'
import { getClientIp } from '@/lib/ip'

// Parse the comma-separated whitelist once per cold start; trim entries and drop blanks.
const allowed = (process.env.ALLOWED_IPS || '')
  .split(',')
  .map((ip) => ip.trim())
  .filter(Boolean)

export function middleware(req: NextRequest) {
  const ip = getClientIp(req.headers.get('x-forwarded-for'))
  const isAllowed = allowed.includes(ip)
  if (!isAllowed && req.nextUrl.pathname.startsWith('/api/secure')) {
    return new NextResponse('Forbidden', { status: 403 })
  }
  return NextResponse.next()
}

export const config = { matcher: ['/api/secure/:path*'] }
```

Notes:
- The matcher limits the middleware's overhead to the secured routes only.
- Keep the whitelist in the `ALLOWED_IPS` env var so the addresses never end up in the repo.
5. Adding Rate Limiting with Upstash Redis
Create `lib/ratelimit.ts`:

```ts
// lib/ratelimit.ts
import { Ratelimit } from '@upstash/ratelimit'
import { Redis } from '@upstash/redis'

export const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(), // reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN
  limiter: Ratelimit.slidingWindow(100, '1 m'), // 100 requests per minute per identifier
  analytics: true,
})
```
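The intro and the terminology table mention the token-bucket algorithm, and Upstash also ships a token-bucket limiter if you prefer explicit burst-then-refill behaviour. A hedged sketch; double-check the exact signature against the `@upstash/ratelimit` docs for your version:

```ts
import { Ratelimit } from '@upstash/ratelimit'
import { Redis } from '@upstash/redis'

// Refill 10 tokens every 10 seconds, allowing bursts of up to 100 requests.
export const burstyRatelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.tokenBucket(10, '10 s', 100),
  analytics: true,
})
```

The rest of the article sticks with the sliding-window `ratelimit` instance above.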
Use it in a Serverless API route:
```ts
// app/api/secure/hello/route.ts
import { ratelimit } from '@/lib/ratelimit'
import { NextResponse } from 'next/server'
import { getClientIp } from '@/lib/ip'

export async function GET(request: Request) {
  const ip = getClientIp(request.headers.get('x-forwarded-for'))
  const { success, remaining, reset } = await ratelimit.limit(ip)

  if (!success) {
    return NextResponse.json(
      { error: 'Too many requests' },
      {
        status: 429,
        headers: {
          'X-RateLimit-Remaining': remaining.toString(),
          'X-RateLimit-Reset': reset.toString(),
        },
      }
    )
  }

  return NextResponse.json({ message: 'Hello secure world' })
}
```
Note that the Upstash clients talk to Redis over plain HTTPS `fetch`, so they run in both the Node.js (Serverless) runtime and the Edge runtime; this example uses a Serverless route, but the same limiter can also run inside Edge Middleware.
6. Combining Both Guards
Because Edge Middleware runs before the Serverless Function, requests from non-whitelisted IPs are rejected without ever invoking the function, so an attacker never consumes cold starts or compute time. The Serverless route then adds a second layer of throttling for the callers that do get through.
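If you would rather enforce both checks at the edge itself, the Upstash limiter can run inside the middleware too. A minimal sketch, reusing `getClientIp` and the `ratelimit` instance from `lib/ratelimit.ts` (treat it as a starting point, not a drop-in replacement for the code above):

```ts
// middleware.ts: whitelist first, then rate-limit the allowed traffic at the edge
import { NextResponse, NextRequest } from 'next/server'
import { getClientIp } from '@/lib/ip'
import { ratelimit } from '@/lib/ratelimit'

const allowed = (process.env.ALLOWED_IPS || '')
  .split(',')
  .map((ip) => ip.trim())
  .filter(Boolean)

export async function middleware(req: NextRequest) {
  const ip = getClientIp(req.headers.get('x-forwarded-for'))

  // Guard 1: IP whitelist. Unknown callers are rejected before any function runs.
  if (!allowed.includes(ip)) {
    return new NextResponse('Forbidden', { status: 403 })
  }

  // Guard 2: rate limit. Even whitelisted callers are throttled.
  const { success } = await ratelimit.limit(ip)
  if (!success) {
    return new NextResponse('Too many requests', { status: 429 })
  }

  return NextResponse.next()
}

export const config = { matcher: ['/api/secure/:path*'] }
```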
7. Testing Locally with Vercel CLI
```bash
vercel dev

curl -H "x-forwarded-for: 192.0.2.10" http://localhost:3000/api/secure/hello
# 200 OK
curl -H "x-forwarded-for: 8.8.8.8" http://localhost:3000/api/secure/hello
# 403 Forbidden
```

Spoofing the header like this only works locally; on Vercel's production proxy the `x-forwarded-for` value is set by the platform. To exhaust the rate limit:

```bash
for i in {1..101}; do curl -s -o /dev/null -w "%{http_code}\n" -H "x-forwarded-for: 192.0.2.10" http://localhost:3000/api/secure/hello; done
```
8. Logging & Alerting
- Vercel Web Analytics captures 4xx/5xx trends.
- The Upstash dashboard shows per-IP usage.
- Send Slack alerts via Log Drains when `status >= 400` (a sketch follows below).
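As a sketch of that last point: a log-drain receiver that forwards error spikes to Slack. It assumes an NDJSON drain whose entries expose a `proxy.statusCode` field and a Slack incoming-webhook URL in a `SLACK_WEBHOOK_URL` env var (both assumptions; verify the payload shape against the current Log Drains docs):

```ts
// app/api/log-drain/route.ts (hypothetical endpoint registered as the Log Drain target)
import { NextResponse } from 'next/server'

export async function POST(request: Request) {
  // Assumption: the drain is configured for NDJSON, one JSON log entry per line,
  // and proxy entries carry a `proxy.statusCode` field.
  const body = await request.text()
  const entries = body
    .split('\n')
    .filter(Boolean)
    .map((line) => JSON.parse(line) as { proxy?: { statusCode?: number; path?: string } })

  const errors = entries.filter((entry) => (entry.proxy?.statusCode ?? 0) >= 400)

  if (errors.length > 0 && process.env.SLACK_WEBHOOK_URL) {
    // Slack incoming webhooks accept a simple { text } payload.
    await fetch(process.env.SLACK_WEBHOOK_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        text: `${errors.length} request(s) returned status >= 400 on secured routes`,
      }),
    })
  }

  // Respond 2xx so the drain keeps delivering.
  return NextResponse.json({ ok: true })
}
```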
9. Automating IP Rotation (Optional)
If your partner's IP changes, expose a signed admin route that updates `ALLOWED_IPS` in the Vercel project's environment variables via the Vercel REST API and then triggers a redeploy (a sketch of the signed route follows).
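A minimal sketch of such a signed route, assuming a shared secret in an `ADMIN_SIGNING_SECRET` env var (hypothetical name) and an HMAC-SHA256 signature over the raw request body; the actual Vercel API calls are left as a stub:

```ts
// app/api/admin/update-ips/route.ts (hypothetical route name)
import { createHmac, timingSafeEqual } from 'crypto'
import { NextResponse } from 'next/server'

export async function POST(request: Request) {
  const rawBody = await request.text()
  const signature = request.headers.get('x-signature') ?? ''

  // Verify an HMAC-SHA256 signature over the raw body using the shared secret.
  const expected = createHmac('sha256', process.env.ADMIN_SIGNING_SECRET!)
    .update(rawBody)
    .digest('hex')

  const valid =
    signature.length === expected.length &&
    timingSafeEqual(Buffer.from(signature), Buffer.from(expected))

  if (!valid) {
    return NextResponse.json({ error: 'Invalid signature' }, { status: 401 })
  }

  const { ips } = JSON.parse(rawBody) as { ips: string[] }

  // TODO: upsert ALLOWED_IPS with `ips.join(',')` via the Vercel REST API and
  // trigger a redeploy so the Edge Middleware picks up the new value.
  // (Check the current Vercel REST API docs for the exact endpoints.)

  return NextResponse.json({ updated: ips })
}
```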
10. Production Checklist
- ✅ Deny by default: pinpoint routes with `matcher`.
- ✅ Keep Redis in the same region as your functions (`@vercel/edge-config` is an alternative if you need ultra-low latency).
- ✅ Encrypt env vars (default on Vercel).
- ✅ Document the throttling policy for clients.
11. Further Reading & Resources
- Upstash Rate Limit docs.
- Vercel Edge Middleware.
- OWASP API Security Top 10.
12. Conclusion
By layering an IP whitelist at the edge with robust, Redis-backed rate limiting inside your API route, you achieve cost-effective protection against abuse without sacrificing developer velocity. ✅
Secure those endpoints and ship with confidence! 🔒