Performance · Dec 3, 2025

How We Made Our API 2x Faster with Edge Runtime

The Problem

When we first launched Euclase, our API routes were running on Node.js serverless functions. While this worked well for warm requests (~200ms), cold starts were painful — averaging around 1.4 seconds.

For a note-taking app where users expect instant responsiveness, this was unacceptable.

The Solution: Edge Runtime

We migrated all our API routes to Edge Runtime using @neondatabase/serverless, a driver that talks to Postgres over HTTP instead of raw TCP, which lets it run in edge environments where traditional PostgreSQL clients can't.

// Before: Node.js runtime (node-postgres over a TCP connection pool)
import { Pool } from 'pg';
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// After: Edge Runtime (Neon's HTTP-based serverless driver)
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL);
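
For context, queries with the Neon driver go through a tagged-template function rather than a pooled connection. A minimal call looks like the line below (the table and columns are illustrative, not our actual schema):

// inside a request handler
const rows = await sql`SELECT id, title FROM notes ORDER BY updated_at DESC`;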

The Results

Metric          Before (Node.js)    After (Edge)
Cold Start      ~1.4s               ~0.7s
Warm Request    ~0.2s               ~0.2s

That's a 2x improvement in cold start times.

Key Changes

  • Switched to @neondatabase/serverless - HTTP-based PostgreSQL client that works at the edge
  • Added export const runtime = 'edge' to all API routes
  • Updated Drizzle ORM config to use the Neon serverless adapter (combined sketch below)
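
Putting the last two changes together, a minimal Edge API route looks roughly like the sketch below. The file path, the @/db/schema import, and the notes table are placeholders rather than our exact code.

// app/api/notes/route.ts (illustrative path)
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';
import { notes } from '@/db/schema'; // placeholder schema import

// Opt this route into the Edge Runtime instead of Node.js
export const runtime = 'edge';

export async function GET() {
  // Each query is an HTTP request, so there's no TCP pool to warm up on a cold start
  const sql = neon(process.env.DATABASE_URL!);
  const db = drizzle(sql);

  const allNotes = await db.select().from(notes);
  return Response.json(allNotes);
}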

Takeaways

Edge Runtime is a game-changer for applications that need low latency globally. The trade-off is a more limited API surface (no Node.js-specific APIs), but for our use case, it was absolutely worth it.

If you're building a latency-sensitive application on Vercel, definitely consider Edge Runtime + Neon serverless.
