Cassette vs Vercel

How Cassette compares to Vercel

Vercel is a serverless platform that abstracts away the server. Cassette is the server itself. Same end goal—running your code—completely different approaches.

The Short Version

Feature           Cassette                              Vercel
Model             Server you own                        Serverless functions
Pricing           Flat monthly                          Per-request + bandwidth + compute time
Cold starts       Never                                 Yes (can be significant)
Runtime limits    None                                  10-300 seconds max
Background jobs   Run anything                          Needs workarounds
Database          Install Postgres, SQLite, whatever    External service required
Frameworks        Any                                   Next.js optimized, others supported

What Vercel Gets Right

Vercel nailed the developer experience for Next.js. Push to GitHub, get a preview URL. Merge to main, deploy to production. Edge functions, image optimization, analytics—it all works.

For static sites and simple Next.js apps, this is genuine magic.

Where Serverless Gets Weird

The serverless model has tradeoffs that show up as your app grows:

Cold Starts

Your function spins down when unused. Next request? Wait 200-2000ms while it spins back up. That's not latency—that's waiting for your computer to boot.

Cassette servers are always on. The first request is as fast as the thousandth.

Execution Limits

Vercel functions time out after 10 seconds (Hobby), 60 seconds (Pro), or 300 seconds (Enterprise). Processing a large file? Running a batch job? Generating a PDF? You're fighting the clock.

Cassette has no execution limits. Your code runs until it's done.
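The difference shows up in something as simple as a batch loop. Here's a minimal sketch (the directory names and the sample file are illustrative, not part of any Cassette API):

```shell
# No platform timeout: process every file, however long it takes.
# input/ and done/ are placeholder directories for this sketch.
mkdir -p input done
printf 'id,value\n1,42\n' > input/batch1.csv   # sample input file

for f in input/*.csv; do
  echo "processing $f"
  # ... real work here: generate a PDF, transcode, crunch numbers ...
  mv "$f" done/
done
echo "batch complete"
```

On a serverless platform, a loop like this has to be chopped into timeout-sized pieces and stitched back together with a queue. On a server, it's just a loop.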

The Vendor Lock-In

Vercel's magic comes from tight integration with their platform. Move off Vercel? You're rewriting:
- Image optimization (use next/image? tied to Vercel)
- Edge middleware (Vercel-specific runtime)
- Serverless functions (their deployment format)
- Environment variables (their dashboard)
- Preview deployments (their GitHub integration)

Cassette runs standard Linux. Your code is your code. Deploy with Docker, Kamal, rsync, git hooks—whatever works. Leave anytime, take everything with you.
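For example, a git-hook deploy is one small file on the server. This is a sketch, not a Cassette feature: the paths, the branch name, and the restart command are all placeholder assumptions.

```shell
#!/bin/sh
# hooks/post-receive in a bare repo on the server: check out what was
# pushed and restart the app. /srv/app, /srv/app.git, "main", and the
# docker compose restart are placeholders for this sketch.
GIT_WORK_TREE=/srv/app
GIT_BARE_REPO=/srv/app.git

git --work-tree="$GIT_WORK_TREE" --git-dir="$GIT_BARE_REPO" checkout -f main
cd "$GIT_WORK_TREE" && docker compose up -d --build
```

Add the server as a remote locally and `git push` becomes your deploy button. Nothing here is Vercel-shaped; it's plain git and Linux, so it moves with you.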

The Pricing Surprise

Vercel's free tier is generous. The Pro tier ($20/user/month) seems reasonable. Then you get traffic.

Vercel charges for:
- Function invocations (per million)
- Edge function invocations (per million, different price)
- Bandwidth (per GB)
- Build minutes (per minute after free tier)
- Image optimizations (per 1,000)
- Analytics (per event)
- Speed Insights (per data point)

A moderately successful app can easily hit $200-500/month on Vercel. That same app runs comfortably on a $29/month Cassette.

Real Example: A SaaS Dashboard

Say you're running a B2B dashboard with 1,000 daily active users:
- 50,000 page views/day
- 150,000 API calls/day
- 5GB bandwidth/day

On Vercel Pro: $20/seat × 3 developers + bandwidth overages + function invocations = $150-300/month (and climbing with usage)

On Cassette Medium: $29/month. Same traffic, same performance, predictable forever.
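Those daily numbers compound quickly. A quick sanity check on the monthly volumes, assuming a 30-day month:

```shell
# Monthly totals implied by the daily figures above (30-day month).
echo "$((50000 * 30)) page views/month"    # 1,500,000
echo "$((150000 * 30)) API calls/month"    # 4,500,000 function invocations
echo "$((5 * 30)) GB bandwidth/month"      # 150 GB
```

At 4.5 million invocations and 150 GB a month, the per-million and per-GB line items are doing most of the billing, and they scale with every new user.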

What You Can't Do on Vercel

Vercel is built for web frontends and API routes. Anything else requires external services:

Long-running processes? No. Use a separate job queue service.

WebSockets? Limited. Their Edge Runtime has restrictions.

Databases? Not on Vercel. Connect to Supabase, PlanetScale, Neon, etc. (more services, more bills).

Background workers? Not really. Need external queue + worker setup.

Cron jobs? Sort of. Vercel Cron has limits and costs extra.

AI agents? Tricky. Long-running agent loops hit timeout limits.

On Cassette, all of this runs on the same machine:

```bash
# On your Cassette
# Run your web app
docker compose up -d

# Run background workers
./bin/jobs start

# Run a cron job
crontab -e

# Run an AI agent
python agent.py
```

Same server. Same bill. No limits.
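A background worker on a server can be as plain as a loop over a job directory. A minimal sketch (the queue directory and sample job are illustrative; a real worker would wrap this in `while true; do ...; sleep 5; done` and run forever, which is exactly what a serverless timeout won't allow):

```shell
# Drain a file-based job queue. queue/, processed/, and the sample job
# are placeholders for this sketch.
mkdir -p queue processed
printf 'send-welcome-email\n' > queue/job1.job   # sample job

for job in queue/*.job; do
  [ -e "$job" ] || continue        # glob didn't match: queue is empty
  echo "running $(basename "$job")"
  # ... do the work the job file describes ...
  mv "$job" processed/
done
```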

The AI Agent Gap

Here's where serverless really struggles: AI agents.

Modern AI agents work in loops. They think, act, observe, repeat. They install packages, run commands, make API calls, wait for responses. This can take minutes or hours.

Vercel's model doesn't support this. Functions timeout. State doesn't persist. The agent can't live anywhere.

Cassette is the natural home for AI agents. The command line—what Cassette provides out of the box—is the best runtime for agents that need to install tools, run shell commands, and maintain state.
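The loop itself is simple. Here's a toy sketch of the think-act-observe cycle, with a hard-coded action standing in for a model call (everything here is illustrative; in a real agent, `action` would come from an LLM response):

```shell
# Toy agent loop: decide on an action, run it in the shell, observe.
# In a real agent the action comes from a model; here it's hard-coded.
step=0
while [ "$step" -lt 3 ]; do
  action="echo step-$step"        # think: (stub) decide what to run
  output=$($action)               # act: execute the command in the shell
  echo "observed: $output"        # observe: result goes back to the model
  step=$((step + 1))
done
```

The point isn't the stub; it's that each turn may install a package, run a build, or wait minutes for an API. A loop like this needs a machine that stays up between turns, not a function that evaporates at the timeout.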

When Vercel Makes Sense

Vercel is the right choice when:
- You're building a static site or simple Next.js app
- Traffic is low to moderate and predictable
- You want zero DevOps for frontend deployment
- Your team is already in the Vercel ecosystem
- You're okay paying for convenience at scale

When Cassette Makes Sense

Cassette is the right choice when:
- You want predictable costs regardless of traffic
- You run more than just a frontend (background jobs, databases, workers)
- You're not using Next.js (or even if you are—Next.js runs great on servers)
- You want to avoid cold starts entirely
- You need long-running processes or AI agents
- You prefer owning your infrastructure over renting abstractions

The Philosophy

Vercel bet on serverless: no servers to manage means less complexity.

We bet on servers, done right: a good server is less complex than serverless.

The serverless promise was "don't think about servers." The reality is thinking about cold starts, execution limits, vendor lock-in, and unpredictable bills.

A server that just works—always on, with no limits, at a fixed price—is simpler than the serverless alternative for most applications.

The Bigger Picture

The explosion of AI-generated apps (vibecoding, AppGen platforms) means millions of new applications need hosting. Most of these don't need edge functions and image optimization CDNs. They need a computer that runs their code.

Cassette is that computer. Fast, predictable, always on. No abstractions between your code and the machine.


Want a server that stays on? Join the waitlist and skip the serverless complexity.