Serverless

8 min read · Updated 2026-04-25

Serverless computing has evolved from a specialized cloud service to a fundamental architectural pattern. The approach abstracts away infrastructure management, letting developers focus purely on application logic while cloud providers handle provisioning and scaling.

What Defines Serverless

The core model is Function-as-a-Service (FaaS). Application logic is broken into individual functions that execute in response to events. Functions are stateless, ephemeral, and automatically scaled by the cloud provider.

Event-triggered execution
Functions run only when events fire — HTTP requests, queue messages, file uploads, scheduled triggers, DB changes.
Automatic scaling
From zero to thousands of concurrent executions. No capacity planning, no autoscaling group config.
Pay per execution
Billed for actual compute time used, in 1 ms increments on most platforms. Idle = $0.
Managed infrastructure
No server provisioning, patching, or scaling. The provider handles all of it.
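The model above can be sketched as a Lambda-style Python handler. The handler signature matches AWS Lambda's Python convention; the event shape is an illustrative HTTP-style payload, not any one provider's exact schema.

```python
import json

def handler(event, context=None):
    """Entry point the platform invokes once per event.

    Stateless: everything the function needs arrives in `event`;
    nothing is assumed to survive in memory between invocations.
    """
    # Query parameters may be absent entirely, so default defensively.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The provider calls this function for each event and scales the number of concurrent copies itself; the code contains no server, port, or scaling logic.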

The Major Implementations

Cloud-native FaaS
Provider-managed
AWS Lambda (the original, 2014), Google Cloud Functions, Azure Functions, IBM Cloud Functions. Tight integration with each cloud's services.
Self-hosted serverless
Kubernetes-native
Knative (the leading open standard), OpenFaaS, Fission, Kubeless (now archived). Run serverless on your own K8s cluster — portable across providers.

Edge serverless

A separate category running closer to users: platforms such as Cloudflare Workers, Fastly Compute, and Deno Deploy run functions in lightweight V8 isolates at edge locations, trading a restricted runtime for near-zero cold starts and lower latency to the end user.

Why Teams Adopt It

Cost optimization
Traditional servers run 24/7 even when idle. Serverless costs only during execution. For variable or unpredictable traffic, savings are significant.
Developer productivity
No server config, no scaling strategy, no patching. Engineers focus on business logic. Faster iteration, faster feature delivery.
Automatic scaling
No more capacity planning meetings. Platform makes scaling decisions in real time based on traffic.
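The cost argument can be made concrete with a rough calculator. The per-request and per-GB-second prices below are assumptions in the ballpark of published Lambda rates; check your provider's current pricing.

```python
def monthly_cost(invocations, avg_ms, memory_mb,
                 per_million_requests=0.20,     # assumed request price, USD
                 per_gb_second=0.0000166667):   # assumed compute price, USD
    """Estimate monthly FaaS spend: request charges plus GB-seconds of compute."""
    request_cost = invocations / 1_000_000 * per_million_requests
    gb_seconds = invocations * (avg_ms / 1000) * (memory_mb / 1024)
    return request_cost + gb_seconds * per_gb_second

# 1M invocations/month at 100 ms each with 256 MB of memory comes to well
# under a dollar of compute, versus a server billed 24/7 regardless of load.
```

The key property is that cost scales with `invocations`, so a month of zero traffic costs zero.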

Where It Fits

API backends
REST APIs from Lambda + API Gateway. Each endpoint scales independently. Often cheaper than dedicated servers for variable load.
Data processing pipelines
Process uploaded files, transform data streams, run batch operations on demand without permanent infrastructure.
Event-driven microservices
Functions react to queue messages, DB changes, file uploads. Reactive architecture without standing servers.
Scheduled jobs / automation
Replaces cron on servers. Maintenance tasks, report generation, cleanup operations triggered by time.
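For the API-backend case, a single function often routes on the method and path of an API Gateway-style proxy event. The field names below follow the common proxy format, but treat the exact schema as provider-specific.

```python
import json

def api_handler(event, context=None):
    """Dispatch an HTTP-style proxy event to a route handler."""
    route = (event.get("httpMethod"), event.get("path"))
    if route == ("GET", "/health"):
        status, body = 200, {"status": "ok"}
    elif route == ("GET", "/users"):
        status, body = 200, {"users": []}  # real code would query a datastore
    else:
        status, body = 404, {"error": "not found"}
    return {"statusCode": status, "body": json.dumps(body)}
```

Each route here shares one function for simplicity; splitting routes into separate functions lets each endpoint scale and be billed independently, as the section above describes.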

The Real Costs

Cold start latency

The most-cited problem. When a function hasn’t been invoked recently, the provider has to initialize a new runtime — adding milliseconds to seconds of latency.

Mitigations: provisioned concurrency (pay to keep some warm), language choice, smaller deployment artifacts, edge-runtime alternatives (Cloudflare Workers cold-start in ~5 ms).
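A related mitigation worth showing in code: most platforms keep the process alive between warm invocations, so expensive setup done at module scope is paid only on a cold start. The `make_connection` helper below is hypothetical, standing in for a DB client or SDK session.

```python
import time

def make_connection():
    """Hypothetical stand-in for expensive init (DB client, SDK session, model load)."""
    return {"created_at": time.time()}

# Module scope: runs once per cold start, then is reused while the sandbox stays warm.
CONNECTION = make_connection()

def handler(event, context=None):
    # Warm invocations reuse CONNECTION instead of paying the init cost again.
    return {"connection_created_at": CONNECTION["created_at"]}
```

Moving setup out of the handler body doesn't eliminate cold starts, but it keeps warm invocations from repeating the slow part.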

Vendor lock-in

Each provider implements serverless with unique APIs, deployment mechanisms, and feature sets. Migrating between providers can require significant rework.

Mitigations: portability layers such as Knative or the Serverless Framework, standard event formats like CloudEvents, and keeping business logic separate from provider-specific APIs so that only a thin adapter needs rewriting in a migration.
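One lock-in mitigation can be sketched directly: keep business logic in plain functions and confine provider-specific event parsing to thin adapters, so a migration rewrites only the adapter. The event shapes below are illustrative, not real provider schemas.

```python
def charge_customer(customer_id: str, amount_cents: int) -> dict:
    """Pure business logic: no provider types, portable and unit-testable."""
    return {"customer_id": customer_id, "charged": amount_cents}

def lambda_adapter(event, context=None):
    """AWS-flavored entry point: unwraps the provider event shape, nothing more."""
    detail = event.get("detail", {})
    return charge_customer(detail["customerId"], detail["amountCents"])

def cloudevents_adapter(data):
    """Knative/CloudEvents-flavored entry point for the same logic."""
    return charge_customer(data["customerId"], data["amountCents"])
```

Porting this code to another platform means writing one new adapter; `charge_customer` never changes.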

Debugging and observability

Distributed serverless environments need new approaches. Traditional debugging breaks down across many short-lived functions. Distributed tracing, structured logs, and platform-specific tools (AWS X-Ray, GCP Cloud Trace) are essential.

Long-running and stateful workloads

Serverless functions have execution time limits (15 minutes on Lambda, 9 minutes on first-generation Cloud Functions). Stateful workloads must externalize state to managed services (DynamoDB, Redis, Cloud Firestore). This forces architectural patterns that wouldn't otherwise be necessary.
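Externalizing state can be sketched against a key-value interface. `KVStore` below is an in-memory stand-in for a managed store like DynamoDB or Redis, not a real client.

```python
class KVStore:
    """In-memory stand-in for a managed store (DynamoDB, Redis, Firestore)."""
    def __init__(self):
        self._data = {}
    def get(self, key, default=None):
        return self._data.get(key, default)
    def put(self, key, value):
        self._data[key] = value

def count_visit(store: KVStore, user_id: str) -> int:
    """Stateless handler body: all state lives in the external store,
    so any instance of the function can serve any request."""
    count = store.get(user_id, 0) + 1
    store.put(user_id, count)
    return count
```

A production version would use the store's atomic update (e.g. a DynamoDB UpdateItem with an ADD action) instead of this read-modify-write, which races under concurrent invocations.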

When Serverless Is the Right Choice

Good fit
Use serverless
Variable / spiky traffic. Event-driven workloads. Background jobs. Scheduled tasks. Webhooks. APIs with low-to-moderate steady traffic. New product launches where you don't want to overprovision.
Bad fit
Skip serverless
Sustained high-throughput steady-state workloads (containers cheaper). Long-running computations beyond execution limits. Latency-critical paths where cold starts hurt. Workloads heavily dependent on local state.

Serverless and Multi-Tenant SaaS

For multi-tenant SaaS specifically, serverless has useful properties: each invocation is isolated, and cost tracks actual per-tenant activity rather than provisioned capacity.

The patterns serverless replaces in a SaaS — webhook handlers, image/PDF processing, scheduled tenant cleanup, email/notification dispatch — are often the workloads where serverless is most economical.

Recap

Serverless trades control for leverage: you give up long-running processes, local state, and predictable latency in exchange for zero idle cost, automatic scaling, and less infrastructure to operate. It shines for spiky, event-driven, and scheduled workloads; for sustained high-throughput or latency-critical paths, containers remain the better default.