Functions that scale like serverless —
governed like enterprise.
Run HTTP and event-driven code with revisions, traffic control, zero-trust guardrails, and first-class integrations — with AI-assisted provisioning and debugging.
Deploy → Scale → Observe
90s overview
Select a trigger to see the flow
export default async function handler(req: Request) {
  // Process HTTP request
  const data = await req.json();
  return Response.json({ status: "ok" });
}
Triggers as first-class primitives
Beyond typical HTTP functions. Connect to any event source with type-safe handlers and automatic scaling.
HTTP / Webhooks
Handle REST APIs, webhooks, and HTTP callbacks with automatic request parsing and response serialization.
export default async (req: Request) => {
  const body = await req.json();
  return Response.json({ ok: true });
}
Cron Schedules
Schedule recurring tasks with cron expressions. Perfect for batch jobs, reports, and maintenance tasks.
// Runs every day at midnight UTC
@cron("0 0 * * *")
export default async () => {
  await generateReport();
}
Messaging (Kafka / Pulsar)
Consume events from Kafka, Pulsar, or Pub/Sub topics with at-least-once delivery and automatic batching.
@subscribe("orders.created")
export default async (event: OrderEvent) => {
await processOrder(event.payload);
}
Object Storage (S3 / GCS)
React to file uploads, deletions, and modifications in your storage buckets automatically.
@onUpload("my-bucket/uploads/*")
export default async (file: StorageEvent) => {
await processImage(file.path);
}
Database Events (CDC)
Capture change data from your database. React to inserts, updates, and deletes in real-time.
@onChange("users")
export default async (change: ChangeEvent) => {
await syncToSearch(change.after);
}
Create triggers • Run locally • Preview
Intelligent scale-to-zero + immutable revisions
Every deploy creates an immutable revision. Route traffic between versions, canary new code, and roll back instantly — all with automatic scaling that goes to zero when idle.
Scale Behavior
Functions scale to zero when idle. First request wakes the function (~50-200ms cold start).
Concurrency Model
One request per instance. Minimizes latency by avoiding resource contention.
Request Buffer
Burst handling
Deploy → Canary → Rollback
Start serverless. Promote seamlessly.
Same packaging, same identity, same observability — different execution posture. Graduate functions to dedicated compute when you need it.
Pure Scale-to-Zero
Default
Ideal for sporadic traffic, webhooks, and event handlers. Pay only for actual execution time.
Pinned Warm Capacity
Dedicated
Keep minimum instances warm for latency-sensitive workloads. Instant response, no cold starts.
Worker Mode
Long-running
For async processing, batch jobs, and long-running tasks. Built-in queue consumption and retries.
Long-running tasks?
Use worker mode with queues and built-in retries. Function mode is best for sub-minute executions.
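The built-in retry behavior described above can be pictured as a backoff loop around each job. The sketch below is illustrative only: `withRetries`, its delays, and `flakyJob` are hypothetical names, not the Celeris worker API.

```typescript
// Illustrative sketch of worker-mode retries: retry a failing job
// with exponential backoff. `withRetries` is a hypothetical helper,
// not part of any real SDK.
async function withRetries<T>(
  job: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await job();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}

// Example: a job that fails twice before succeeding.
let calls = 0;
const flakyJob = async () => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "processed";
};
```

In a real worker, the platform drives this loop from the queue; the handler only throws to signal a retryable failure.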
Managed services. One-click bindings.
Connect functions to managed databases, storage, messaging, and analytics. Policy-aware bindings with identity, cost allocation, and audit built in.
my-function
Event processor
// Auto-injected binding
const db = process.env.DATABASE_URL;
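Since bindings surface as environment variables, a handler typically parses the injected connection string before use. A minimal sketch, assuming a standard `postgresql://` URL format; the example URL and the `parseDbUrl` helper are placeholders, not the binding contract.

```typescript
// Parse an auto-injected connection string into driver-ready parts.
// Uses the standard WHATWG URL class; no external dependencies.
function parseDbUrl(raw: string) {
  const url = new URL(raw);
  return {
    host: url.hostname,
    port: Number(url.port || 5432), // default Postgres port
    database: url.pathname.slice(1), // strip leading "/"
    user: url.username,
  };
}

// In a function, the binding is read from the environment.
// The fallback URL here is a placeholder for local testing.
const config = parseDbUrl(
  process.env.DATABASE_URL ?? "postgresql://app:secret@prod-db:5432/orders",
);
```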
Browse Marketplace → Deploy BigQuery → Bind to Function
Global functions. Enterprise patterns.
Deploy across regions with active-active or active-passive patterns. Automatic failover, latency-based routing, and consistency controls.
Routing Pattern
Requests route to the nearest healthy region. All regions serve traffic simultaneously.
Region Health
AI Recommendation
Based on your stateless function and PostgreSQL binding, we recommend Active-Active with read replicas in each region.
Stateful dependencies?
Use managed services that support your chosen pattern. PostgreSQL read replicas or Redis with CRDT work well for active-active.
AI-assisted provisioning & debugging
Generate function skeletons, select triggers, bind integrations safely, propose IAM policies, estimate costs, and explain error traces — all policy-aware.
# Generated by Celeris AI
apiVersion: functions.celeris.io/v1
kind: Function
metadata:
  name: webhook-processor
  namespace: production
spec:
  runtime: nodejs20
  trigger:
    type: http
    route: /webhooks/stripe
  scaling:
    minScale: 0
    maxScale: 100
    concurrencyTarget: 10
  bindings:
    - name: database
      service: postgresql://prod-db
  policy:
    egress:
      - api.stripe.com
    networkPolicy: zero-trust
    audit: true
---
# Rollout Strategy
rollout:
  strategy: canary
  steps:
    - weight: 10
      pause: 5m
    - weight: 50
      pause: 10m
    - weight: 100
AI Copilot
Policy-aware assistance
Capabilities
Quick Generate
Policy Violation Blocked
Egress to *.amazonaws.com is not allowed. Use marketplace-managed S3 instead.
AI generates function + trigger + bindings + rollout
Zero-trust. Audit-ready.
Enterprise-grade security for serverless. Identity management, network policies, egress controls, and comprehensive audit logging.
Identity
Per-function identity
- Workload identity per function
- OIDC token injection
- Role-based access control
Network
Zero-trust policies
- Allow/deny ingress rules
- mTLS for east-west traffic
- Private networking options
Egress
Outbound controls
- Explicit egress allowlist
- Domain-level blocking
- NAT gateway options
Secrets
Secure bindings
- Encrypted at rest + transit
- Automatic rotation
- External secrets support
Audit
Complete trail
- All invocations logged
- Config change history
- Exportable to SIEM
BYOC
Your cloud, your rules
- Deploy to your cloud org
- Data residency control
- VPC integration
Policy Preview
policy:
  identity: my-function@project.iam.gserviceaccount.com
  network:
    mode: zero-trust
    allowIngress:
      - internal-gateway
  egress:
    mode: allowlist
    allow:
      - api.stripe.com
      - "*.googleapis.com"
  audit:
    enabled: true
    retention: 90d
    export: bigquery://audit-logs
Debug at distributed-systems scale
Invocation timelines, correlated logs and traces, service graph context, and AI-powered error explanations.
Invocation Timeline
Last 24h
Service Graph Context
AI Error Analysis
Root Cause
Database connection pool exhausted. Cold start + burst traffic caused new connections to exceed pool limit (max: 10).
Suggested Fix
1. Increase connection pool to 25
2. Enable connection warming on cold start
3. Consider "Warm" scale mode for this function
Tests as first-class deploy gates
E2E, smoke, and load tests run automatically on preview environments. Tests can block promotion. AI generates baseline tests and thresholds.
Preview
Ephemeral environment created
Smoke Tests
Basic functionality verified
Load Tests
Performance validation
Promote
Canary → Full rollout
AI-Generated Tests
AI analyzes your function signature and bindings to generate baseline tests.
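A generated baseline test for the HTTP handler shown earlier might call it directly with a synthetic request and assert on the response shape. A sketch, assuming a Node 18+ runtime where `Request` and `Response` are globals; the route and payload are made up for illustration.

```typescript
// The handler under test (the HTTP trigger example from above).
const handler = async (req: Request): Promise<Response> => {
  const body = await req.json();
  return Response.json({ ok: true, received: body.event ?? null });
};

// Baseline smoke test: synthetic request in, shape assertions out.
async function smokeTest(): Promise<boolean> {
  const req = new Request("http://localhost/webhooks/test", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ event: "ping" }),
  });
  const res = await handler(req);
  const json = await res.json();
  return res.status === 200 && json.ok === true && json.received === "ping";
}
```

In a preview environment, the same test would target the ephemeral URL over HTTP instead of invoking the handler in-process.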
Gate Configuration
PR opens → Preview → Tests → Auto promote
Choose your pattern
Select a use case to see how Functions adapts. Each pattern shows the optimal trigger, compute mode, bindings, and rollout strategy.
Webhook Ingestion + Validation
Receive Stripe webhooks, validate signatures, and store events
Trigger
HTTP POST
Compute Mode
Scale-to-Zero
Bindings
PostgreSQL
Rollout
Canary 10% → 100%
Why Celeris is better
Zero-trust egress policy ensures only Stripe IPs can call your webhook. Marketplace PostgreSQL binding handles connection pooling automatically. Multi-region active-active for 99.99% uptime.
Handler Preview
import { verifyStripeSignature } from "@celeris/stripe";
import { db } from "@celeris/postgres";
import { events } from "./schema"; // app-defined events table

export default async (req: Request) => {
  const event = await verifyStripeSignature(req);
  await db.insert(events).values(event);
  return Response.json({ received: true });
}
Deploy your first function
in minutes, not days.
Elastic scale-to-zero with enterprise governance. Start free, scale infinitely.
No credit card required. Free tier includes 100K invocations/month.