From Static to Dynamic: Cloudflare's New Workflows for Multi-Tenant Durable Execution
When Cloudflare launched Workers eight years ago, the platform was designed for individual developers. But as the ecosystem matured, so did the needs of its users. Today, Workers powers a vast range of multi-tenant applications—from AI-driven code generation to CI/CD pipelines and autonomous agents. The common thread? Every tenant, agent, or session requires its own isolated compute, storage, and logic. Cloudflare has steadily delivered dynamic primitives to meet these needs: Dynamic Workers for on-demand compute, Durable Object Facets for per-tenant databases, and Artifacts for versioned filesystems. Now, the final piece of the puzzle arrives: Dynamic Workflows, bringing durable execution to multi-tenant environments.
The Challenge of Multi-Tenant Durable Execution
At its core, Cloudflare Workflows is a durable execution engine. It takes a run(event, step) function and transforms it into a resilient program: each step survives failures, and the workflow can sleep for hours or even days, wait for external events, and resume exactly where it left off after an isolate recycle. This makes it ideal for long-running processes like onboarding flows, video transcoding, multi-stage billing, or complex agent loops. Workflows V2 already supports up to 50,000 concurrent instances and 300 new instances per second per account, capacity designed for the agentic era.
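The run(event, step) contract can be sketched with a minimal in-memory step shim. This is an illustrative stand-in, not the real engine: actual Workflows durably checkpoint each named step's result (and also expose primitives such as step.sleep and step.waitForEvent), while the MemoryStep class below only memoizes results in memory to show why a replay skips completed work.

```typescript
// Illustrative stand-in for the durable step engine. The real engine
// persists each named step's result so a replay after a crash or isolate
// recycle skips work that already completed; this shim only memoizes
// results in memory to demonstrate the same replay behavior.
class MemoryStep {
  private cache = new Map<string, unknown>();

  async do<T>(name: string, fn: () => Promise<T>): Promise<T> {
    if (this.cache.has(name)) {
      return this.cache.get(name) as T; // replay: skip completed step
    }
    const result = await fn();
    this.cache.set(name, result); // "checkpoint" the step's result
    return result;
  }
}

// A run() function in the style of a Workflow entrypoint: each unit of
// work is wrapped in a named step so it executes at most once per instance.
async function run(event: { userId: string }, step: MemoryStep) {
  const profile = await step.do("fetch-profile", async () => {
    // In practice this might be a fetch() to an internal service.
    return { id: event.userId, plan: "pro" };
  });
  return step.do("create-invoice", async () => {
    return { for: profile.id, amount: 42 };
  });
}
```

Running run twice against the same MemoryStep returns the cached invoice on the second pass. That is the essence of durable execution: side effects live inside named steps, and a replay only re-executes steps that never completed.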

Yet there was a fundamental limitation: the workflow code had to be part of your deployment. Your wrangler.jsonc would bind a single class to the workflow engine—one class, one deploy. That works if you own all the code and run a traditional application. But it breaks the moment you need to let each customer define their own workflow. Consider these scenarios:
- An app platform where AI writes custom TypeScript for every tenant.
- A CI/CD product where each repository defines its own pipeline logic.
- An agent SDK where each agent writes its own durable execution plan.
In all these cases, the workflow is different for every tenant, agent, or request. There is no single class to bind. This is the same problem that Dynamic Workers solved for compute and Durable Object Facets solved for storage—and now it's time to solve it for durable execution.
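For reference, the static, deploy-time binding looks roughly like this in wrangler.jsonc (the names here are illustrative):

```jsonc
{
  "name": "billing-app",
  "main": "src/index.ts",
  "workflows": [
    {
      // One class, one deploy: the engine only knows about code
      // that shipped with this Worker.
      "name": "billing-workflow",
      "binding": "BILLING_WORKFLOW",
      "class_name": "BillingWorkflow"
    }
  ]
}
```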
Dynamic Deployment Goes Full Stack
Over the past year, Cloudflare has been systematically delivering dynamic primitives for every layer of the stack:
Compute: Dynamic Workers
With Dynamic Workers, a platform can hand the Workers runtime arbitrary code at runtime and get back an isolated, sandboxed Worker on the same machine in single-digit milliseconds. No pre-deployment, no manual binding. Perfect for multi-tenant AI and CI/CD workloads.
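The contract can be pictured with a small supervisor sketch. To stay self-contained it uses new Function as a stand-in for real isolate creation; actual Dynamic Workers run each tenant's code in its own sandboxed V8 isolate, and all names below are illustrative.

```typescript
// Illustrative simulation of the Dynamic Workers contract: a supervisor
// takes tenant-supplied source at runtime and hands back an invokable
// handler. The real platform isolates each tenant in its own V8 isolate;
// new Function here is purely a stand-in to show the shape of the API.
type TenantHandler = (request: { path: string }) => string;

class TenantSupervisor {
  private cache = new Map<string, TenantHandler>();

  // load(id, source) plays the role of a loader binding: hand the
  // platform code at runtime, get back a cached, callable handler.
  load(id: string, source: string): TenantHandler {
    let handler = this.cache.get(id);
    if (!handler) {
      handler = new Function("request", source) as TenantHandler;
      this.cache.set(id, handler);
    }
    return handler;
  }
}
```

A platform would call load once per tenant; repeat calls with the same id reuse the already-loaded handler, mirroring how dynamically loaded Workers are cached on the machine that loaded them.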
Storage: Durable Object Facets
Durable Object Facets extend the same idea to data. Each dynamically-loaded app can have its own SQLite database, spun up on demand, with the platform acting as a supervisor. This allows per-tenant, per-session storage without shared bottlenecks.
Source Control: Artifacts
Artifacts provide a Git-native, versioned filesystem you can create by the tens of millions—one per agent, per session, per tenant. This gives each dynamic entity its own reproducible source history.
Together, these primitives gave platforms dynamic deployment for compute, storage, and source control. The missing piece was durable execution—the ability to run long-lived, stateful workflows that are as dynamic as the code that defines them.

Introducing Dynamic Workflows
Today, Cloudflare bridges that gap with Dynamic Workflows. This new capability combines the durability of Workflows with the flexibility of dynamic deployment. Instead of binding a single workflow class at deploy time, you can now pass workflow code at runtime—much like how Dynamic Workers accept code. Each tenant, agent, or session gets its own isolated workflow instance, with its own steps, state, and lifecycle.
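Since the exact API for passing workflow code at runtime is new, the shape of such a platform can only be sketched hypothetically: tenant workflow source arrives as a string at runtime and executes under a memoizing step primitive. Every name below is illustrative, and the AsyncFunction constructor stands in for real isolate-level loading.

```typescript
// Hypothetical sketch only: all names are illustrative, and AsyncFunction
// stands in for the isolate-level loading a real platform would perform.
type Step = { do<T>(name: string, fn: () => Promise<T>): Promise<T> };

// In-memory stand-in for the durable step engine: memoizes step results
// the way the real engine would checkpoint them per workflow instance.
function makeStep(): Step {
  const cache = new Map<string, unknown>();
  return {
    async do<T>(name: string, fn: () => Promise<T>): Promise<T> {
      if (!cache.has(name)) cache.set(name, await fn());
      return cache.get(name) as T;
    },
  };
}

// Platform-side host: accepts workflow source per tenant at runtime and
// executes it as run(event, step), the same contract a statically bound
// workflow class would implement.
class DynamicWorkflowHost {
  async runTenantWorkflow(source: string, event: unknown): Promise<unknown> {
    const AsyncFunction = Object.getPrototypeOf(async () => {}).constructor;
    const run = new AsyncFunction("event", "step", source) as (
      event: unknown,
      step: Step
    ) => Promise<unknown>;
    // Each invocation gets its own step state: isolated per tenant run.
    return run(event, makeStep());
  }
}
```

A tenant could then submit source such as `const a = await step.do("double", async () => event.n * 2); return a + 1;`, and the host runs it with its own isolated step state, independent of every other tenant's workflow.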
The result is a durable execution engine that follows the tenant. Platforms can now:
- Let AI-generated code define multi-step workflows on the fly.
- Enable each CI/CD pipeline to have its own custom logic, including wait steps, retries, and external event handling.
- Allow agents to create and execute their own durable plans without sharing infrastructure.
Dynamic Workflows integrate seamlessly with the other dynamic primitives. A workflow can use a Dynamic Worker for compute, a Durable Object Facet for its database, and Artifacts for versioning—all spun up together per tenant. The platform sits as a supervisor, ensuring isolation and security while the workloads run at Cloudflare's edge.
Use Cases and Implications
The implications are broad. App platforms can now offer durable execution as a built-in feature, not just stateless compute. AI agents can design and persist their own workflows without human intervention. CI/CD systems can handle complex build pipelines that span hours, with each repository defining its own steps and dependencies. Multi-tenant SaaS products can finally give customers full control over their business logic—in TypeScript, running on Cloudflare's global network.
Dynamic Workflows also open the door to new architectures: imagine a workflow that, during a step, dynamically creates sub-workflows for sub-tasks, each with its own isolated state space. The fabric of the application becomes truly elastic.
Conclusion
Cloudflare started with a simple idea: give developers a better way to run code. Eight years later, that idea has evolved into a platform that not only serves developers directly but also empowers them to build platforms for others. With Dynamic Workers, Durable Object Facets, Artifacts, and now Dynamic Workflows, Cloudflare has created a complete stack of dynamic primitives. Durable execution is the last frontier—and it's now open to every tenant, agent, and session. The future of multi-tenant computing is here, and it's fully dynamic.