The Internet witnessed a remarkable surge of interest this week as people rushed to purchase Mac minis to run Moltbot, an open-source, self-hosted AI agent designed as a personal assistant. But what if you could run Moltbot without buying dedicated hardware? Cloudflare has introduced Moltworker, middleware that enables running Moltbot efficiently and securely on Cloudflare's Sandbox SDK and Developer Platform APIs.

A Personal Assistant Without the Hardware

Moltbot runs in the background on user hardware, offers extensive integrations for chat applications, AI models, and popular tools, and can be controlled remotely. It can manage your finances and social media and organize your day, all through your favorite messaging app.

Moltworker represents a proof of concept demonstrating how Cloudflare's Developer Platform can run complex AI agent applications efficiently and securely at scale. This showcases the platform's capability to handle demanding workloads that traditionally required dedicated infrastructure.

The Node.js Compatibility Story

Cloudflare Workers has never been as compatible with Node.js as it is now. Where we previously had to mock APIs to get packages running, those APIs are now supported natively by the Workers Runtime.

This has fundamentally changed how we can build tools on Cloudflare Workers. When we first implemented Playwright for Browser Rendering, we relied on memfs, a hack and an external dependency that forced us to drift from the official Playwright codebase. With improved Node.js compatibility, we now use node:fs natively, which reduces complexity, improves maintainability, and makes upgrades to the latest Playwright versions easy.
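To make this concrete, here is a minimal sketch of a Worker using Node.js built-ins natively. It assumes the nodejs_compat compatibility flag is enabled in your Wrangler configuration.

```ts
// Minimal sketch: Node.js built-ins running natively in a Worker.
// Assumes the "nodejs_compat" compatibility flag is set in the Wrangler config.
import { createHash } from "node:crypto";
import { Buffer } from "node:buffer";

export default {
  async fetch(request: Request): Promise<Response> {
    // Hash the request body with a native Node API instead of a polyfill
    const body = Buffer.from(await request.arrayBuffer());
    const digest = createHash("sha256").update(body).digest("hex");
    return new Response(digest);
  },
};
```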

The list of natively supported Node.js APIs keeps growing. We recently ran an experiment taking the 1,000 most popular NPM packages, installing them, and letting AI attempt to run them in Cloudflare Workers. Excluding packages that are build tools, CLI tools, or browser-only (which don't apply), only 15 packages genuinely didn't work. That's just 1.5%.

Moltbot doesn't strictly require extensive Workers Node.js compatibility because most of its code runs in containers anyway, but this highlights how far we've come in supporting packages that use native APIs. When starting new AI agent applications from scratch, we can now run significant logic in Workers, closer to users.

The Building Blocks

The other crucial part is that the list of products and APIs on our Developer Platform has grown to the point where anyone can build and run any kind of application—even the most complex and demanding ones—on Cloudflare. Once launched, every application immediately benefits from our secure and scalable global network.

These products and services provided the ingredients we needed. First, Sandboxes, where you can run untrusted code securely in isolated environments. Next, Browser Rendering, where you can programmatically control headless browser instances. Finally, R2, where you can store objects persistently. With these building blocks, we could begin adapting Moltbot.
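As a rough sketch, the bindings such an application declares might look like the following TypeScript interface. The names are illustrative, and each binding maps to an entry in the Wrangler configuration.

```ts
// Illustrative Worker bindings for the three building blocks.
// Names are placeholders; each maps to a Wrangler configuration entry.
interface Env {
  Sandbox: DurableObjectNamespace; // Sandbox SDK: container-backed Durable Object
  MYBROWSER: Fetcher;              // Browser Rendering binding
  MOLTBOT_DATA: R2Bucket;          // R2 bucket for persistent storage
}
```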

How We Adapted Moltbot

Moltworker combines an entrypoint Worker, acting as an API router and proxy between our APIs and the isolated environment, with an administration UI, both protected by Cloudflare Access. The Worker connects to the Sandbox container running the standard Moltbot Gateway runtime and integrations, and uses R2 for persistent storage.

AI Gateway for Provider Management

Cloudflare AI Gateway acts as a proxy between AI applications and popular AI providers, giving customers centralized visibility and control over requests.

We recently announced Bring Your Own Key (BYOK) support, where instead of passing provider secrets in plain text with every request, we centrally manage secrets and use them with your gateway configuration.

An even better option is Unified Billing, where you don't manage AI provider secrets at all. You top up your account with credits and use AI Gateway with any supported provider directly; Cloudflare gets charged and deducts credits from your account.

To make Moltbot use AI Gateway, create a new gateway instance, enable the Anthropic provider, add your Claude key or purchase credits for Unified Billing, and set the ANTHROPIC_BASE_URL environment variable so Moltbot uses the AI Gateway endpoint. That's it—no code changes necessary.
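As an illustration of what Moltbot does under the hood when ANTHROPIC_BASE_URL is set, here is a sketch of the Anthropic TypeScript SDK pointed at an AI Gateway endpoint. The account ID, gateway ID, and model name are placeholders.

```ts
// Sketch: an Anthropic client whose base URL points at AI Gateway.
// ACCOUNT_ID, GATEWAY_ID, and the model name are placeholders.
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // provider key; with BYOK, AI Gateway can hold this for you
  baseURL: "https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_ID/anthropic",
});

const message = await client.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello from behind AI Gateway" }],
});
console.log(message.content);
```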

Once Moltbot uses AI Gateway, you have full visibility on costs and access to logs and analytics helping you understand how your AI agent uses AI providers.

Sandboxes for Secure Execution

Last year we anticipated the growing need for AI agents to run untrusted code securely in isolated environments, announcing the Sandbox SDK. This SDK is built on Cloudflare Containers but provides a simple API for executing commands, managing files, running background processes, and exposing services—all from Workers applications.

Instead of dealing with lower-level Container APIs, the Sandbox SDK gives you developer-friendly APIs for secure code execution and handles container lifecycle, networking, file systems, and process management complexity—letting you focus on building application logic with just a few lines of TypeScript.
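As a rough sketch of how that looks in practice (method names follow the @cloudflare/sandbox package; details may vary by SDK version):

```ts
// Sketch: issuing commands into an isolated sandbox from a Worker.
// Method names follow the @cloudflare/sandbox package; check your SDK version.
import { getSandbox } from "@cloudflare/sandbox";
export { Sandbox } from "@cloudflare/sandbox"; // Durable Object class backing each sandbox

interface Env {
  Sandbox: DurableObjectNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Get (or create) a sandbox keyed by an ID of our choosing
    const sandbox = getSandbox(env.Sandbox, "moltbot-main");

    // Write a file and run a command inside the container
    await sandbox.writeFile("/workspace/hello.txt", "hello from the Worker");
    const result = await sandbox.exec("cat /workspace/hello.txt");

    return new Response(result.stdout);
  },
};
```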

This fits perfectly for Moltbot. Instead of running Docker on your local Mac mini, we run Docker on Containers, use the Sandbox SDK to issue commands into the isolated environment, and use callbacks to our entrypoint Worker, effectively establishing two-way communication between systems.

R2 for Persistent Storage

Running on a local computer or VPS gives you persistent storage for free. Containers, however, are inherently ephemeral: data generated within them is lost when they're deleted. The Sandbox SDK provides sandbox.mountBucket() to automatically mount an R2 bucket as a filesystem partition when the container starts.

Once we have a local directory guaranteed to survive the container lifecycle, Moltbot can use it to store session memory files, conversations, and other assets that need to persist.
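A sketch of what this might look like when starting the sandbox; the mountBucket() arguments shown here are an assumption, so consult the Sandbox SDK documentation for the exact signature in your version.

```ts
// Sketch: mounting an R2 bucket into the sandbox filesystem so Moltbot's
// data survives container restarts. The mountBucket() arguments are an
// assumption; check the Sandbox SDK docs for the exact signature.
import { getSandbox } from "@cloudflare/sandbox";

export async function startMoltbot(env: { Sandbox: DurableObjectNamespace }) {
  const sandbox = getSandbox(env.Sandbox, "moltbot-main");

  // Mount the bucket at a well-known path before starting the runtime
  await sandbox.mountBucket("moltbot-data", "/data");

  // Anything written under /data (session memory, conversations, assets)
  // now persists in R2 across the container lifecycle.
  await sandbox.exec("ls /data");
}
```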

Browser Rendering for Automation

AI agents rely heavily on browsing the web. Moltbot utilizes dedicated Chromium instances to perform actions, navigate the web, fill forms, take snapshots, and handle tasks requiring web browsers. We can run Chromium on Sandboxes too, but what if we could simplify and use an API instead?

With Cloudflare's Browser Rendering, you can programmatically control and interact with headless browser instances running at scale in our edge network. We support Puppeteer, Stagehand, Playwright, and other popular packages so developers can onboard with minimal code changes.
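For example, a minimal Worker driving Browser Rendering through @cloudflare/puppeteer might look like this, where MYBROWSER is the Browser Rendering binding declared in the Wrangler configuration:

```ts
// A minimal Worker using Browser Rendering via @cloudflare/puppeteer.
// MYBROWSER is the Browser Rendering binding from the Wrangler config.
import puppeteer from "@cloudflare/puppeteer";

interface Env {
  MYBROWSER: Fetcher;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const browser = await puppeteer.launch(env.MYBROWSER);
    const page = await browser.newPage();
    await page.goto("https://developers.cloudflare.com/", { waitUntil: "networkidle0" });
    const screenshot = await page.screenshot(); // PNG bytes
    await browser.close();
    return new Response(screenshot, { headers: { "Content-Type": "image/png" } });
  },
};
```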

To get Browser Rendering working with Moltbot, we created a thin CDP proxy that runs from the Sandbox container through the entrypoint Worker and back to Browser Rendering using Puppeteer APIs. Then we injected a Browser Rendering skill into the runtime when the Sandbox starts.

From the Moltbot runtime perspective, it has a local CDP port it can connect to and perform browser tasks.

Zero Trust Access for Authentication

Next, we wanted to protect our APIs and Admin UI from unauthorized access. Doing authentication from scratch is hard—it's the kind of wheel you don't want to reinvent or deal with. Zero Trust Access makes it incredibly easy to protect your application by defining specific policies and login methods for endpoints.

Once endpoints are protected, Cloudflare handles authentication for you and automatically includes a JWT with every request to origin endpoints. You can validate that JWT for extra protection, ensuring requests came from Access and not from a malicious third party.
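A sketch of that extra validation step using the jose library; the team domain and application audience tag are placeholders you'd fill in from your Access configuration.

```ts
// Sketch: verifying the Access JWT that Cloudflare attaches to every
// authenticated request. TEAM_DOMAIN and POLICY_AUD are placeholders for
// your Access team domain and application audience tag.
import { createRemoteJWKSet, jwtVerify } from "jose";

const TEAM_DOMAIN = "https://<your-team>.cloudflareaccess.com";
const POLICY_AUD = "<access-application-aud-tag>";
const JWKS = createRemoteJWKSet(new URL(`${TEAM_DOMAIN}/cdn-cgi/access/certs`));

export async function verifyAccessJwt(request: Request): Promise<boolean> {
  // Access sends the JWT in this header on requests it has authenticated
  const token = request.headers.get("Cf-Access-Jwt-Assertion");
  if (!token) return false;
  try {
    await jwtVerify(token, JWKS, { issuer: TEAM_DOMAIN, audience: POLICY_AUD });
    return true;
  } catch {
    return false;
  }
}
```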

Like with AI Gateway, once all APIs are behind Access, you get great observability on who users are and what they're doing with your Moltbot instance.

Moltworker in Action

We set up a Slack instance to play with our own Moltbot on Workers. We asked Moltbot to find the shortest route between Cloudflare in London and Cloudflare in Lisbon using Google Maps and take a screenshot. It went through a sequence of steps using Browser Rendering to navigate Google Maps and did a pretty good job.

We also got creative and asked Moltbot to create a video browsing our developer documentation. It downloaded and ran ffmpeg to generate the video from frames captured in the browser.

Run Your Own Moltworker

We open-sourced our implementation and made it available at github.com/cloudflare/moltworker, so you can deploy and run your own Moltbot on Workers today.

The README guides you through the necessary setup steps. You'll need a Cloudflare account and a Workers Paid plan subscription (starting at $5 USD per month) to use Sandbox Containers, but all other products are either free to use, like AI Gateway, or have generous free tiers you can use to get started and run indefinitely under reasonable limits.

Note that Moltworker is a proof of concept, not a Cloudflare product. Our goal is to showcase exciting features of our Developer Platform that can run AI agents and unsupervised code efficiently and securely while getting great observability and taking advantage of our global network.

Conclusion

We hope this experiment convinced you that Cloudflare is the perfect place to run your AI applications and agents. We've been working relentlessly to anticipate the future and release features like the Agents SDK that you can use to build your first agent in minutes, Sandboxes where you can run arbitrary code in isolated environments without container lifecycle complications, and AI Search, Cloudflare's managed vector-based search service.

Cloudflare now offers a complete toolkit for AI development: inference, storage APIs, databases, durable execution for stateful workflows, and built-in AI capabilities. Together, these building blocks make it possible to build and run even the most demanding AI applications on our global edge network.

Source: Introducing Moltworker: a self-hosted personal AI agent, minus the minis - Cloudflare Blog