Moltworker is a serverless deployment pattern that runs Moltbot AI assistants inside Cloudflare Workers. It uses R2 for memory storage and Cloudflare Zero Trust for security. This approach eliminates the need for virtual private servers and exposed ports. Hosting costs stay low by leveraging Cloudflare’s existing infrastructure.
Moltworker integrates multiple Cloudflare services into a lightweight runtime. It executes assistant logic in Workers, persists state to R2, and authenticates users via Zero Trust. The platform supports adapters for Telegram, Discord, Slack, and other chat platforms. This allows developers to connect their assistants to familiar channels without managing public infrastructure.

Customer Persona
Moltworker targets developers building personal or team AI assistants. It suits startups needing low-cost hosting without server management. Hobbyists can experiment with serverless AI using free Cloudflare tiers. The tool appeals to anyone who wants a secure, scalable assistant deployment.
Market Analysis
Serverless AI hosting is a competitive space with options like Vercel, Azure Functions, and AWS Lambda. Cloudflare’s edge network offers global latency advantages and zero‑trust security out of the box. This edge‑centric model positions Moltworker as a niche solution for developers who prioritize minimal overhead and built‑in security.
Project Link
https://github.com/cloudflare/moltworker
How It Works
Moltworker combines several Cloudflare services into a cohesive stack. Workers execute the assistant logic, R2 provides persistent storage, and Zero Trust handles authentication. Adapters bridge popular chat platforms such as Telegram, Discord, and Slack. The entire system is configured through environment variables and a simple YAML file.
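To make the stack concrete, here is a minimal sketch of how a Worker might persist per-user assistant memory to an R2 bucket. This is an illustrative assumption, not the actual Moltworker API: `handleMessage`, `MemoryBucket`, and the `memory/<user>.json` key layout are hypothetical names, and `MemoryBucket` mirrors only the small subset of the R2 binding interface used here.

```typescript
// Hypothetical sketch of a Moltworker-style memory layer (not the real API).
// MemoryBucket mirrors the subset of Cloudflare's R2 binding we rely on.
interface MemoryBucket {
  get(key: string): Promise<{ text(): Promise<string> } | null>;
  put(key: string, value: string): Promise<void>;
}

// Load a user's conversation history from R2, append the new message,
// write it back, and report how much context the assistant now holds.
export async function handleMessage(
  bucket: MemoryBucket,
  userId: string,
  message: string,
): Promise<string> {
  const key = `memory/${userId}.json`; // one JSON object per user
  const stored = await bucket.get(key);
  const history: string[] = stored ? JSON.parse(await stored.text()) : [];
  history.push(message);
  await bucket.put(key, JSON.stringify(history));
  return `Assistant has ${history.length} message(s) of context.`;
}
```

In a real Worker, `bucket` would be the R2 binding injected via the environment, and the reply would feed into the chat-platform adapter rather than being returned directly.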

Deployment starts by cloning the repository and configuring a Cloudflare account. Create an R2 bucket, set up a Worker, and define Access policies. The README provides step‑by‑step instructions for linking your preferred chat adapter. Testing with a small R2 bucket and conservative invocation patterns is recommended to avoid unexpected costs.
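The deployment steps above can be sketched as a minimal `wrangler.toml`. The names used here ("moltworker", the `MEMORY` binding, the `moltworker-memory` bucket) are illustrative placeholders, not values from the Moltworker repository; consult the README for the actual configuration.

```toml
# Hypothetical wrangler.toml sketch for a Moltworker-style deployment.
name = "moltworker"
main = "src/index.ts"
compatibility_date = "2024-01-01"

# Bind an R2 bucket so the Worker can persist assistant memory.
[[r2_buckets]]
binding = "MEMORY"
bucket_name = "moltworker-memory"

# Non-secret settings; secrets such as bot tokens should instead be
# stored with `wrangler secret put`.
[vars]
ASSISTANT_NAME = "moltbot"
```

Access policies for Zero Trust are defined separately in the Cloudflare dashboard (or via the API), not in this file.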
The Verdict
Moltworker delivers a serverless AI assistant platform with minimal operational overhead. Cloudflare’s global edge and zero‑trust model provide strong security and low latency. However, feature availability varies by account and region: some capabilities may require paid tiers, and cold starts or API rate limits can affect performance, so validate costs and service limits before production adoption. Running assistants on a third‑party provider also has data‑flow and privacy implications; ensure compliance with Cloudflare’s terms and local regulations before deploying sensitive workloads.
