ctxovrflw — Shared memory for every AI agent

ctxovrflw is a universal AI context and memory layer — a local-first daemon that shares context across 19+ AI agents (Cursor, Claude Code, Cline, Windsurf, Claude Desktop, Copilot, and more) via MCP (Model Context Protocol). What you tell one AI, every AI knows.

How to share context between AI tools

Install ctxovrflw with one command: curl -fsSL ctxovrflw.dev/install.sh | sh. Run ctxovrflw init to auto-detect and configure your AI tools. Memories stored by any connected agent are instantly available to all others.
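Running ctxovrflw init configures each detected agent to talk to ctxovrflw over MCP. As a rough sketch, an MCP server entry in an agent's config file might look like the following; the command name and the "mcp" argument are illustrative assumptions, not confirmed flags, and init writes the real entry for you:

```shell
# Hypothetical MCP server entry for ctxovrflw in an agent's config file.
# `ctxovrflw init` normally generates this automatically; the "mcp"
# argument shown here is illustrative, not a confirmed subcommand.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "ctxovrflw": {
      "command": "ctxovrflw",
      "args": ["mcp"]
    }
  }
}
EOF
cat mcp-config.json
```

The mcpServers map is the shape several MCP-aware agents (for example, Claude Desktop) read, but each agent keeps its config in a different location, which is why the auto-detection in ctxovrflw init matters.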

Features

  • Semantic search — finds memories by meaning, not just keywords, using local ONNX embeddings
  • MCP native — speaks the Model Context Protocol, connects to any compatible agent
  • E2E encrypted sync — AES-256-GCM encryption, zero-knowledge cloud relay
  • Cross-device — sync memory across laptop, desktop, and server
  • Single binary — written in Rust, installs with one curl command
  • Privacy first — runs locally by default, cloud is optional and ephemeral
  • Context synthesis — Pro feature that assembles stored memories into focused context briefings

Supported AI Agents

Works with Claude Code, Cursor, Cline, Windsurf, Claude Desktop, Copilot CLI, Gemini CLI, OpenClaw, Roo Code, Continue, Codex CLI, Goose, Amp, Kiro, Trae, OpenCode, Factory, Antigravity, and Kilo Code.

Pricing

  • Free — 100 memories, local-only, no account required ($0)
  • Standard — unlimited memories, cloud sync, up to 3 devices ($10/mo)
  • Pro — unlimited everything, plus context synthesis ($20/mo)

Integrations

  • ctxovrflw + Cursor — Shared AI Memory for Cursor
  • ctxovrflw + Claude Code — Persistent Memory for Claude Code
  • ctxovrflw + Cline — Shared Memory for Cline VS Code Extension
  • ctxovrflw + Windsurf — AI Memory Integration for Windsurf
  • ctxovrflw + Claude Desktop — Bridge Chat and Code Context
  • ctxovrflw + GitHub Copilot CLI — Shared Memory for Copilot

Documentation

Full documentation available at docs.ctxovrflw.dev.