C3 is a local intelligence layer that gives AI coding tools surgical project understanding instead of brute-force file reads.
Python 3.10+ is the only requirement. No cloud, no API keys. Runs 100% locally.
Grab the repo and install its dependencies:

```shell
git clone <repository-url>
pip install -r requirements.txt
```
C3 indexes your code, detects your IDE, and registers the MCP server:

```shell
c3 init /path/to/project
```
Open your AI tool. C3's tools appear automatically via the MCP protocol.

`claude` / `codex` / `gemini` / `cursor`
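Under the hood, registering an MCP server comes down to an entry in the client's config file. A sketch of what such an entry might look like: the `mcpServers` key follows the common MCP client config shape, but the `c3 serve` command and its arguments here are illustrative assumptions, not C3's documented interface.

```json
{
  "mcpServers": {
    "c3": {
      "command": "c3",
      "args": ["serve", "--project", "/path/to/project"]
    }
  }
}
```

`c3 init` writes the equivalent of this for whichever client it detects, so you never edit the file by hand.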
14 tasks across 7 categories. Same model, same codebase. Only C3 changes the outcome.
TF-IDF + optional semantic search via Ollama. Finds files by meaning, not just text. Sub-second results.
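To illustrate the ranking idea (this is a toy sketch, not C3's actual index), a minimal TF-IDF scorer over a tiny in-memory "corpus" of files:

```python
import math
from collections import Counter

# Toy corpus: filename -> file contents (names are made up for illustration).
docs = {
    "auth.py": "def login(user, password): verify credentials and issue token",
    "db.py": "def connect(url): open database connection pool",
    "search.py": "def rank(query, corpus): tfidf score ranking",
}

def tfidf_rank(query, corpus):
    """Return filenames sorted by TF-IDF relevance to the query."""
    tokenized = {name: text.lower().split() for name, text in corpus.items()}
    n = len(tokenized)
    # Document frequency: in how many files does each term appear?
    df = Counter()
    for terms in tokenized.values():
        df.update(set(terms))
    scores = {}
    for name, terms in tokenized.items():
        tf = Counter(terms)
        score = 0.0
        for term in query.lower().split():
            # idf dampens terms that appear in almost every file.
            idf = math.log((1 + n) / (1 + df[term])) + 1
            score += (tf[term] / len(terms)) * idf
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)

print(tfidf_rank("database connection", docs))  # db.py ranks first
```

The semantic-search layer swaps this lexical score for embedding similarity, but the retrieve-and-rank shape stays the same.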
Structural maps that expose every class, function, and signature. A 340-line file becomes a 28-token overview.
Symbol-level extraction. Pull one function from a 2,000-line file. Read 22 lines instead of 2,000.
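The two features above can be sketched with Python's stdlib `ast` module. This is an illustration of the idea under assumed behavior, not C3's implementation:

```python
import ast
import textwrap

SOURCE = textwrap.dedent("""\
    class Cache:
        def get(self, key):
            return self._data.get(key)

    def rank(query, docs):
        '''Rank docs against a query.'''
        return sorted(docs)
    """)

tree = ast.parse(SOURCE)

def outline(tree):
    # Structural map: one line per class or function, signatures only.
    lines = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}")
        elif isinstance(node, ast.FunctionDef):
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args})")
    return lines

def extract(tree, source, name):
    # Symbol-level extraction: return one function's source, nothing else.
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == name:
            return ast.get_source_segment(source, node)

print(outline(tree))                    # the whole file as a few signatures
print(extract(tree, SOURCE, "rank"))    # just the one symbol you asked for
```

The outline is what the AI reads instead of the full file; the extractor is what it calls when it actually needs one symbol's body.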
Native syntax checkers for 15+ languages. Background validation with result caching.
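A minimal sketch of that pattern, using Python's own `py_compile` as the native checker and a content-hash cache; the function name and cache shape are made up for illustration:

```python
import hashlib
import py_compile
import tempfile

_cache = {}  # sha256 of file contents -> (ok, message)

def check_python(path):
    """Validate syntax with the native compiler, caching by content hash."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest in _cache:  # unchanged contents: reuse the cached verdict
        return _cache[digest]
    try:
        py_compile.compile(path, doraise=True)
        result = (True, "ok")
    except py_compile.PyCompileError as exc:
        result = (False, str(exc))
    _cache[digest] = result
    return result

# Demo: one valid file, one with a syntax error.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("def fine():\n    return 1\n")
    good = f.name
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("def broken(:\n")
    bad = f.name

print(check_python(good))    # (True, 'ok')
print(check_python(bad)[0])  # False
```

Per-language support means swapping in the matching native tool (e.g. the language's own compiler or linter) behind the same cache.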
Persistent fact store with auto-learning. Your AI remembers decisions and conventions across sessions.
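The idea can be sketched as a tiny JSON-backed store; the class and method names here are illustrative, not C3's API:

```python
import json
import os
import tempfile
from pathlib import Path

class FactStore:
    """Minimal persistent fact store: JSON on disk, survives sessions."""

    def __init__(self, path):
        self.path = Path(path)
        self.facts = (
            json.loads(self.path.read_text()) if self.path.exists() else {}
        )

    def learn(self, key, value):
        self.facts[key] = value
        self.path.write_text(json.dumps(self.facts, indent=2))

    def recall(self, key):
        return self.facts.get(key)

store_path = os.path.join(tempfile.mkdtemp(), "facts.json")
FactStore(store_path).learn("test-runner", "pytest, run via make test")
# A later "session" reopens the same file and still knows the fact.
print(FactStore(store_path).recall("test-runner"))
```

Auto-learning then amounts to the MCP layer calling `learn` whenever the AI states a decision or convention worth keeping.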
Pre-processes logs, terminal output, and data files. Extracts signal from noise before it hits context.
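A toy version of that signal-extraction step, keeping only high-signal log lines and collapsing duplicates; the pattern list and function name are illustrative:

```python
import re

LOG = """\
INFO  boot ok
DEBUG cache warm
ERROR db timeout after 30s
INFO  retrying
ERROR db timeout after 30s
Traceback (most recent call last):
  File "app.py", line 12, in <module>
"""

def distill(text, max_lines=20):
    """Keep only high-signal lines; drop noise and duplicates."""
    signal = re.compile(r'ERROR|WARN|Traceback|Exception|File "')
    seen, out = set(), []
    for line in text.splitlines():
        if signal.search(line) and line not in seen:
            seen.add(line)
            out.append(line)
    return out[:max_lines]

print(distill(LOG))  # errors and traceback frames, INFO/DEBUG noise gone
```

The point is that the model's context receives a few distilled lines rather than the raw stream.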
One install covers every AI coding tool that supports Model Context Protocol.
| Tool | Provider | MCP Tools | Hooks | Instruction Sync | Notes |
|---|---|---|---|---|---|
| Claude Code | Anthropic | ✓ | ✓ | ✓ | Full support: hooks, CLAUDE.md sync, budget management |
| VS Code | Microsoft | ✓ | — | — | MCP tools via the Copilot extension |
| Codex CLI | OpenAI | ✓ | — | ✓ | MCP tools + AGENTS.md + session config |
| Gemini CLI | Google | ✓ | — | ✓ | MCP tools + GEMINI.md + session config |
Free and open source. If C3 saves you tokens, extends your sessions, and makes AI coding better, consider sponsoring its development.