Open Source MCP Server

Your AI reads too much.
Fix that.

C3 is a local intelligence layer that gives AI coding tools surgical project understanding instead of brute-force file reads.

your AI — with C3
AI> c3_search(query="auth middleware")
    3 files · 0.04s · 12 tokens
AI> c3_compress("services/auth.py", mode="map")
    340 lines → 28-token structural map
AI> c3_read("services/auth.py", symbols=["verify_token"])
    22 lines extracted · exactly what was needed

# Without C3: 3 full files, ~4,200 tokens
# With C3: 3 targeted calls, ~180 tokens
90%
Token Savings
9.9x
Context Multiplier
57%
E2E Win Rate
+28%
Quality Uplift
4.2x
Longer Sessions
View full benchmark report →
Setup

Three commands. Zero config.

Python 3.10+ is the only requirement. No cloud, no API keys. Runs 100% locally.

1. Clone & Install

Grab the repo and install dependencies.

git clone https://github.com/drknowhow/C3.git && pip install -r requirements.txt
2. Initialize

C3 indexes your code, detects your IDE, and registers the MCP server.

c3 init /path/to/project
3. Code

Open your AI tool. C3's tools appear automatically via the MCP protocol.

claude / codex / gemini / cursor
Benchmarks

Measured, not marketed

14 tasks across 7 categories. Same model, same codebase. Only C3 changes the outcome.

Code Review

+50pp
100% win rate · C3 0.71 vs Base 0.20

Bug Detection

+36pp
Structural maps reveal hidden patterns

Refactoring

+31pp
100% win rate · baseline timed out

Call Chains

0.90
Near-perfect cross-file tracing

Session Net Savings

76%
After C3 overhead (mandates, schemas, recalls)

Avg Score

0.74
C3 0.736 vs Baseline 0.575
Model: Claude Sonnet 4.6  ·  Tasks: 14  ·  C3 Wins: 8/14  ·  C3 21% faster
Full results on GitHub →
Toolkit

Six tools your AI didn't know it needed

c3_search

TF-IDF + optional semantic search via Ollama. Finds files by meaning, not just text. Sub-second results.
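C3's real index isn't shown here, but the TF-IDF half of the idea can be sketched in a few lines. The corpus, tokenizer, and scoring below are illustrative assumptions, not C3's actual implementation:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase word tokens; C3's real tokenizer is surely richer."""
    return re.findall(r"[a-z_]+", text.lower())

def tfidf_rank(query, docs):
    """Rank {path: text} docs against a query with basic TF-IDF scoring."""
    n = len(docs)
    tokenized = {path: tokenize(text) for path, text in docs.items()}
    df = Counter()  # document frequency per term
    for toks in tokenized.values():
        df.update(set(toks))
    scores = {}
    for path, toks in tokenized.items():
        tf = Counter(toks)
        score = 0.0
        for term in tokenize(query):
            if term in tf:
                idf = math.log((1 + n) / (1 + df[term])) + 1
                score += (tf[term] / len(toks)) * idf
        scores[path] = score
    return sorted(scores, key=scores.get, reverse=True)

docs = {
    "services/auth.py": "def verify_token(token): auth middleware checks",
    "app/routes.py": "def home(): return render('index')",
}
print(tfidf_rank("auth middleware", docs)[0])  # services/auth.py ranks first
```

Ranking by meaning (the "semantic search via Ollama" part) would replace this scoring with embedding similarity; the interface stays the same.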

c3_compress

Structural maps that expose every class, function, and signature. 340 lines become a 28-token overview.
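For Python source, a structural map is essentially an AST outline. The sketch below shows the idea; C3's actual output format and multi-language support are not modeled here:

```python
import ast

def structural_map(source):
    """Produce a compact class/function outline of a Python source string.

    Illustrative only -- C3's real map format is an assumption here.
    """
    lines = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}:")
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args})")
    return "\n".join(lines)

src = """
class AuthService:
    def verify_token(self, token):
        return token == "ok"
"""
print(structural_map(src))
```

The map keeps names and signatures while dropping bodies, which is where almost all of the token savings come from.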

c3_read

Symbol-level extraction. Pull one function from a 2,000-line file. Read 22 lines instead of 2,000.
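Symbol-level extraction for Python can lean on the AST's line-span metadata. A minimal sketch, assuming top-level symbols only (the real tool's cross-file, multi-language behavior is not shown):

```python
import ast

def read_symbol(source, name):
    """Return only the source lines of one top-level function or class."""
    tree = ast.parse(source)
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef,
                             ast.ClassDef)) and node.name == name:
            lines = source.splitlines()
            # lineno/end_lineno are 1-based and inclusive.
            return "\n".join(lines[node.lineno - 1:node.end_lineno])
    raise KeyError(name)

src = (
    "def a():\n"
    "    return 1\n"
    "\n"
    "def verify_token(token):\n"
    "    return token == 'ok'\n"
)
print(read_symbol(src, "verify_token"))
```

Everything outside the requested symbol never enters the model's context, which is the whole point.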

c3_validate

Native syntax checkers for 15+ languages. Background validation with result caching.
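The validate-with-caching pattern is simple to sketch for the Python case: hash the content, check syntax once, reuse the result. This is an assumption about the shape of the technique, not C3's code:

```python
import ast
import hashlib

_cache = {}

def validate_python(source):
    """Syntax-check Python source, caching results by content hash.

    C3's native checkers for 15+ languages are not modeled here;
    this covers only the Python case via ast.parse.
    """
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in _cache:
        try:
            ast.parse(source)
            _cache[key] = (True, None)
        except SyntaxError as e:
            _cache[key] = (False, f"line {e.lineno}: {e.msg}")
    return _cache[key]

print(validate_python("def ok(): return 1"))  # (True, None)
print(validate_python("def broken(:")[0])     # False
```

Content-addressed caching means re-validating an unchanged file is a dictionary lookup, which is what makes background validation cheap.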

c3_memory

Persistent fact store with auto-learning. Your AI remembers decisions and conventions across sessions.
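A persistent fact store reduces to key/value pairs that outlive the process. The path, schema, and auto-learning behavior of the real tool are assumptions; this just persists facts as JSON:

```python
import json
import os
import tempfile

class FactStore:
    """Minimal persistent fact store -- a sketch of the c3_memory idea."""

    def __init__(self, path):
        self.path = path
        self.facts = {}
        if os.path.exists(path):
            with open(path) as f:
                self.facts = json.load(f)

    def remember(self, key, value):
        self.facts[key] = value
        with open(self.path, "w") as f:
            json.dump(self.facts, f)

    def recall(self, key):
        return self.facts.get(key)

path = os.path.join(tempfile.gettempdir(), "c3_facts_demo.json")
FactStore(path).remember("style", "use snake_case")
print(FactStore(path).recall("style"))  # use snake_case
```

Because facts live on disk rather than in the conversation, a decision recorded in one session is recallable in the next at a cost of a few tokens.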

c3_filter

Pre-processes logs, terminal output, and data files. Extracts signal from noise before it hits context.
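The core move is to drop low-signal lines before they reach the model. A toy stand-in for the idea (the real tool's heuristics and supported inputs are assumptions here):

```python
import re

def filter_log(text, patterns=("ERROR", "WARN", "Traceback")):
    """Keep only lines matching signal patterns, prefixed with line numbers."""
    keep = re.compile("|".join(patterns))
    out = []
    for i, line in enumerate(text.splitlines(), 1):
        if keep.search(line):
            out.append(f"{i}: {line}")
    return "\n".join(out)

log = "INFO boot ok\nERROR db timeout\nINFO retry\nWARN slow query\n"
print(filter_log(log))
```

Preserving the original line numbers lets the AI ask for surrounding context only where something actually went wrong.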

Compatibility

Works everywhere MCP does

One install covers every AI coding tool that supports Model Context Protocol.

Claude Code (Anthropic): Full support — hooks, CLAUDE.md sync, budget management
VS Code (Microsoft): MCP tools via Copilot extension
Codex CLI (OpenAI): MCP tools + AGENTS.md + session config
Gemini CLI (Google): MCP tools + GEMINI.md + session config

Support C3

Free and open source. If C3 saves you tokens, extends your sessions, and makes AI coding better — consider sponsoring its development.

$ git clone https://github.com/drknowhow/C3.git
$ pip install -r requirements.txt
$ c3 init /path/to/your/project
  Indexed 847 files · MCP registered · ready