
Getting started

You will need:
  • Node.js ≥ 20 (see the quick check below)
  • A Brave Search API key (free tier is enough for development)
  • An LLM endpoint — one of:
    • An OpenRouter API key (default, recommended for first run)
    • A local OpenAI-compatible server (LM Studio, Ollama, llama-server, vLLM)
  • Optional: a Google Safe Browsing API key for the orthogonal safety_flag axis
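You can confirm the Node.js requirement before installing; the API keys are only checked at runtime:

node --version   # should print v20.x or newer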
Clone and install:

git clone https://github.com/enmito/sift-mcp.git
cd sift-mcp
npm install
cp .env.example .env

Edit .env:

BRAVE_API_KEY=...
LLM_API_KEY=... # OpenRouter key by default
LLM_ENDPOINT=https://openrouter.ai/api/v1
LLM_MODEL=meta-llama/llama-3.3-70b-instruct
GOOGLE_SAFE_BROWSING_KEY=... # optional

Build:

npm run build
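
As a quick sanity check that the build produced the server entrypoint, you can launch it directly; dist/index.js is the same path the MCP config below points at. Started by hand it just waits for an MCP client on stdin, so stop it with Ctrl+C once it comes up without errors:

node dist/index.js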

Add to your MCP client’s config file. The path depends on the client:

  • Claude Desktop — ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) / %APPDATA%\Claude\claude_desktop_config.json (Windows)
  • Claude Code — .mcp.json in the project root
  • Cursor / Windsurf / Zed — check your client’s MCP settings panel
{
  "mcpServers": {
    "sift": {
      "command": "node",
      "args": ["/absolute/path/to/sift-mcp/dist/index.js"],
      "env": {
        "BRAVE_API_KEY": "your-brave-api-key",
        "LLM_API_KEY": "your-openrouter-api-key",
        "LLM_JUDGE_ENABLED": "true",
        "LLM_ENDPOINT": "https://openrouter.ai/api/v1",
        "LLM_MODEL": "meta-llama/llama-3.3-70b-instruct"
      }
    }
  }
}

Restart the client. The search_vectorized tool becomes available to the model.

If you prefer to keep everything local, replace the LLM env vars:

"LLM_ENDPOINT": "http://localhost:1234/v1",
"LLM_MODEL": "openai/gpt-oss-20b"

LLM_API_KEY can be omitted for unauthenticated local endpoints. See Choosing an LLM for tradeoffs.
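
Putting the local option together, the env block of a fully local config might look like this (the model identifier is only an example; use whatever name your local server reports):

"env": {
  "BRAVE_API_KEY": "your-brave-api-key",
  "LLM_JUDGE_ENABLED": "true",
  "LLM_ENDPOINT": "http://localhost:1234/v1",
  "LLM_MODEL": "openai/gpt-oss-20b"
}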

From a chat with the MCP-enabled model:

Use search_vectorized to find “best noise cancelling headphones 2026” and tell me which sources look biased.

The model invokes the tool, receives the augmented SERP, and should caveat vendor/affiliate sources using the tier and summary_hints[] fields.
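
The exact response schema isn't shown here, but as a rough sketch each result can be read as ordinary SERP data plus the vector fields the model reasons over; everything below except tier and summary_hints is illustrative:

{
  "title": "Top 10 noise cancelling headphones (2026)",
  "url": "https://example-vendor.com/roundup",
  "tier": "vendor",
  "summary_hints": ["affiliate links detected", "first-party marketing copy"]
}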