v1.5.3

ncp

io.github.portel-dev/ncp

N-to-1 MCP Orchestration. Unified gateway for multiple MCP servers with intelligent tool discovery.



NCP - Natural Context Provider

🎯 One MCP to Rule Them All

NCP Transformation Flow

NCP transforms N scattered MCP servers into 1 intelligent orchestrator. Your AI sees just 2 simple tools instead of 50+ complex ones, while NCP handles all the routing, discovery, and execution behind the scenes.

🚀 NEW: Project-level configuration - each project can define its own MCPs automatically

Result: Same tools, same capabilities, but your AI becomes focused, efficient, and cost-effective again.

What's MCP? The Model Context Protocol by Anthropic lets AI assistants connect to external tools and data sources. Think of MCPs as "plugins" that give your AI superpowers like file access, web search, databases, and more.
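For the curious, the protocol itself is JSON-RPC 2.0: the assistant asks a server what tools it has and then invokes them by name. A tool invocation looks roughly like this (the tool name and arguments below are illustrative, not from any specific server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "read_file",
    "arguments": { "path": "/tmp/notes.txt" }
  }
}
```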


😤 The MCP Paradox: More Tools = Less Productivity

You added MCPs to make your AI more powerful. Instead:

  • AI picks wrong tools ("Should I use read_file or get_file_content?")
  • Sessions end early ("I've reached my context limit analyzing tools")
  • Costs explode (50+ schemas burn tokens before work even starts)
  • AI becomes indecisive (used to act fast, now asks clarifying questions)

🧸 Why Too Many Toys Break the Fun

Think about it:

A child with one toy → Treasures it, masters it, creates endless games with it

A child with 50 toys → Can't hold them all, loses pieces, gets overwhelmed, stops playing entirely

Your AI is that child. MCPs are the toys. More isn't always better.

Or picture this: You're craving pizza. Someone hands you a pizza → Pure joy! 🍕

But take you to a buffet with 200 dishes → Analysis paralysis. You spend 20 minutes deciding, lose your appetite, leave unsatisfied.

Same with your AI: Give it one perfect tool → Instant action. Give it 50 tools → Cognitive overload.

The most creative people thrive with constraints, not infinite options. Your AI is no different.

The same pattern holds elsewhere:

  • A poet with "write about anything" → Writer's block

  • A poet with "write a haiku about rain" → Instant inspiration

  • A developer with access to "all programming languages" → Analysis paralysis

  • A developer with "Python for this task" → Focused solution

Your AI needs the same focus. NCP gives it constraints that spark creativity, not chaos that kills it.


📊 The Before & After Reality

Before NCP: Tool Schema Explosion 😵‍💫

When your AI connects to multiple MCPs directly:

🤖 AI Assistant Context:
├── Filesystem MCP (12 tools) ─ 15,000 tokens
├── Database MCP (8 tools) ─── 12,000 tokens
├── Web Search MCP (6 tools) ── 8,000 tokens
├── Email MCP (15 tools) ───── 18,000 tokens
├── Shell MCP (10 tools) ───── 14,000 tokens
├── GitHub MCP (20 tools) ──── 25,000 tokens
└── Slack MCP (9 tools) ────── 11,000 tokens

💀 Total: 80 tools = 103,000 tokens of schemas

What happens:

  • AI burns 50%+ of context just understanding what tools exist
  • Spends 5-8 seconds analyzing which tool to use
  • Often picks wrong tool due to schema confusion
  • Hits context limits mid-conversation

After NCP: Unified Intelligence ✨

With NCP's orchestration:

🤖 AI Assistant Context:
└── NCP (2 unified tools) ──── 2,500 tokens

🎯 Behind the scenes: NCP manages all 80 tools
📈 Context saved: 100,500 tokens (97% reduction!)
⚡ Decision time: Sub-second tool selection
🎪 AI behavior: Confident, focused, decisive

Real results from our testing:

| Your MCP Setup | Without NCP | With NCP | Token Savings |
|---|---|---|---|
| Small (5 MCPs, 25 tools) | 15,000 tokens | 8,000 tokens | 47% saved |
| Medium (15 MCPs, 75 tools) | 45,000 tokens | 12,000 tokens | 73% saved |
| Large (30 MCPs, 150 tools) | 90,000 tokens | 15,000 tokens | 83% saved |
| Enterprise (50+ MCPs, 250+ tools) | 150,000 tokens | 20,000 tokens | 87% saved |

Translation:

  • 5x faster responses (8 seconds → 1.5 seconds)
  • 12x longer conversations before hitting limits
  • 90% reduction in wrong tool selection
  • Zero context exhaustion in typical sessions

📋 Prerequisites

  • Node.js 18+ (Download here)
  • npm (included with Node.js) or npx for running packages
  • Command line access (Terminal on Mac/Linux, Command Prompt/PowerShell on Windows)

🚀 Installation

Choose your preferred installation method:

| Method | Best For |
|---|---|
| 📦 .mcpb Bundle | Claude Desktop users |
| 📥 npm Package | All MCP clients, CLI users |

⚡ Option 1: One-Click Installation (.mcpb) - Claude Desktop Only

For Claude Desktop users - Download and double-click to install:

  1. Download NCP Desktop Extension: ncp.dxt
  2. Double-click the downloaded ncp.dxt file
  3. Claude Desktop will prompt you to install - click "Install"
  4. Auto-sync with Claude Desktop - NCP continuously syncs MCPs:
    • Detects MCPs from claude_desktop_config.json
    • Detects .mcpb-installed extensions
    • Runs on every startup to find new MCPs
    • Uses internal add command for cache coherence

🔄 Continuous sync: NCP automatically detects and imports new MCPs every time you start it! Add an MCP to Claude Desktop → NCP auto-syncs it on next startup. Zero manual configuration needed.

If you want to add more MCPs later, configure manually by editing ~/.ncp/profiles/all.json:

# Edit the profile configuration
nano ~/.ncp/profiles/all.json

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_xxx"
      }
    }
  }
}

  5. Restart Claude Desktop and NCP will load your configured MCPs

โ„น๏ธ About .dxt (Desktop Extension) installation:

  • Slim & Fast: Desktop extension is MCP-only (126KB, no CLI code)
  • Manual config: Edit JSON files directly (no ncp add command)
  • Power users: Fastest startup, direct control over configuration
  • Optional CLI: Run npm install -g @portel/ncp separately if you want CLI tools

Why .dxt is slim: The desktop extension excludes all CLI code, making it 13% smaller and faster to load than the full npm package. Perfect for production use where you manage configs manually or via automation.


🔧 Option 2: npm Installation - All MCP Clients (Cursor, Cline, Continue, etc.)

Step 1: Import Your Existing MCPs ⚡

Already have MCPs? Don't start over - import everything instantly:

# Install NCP globally (recommended)
npm install -g @portel/ncp

# Copy your claude_desktop_config.json content to clipboard:
# 1. Open your claude_desktop_config.json file (locations listed under "Configuration for Different AI Clients" below)
# 2. Select all content (Ctrl+A / Cmd+A) and copy (Ctrl+C / Cmd+C)
# 3. Then run:
ncp config import

# ✨ Magic! NCP auto-detects and imports ALL your MCPs from clipboard

Note: All commands below assume global installation (npm install -g). For npx usage, see the Alternative Installation section.

NCP Import Feature

Step 2: Connect NCP to Your AI 🔗

Replace your entire MCP configuration with this single entry:

{
  "mcpServers": {
    "ncp": {
      "command": "ncp"
    }
  }
}

Step 3: Watch the Magic ✨

Your AI now sees just 2 simple tools instead of 50+ complex ones:

NCP List Overview

🎉 Done! Same tools, same capabilities, but your AI is now focused and efficient.
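For a sense of scale, the unified interface your AI now sees is on the order of two small tool schemas instead of dozens. The names and descriptions below are a sketch of that idea, not NCP's exact schema:

```json
{
  "tools": [
    {
      "name": "find",
      "description": "Describe what you need and discover matching tools across all managed MCPs"
    },
    {
      "name": "run",
      "description": "Execute a discovered tool (e.g. filesystem:read_file) with parameters"
    }
  ]
}
```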


🧪 Test Drive: See the Difference Yourself

Want to experience what your AI experiences? NCP has a human-friendly CLI:

๐Ÿ” Smart Discovery

# Ask like your AI would ask:
ncp find "I need to read a file"
ncp find "help me send an email"
ncp find "search for something online"

NCP Find Command

Notice: NCP understands intent, not just keywords - exactly what your AI needs.

📋 Ecosystem Overview

# See your complete MCP ecosystem:
ncp list --depth 2

# Get help anytime:
ncp --help

NCP Help Command

⚡ Direct Testing

# Test any tool safely:
ncp run filesystem:read_file --params '{"path": "/tmp/test.txt"}'

Why this matters: You can debug and test tools directly, just like your AI would use them.

✅ Verify Everything Works

# 1. Check NCP is installed correctly
ncp --version

# 2. Confirm your MCPs are imported
ncp list

# 3. Test tool discovery
ncp find "file"

# 4. Test a simple tool (if you have filesystem MCP)
ncp run filesystem:read_file --params '{"path": "/tmp/test.txt"}' --dry-run

✅ Success indicators:

  • NCP shows version number
  • ncp list shows your imported MCPs
  • ncp find returns relevant tools
  • Your AI client shows only NCP in its tool list

🔄 Alternative Installation with npx

Prefer not to install globally? Use npx for any client configuration:

# All the above commands work with npx - just replace 'ncp' with 'npx @portel/ncp':

# Import MCPs
npx @portel/ncp config import

# Add MCPs
npx @portel/ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents

# Find tools
npx @portel/ncp find "file operations"

# Configure client (example: Claude Desktop)
{
  "mcpServers": {
    "ncp": {
      "command": "npx",
      "args": ["@portel/ncp"]
    }
  }
}

When to use npx: Perfect for trying NCP, CI/CD environments, or when you can't install packages globally.


💡 Why NCP Transforms Your AI Experience

🧠 Restores AI Focus

  • Before: "I see 50 tools... which should I use... let me think..."
  • After: "I need file access. Done." (sub-second decision)

💰 Massive Token Savings

  • Before: 100k+ tokens just for tool schemas
  • After: 2.5k tokens for unified interface
  • Result: ~40x fewer tokens spent on schemas before the conversation even starts

🎯 Eliminates Tool Confusion

  • Before: AI picks read_file when you meant search_files
  • After: NCP's semantic engine finds the RIGHT tool for the task

🚀 Faster, Smarter Responses

  • Before: 8-second delay analyzing tool options
  • After: Instant tool selection, immediate action

Bottom line: Your AI goes from overwhelmed to laser-focused.


๐Ÿ› ๏ธ For Power Users: Manual Setup

Prefer to build from scratch? Add MCPs manually:

# Add the most popular MCPs:

# AI reasoning and memory
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking
ncp add memory npx @modelcontextprotocol/server-memory

# File and development tools
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents  # Path: directory to access
ncp add github npx @modelcontextprotocol/server-github                       # No path needed

# Search and productivity
ncp add brave-search npx @modelcontextprotocol/server-brave-search           # No path needed

NCP Add Command

💡 Pro tip: Browse Smithery.ai (2,200+ MCPs) or mcp.so to discover tools for your specific needs.


🎯 Popular MCPs That Work Great with NCP

🔥 Most Downloaded

# Community favorites (download counts from Smithery.ai):
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking  # 5,550+ downloads
ncp add memory npx @modelcontextprotocol/server-memory                            # 4,200+ downloads
ncp add brave-search npx @modelcontextprotocol/server-brave-search                # 680+ downloads

๐Ÿ› ๏ธ Development Essentials

# Popular dev tools:
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/code
ncp add github npx @modelcontextprotocol/server-github
ncp add shell npx @modelcontextprotocol/server-shell

๐ŸŒ Productivity & Integrations

# Enterprise favorites:
ncp add gmail npx @mcptools/gmail-mcp
ncp add slack npx @modelcontextprotocol/server-slack
ncp add google-drive npx @modelcontextprotocol/server-gdrive
ncp add postgres npx @modelcontextprotocol/server-postgres
ncp add puppeteer npx @hisma/server-puppeteer

โš™๏ธ Configuration for Different AI Clients

Claude Desktop (Most Popular)

Configuration File Location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Replace your entire claude_desktop_config.json with:

{
  "mcpServers": {
    "ncp": {
      "command": "ncp"
    }
  }
}

📌 Important: Restart Claude Desktop after saving the config file.

Note: Configuration file locations are current as of this writing. For the most up-to-date setup instructions, please refer to the official Claude Desktop documentation.

Claude Code

NCP works automatically! Just run:

ncp add <your-mcps>

VS Code with GitHub Copilot

Settings File Location:

  • macOS: ~/Library/Application Support/Code/User/settings.json
  • Windows: %APPDATA%\Code\User\settings.json
  • Linux: ~/.config/Code/User/settings.json

Add to your VS Code settings.json:

{
  "mcp.servers": {
    "ncp": {
      "command": "ncp"
    }
  }
}

📌 Important: Restart VS Code after saving the settings file.

Disclaimer: Configuration paths and methods are accurate as of this writing. VS Code and its extensions may change these locations or integration methods. Please consult the official VS Code documentation for the most current information.

Cursor IDE

{
  "mcp": {
    "servers": {
      "ncp": {
        "command": "ncp"
      }
    }
  }
}

Disclaimer: Configuration format and location may vary by Cursor IDE version. Please refer to Cursor's official documentation for the most up-to-date setup instructions.


🔧 Advanced Features

Smart Health Monitoring

NCP automatically detects broken MCPs and routes around them:

ncp list --depth 1    # See health status
ncp config validate   # Check configuration health

🎯 Result: Your AI never gets stuck on broken tools.

Multi-Profile Organization

Organize MCPs by project or environment:

# Development setup
ncp add --profile dev filesystem npx @modelcontextprotocol/server-filesystem ~/dev

# Production setup
ncp add --profile prod database npx production-db-server

# Use specific profile
ncp --profile dev find "file tools"
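Each profile lives as its own JSON file under ~/.ncp/profiles/. After the dev command above, ~/.ncp/profiles/dev.json would contain something along these lines (the layout is assumed to mirror the all.json format shown earlier, not taken from NCP's source):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "~/dev"]
    }
  }
}
```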

🚀 Project-Level Configuration

New: Configure MCPs per project with automatic detection - perfect for teams and Cloud IDEs:

# In any project directory, create local MCP configuration:
mkdir .ncp
ncp add filesystem npx @modelcontextprotocol/server-filesystem ./
ncp add github npx @modelcontextprotocol/server-github

# NCP automatically detects and uses project-local configuration
ncp find "save file"  # Uses only project MCPs

How it works:

  • ๐Ÿ“ Local .ncp directory exists โ†’ Uses project configuration
  • ๐Ÿ  No local .ncp directory โ†’ Falls back to global ~/.ncp
  • ๐ŸŽฏ Zero profile management needed โ†’ Everything goes to default all.json
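The fallback behavior amounts to a simple directory check. A minimal sketch in shell (an illustration of the behavior, not NCP's actual implementation):

```shell
# Pick the project-local config dir if present, else the global one
# (illustrative sketch of NCP's resolution order, not its real code)
resolve_ncp_dir() {
  if [ -d ".ncp" ]; then
    echo "$PWD/.ncp"     # project-local configuration wins
  else
    echo "$HOME/.ncp"    # fall back to the global directory
  fi
}
```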

Perfect for:

  • 🤖 Claude Code projects (project-specific MCP tooling)
  • 👥 Team consistency (ship .ncp folder with your repo)
  • 🔧 Project-specific tooling (each project defines its own MCPs)
  • 📦 Environment isolation (no global MCP conflicts)

# Example project structures:
frontend-app/
  .ncp/profiles/all.json   # → playwright, lighthouse, browser-context
  src/

api-backend/
  .ncp/profiles/all.json   # → postgres, redis, docker, kubernetes
  server/

Import from Anywhere

# From clipboard (any JSON config)
ncp config import

# From specific file
ncp config import "~/my-mcp-config.json"

# From Claude Desktop (auto-detected paths)
ncp config import

🛟 Troubleshooting

Import Issues

# Check what was imported
ncp list

# Validate health of imported MCPs
ncp config validate

# See detailed import logs
DEBUG=ncp:* ncp config import

AI Not Using Tools

  • Check connection: ncp list (should show your MCPs)
  • Test discovery: ncp find "your query"
  • Validate config: Ensure your AI client points to ncp command

Performance Issues

# Check MCP health (unhealthy MCPs slow everything down)
ncp list --depth 1

# Clear cache if needed
rm -rf ~/.ncp/cache

# Monitor with debug logs
DEBUG=ncp:* ncp find "test"

📚 Deep Dive: How It Works

Want the technical details? Token analysis, architecture diagrams, and performance benchmarks:

📖 Read the Technical Guide →

Learn about:

  • Vector similarity search algorithms
  • N-to-1 orchestration architecture
  • Real-world token usage comparisons
  • Health monitoring and failover systems

๐Ÿค Contributing

Help make NCP even better - bug reports, feature requests, and pull requests are all welcome.


📄 License

Elastic License 2.0 - Full License

TL;DR: Free for all use, including commercial. Cannot be offered as a hosted service to third parties.