betterprompt-mcp v0.2.1 (io.github.AungMyoKyaw/betterprompt-mcp)

MCP server for AI-enhanced prompt engineering and request conversion.

BetterPrompt MCP Server

BetterPrompt MCP Logo



Table of Contents

  • Overview
  • Quickstart
  • Installation
  • Tool
  • How It Works
  • Development
  • License
  • Support
  • Author
Overview

BetterPrompt MCP is a Model Context Protocol (MCP) server that enhances user requests using advanced prompt engineering techniques. It exposes a single, powerful tool that transforms simple requests into structured, context-rich instructions tailored for optimal AI model performance.

Instead of manually crafting detailed prompts, BetterPrompt MCP converts your requests into expertly engineered prompts that get better results from AI models.

Before & After Example

Without BetterPrompt:

"Write a function to calculate fibonacci numbers"

With BetterPrompt Enhancement:

"You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Write a function to calculate fibonacci numbers"

Please enhance your response by:

  1. Analyzing the intent and requirements behind this request
  2. Applying appropriate prompt engineering techniques to ensure maximum effectiveness
  3. Adding clarity, specificity, and structure to your approach
  4. Including relevant context and constraints for comprehensive understanding
  5. Ensuring optimal interaction patterns for complex reasoning tasks
  6. Specifying the most appropriate output format for the task
  7. Defining clear success criteria for high-quality results

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request."
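For illustration, the transformation above amounts to wrapping the raw request in the enhancement template shown. The sketch below reproduces it as a pure function; note this is illustrative only — the real server generates the enhancement via your client's LLM (see How It Works), so the exact wording can vary.

```typescript
// Illustrative sketch: wraps a raw request in the enhancement template
// shown above. The actual server produces this text via MCP sampling,
// so real output may differ in wording.
function buildEnhancedPrompt(request: string): string {
  return [
    "You are a world-class AI assistant with expertise in advanced prompt " +
      "engineering techniques from top AI research labs like Anthropic, " +
      "OpenAI, and Google DeepMind.",
    "",
    "Your task is to provide an exceptional response to the following user request:",
    "",
    `"${request}"`,
    "",
    "Please enhance your response by:",
    "",
    "  1. Analyzing the intent and requirements behind this request",
    "  2. Applying appropriate prompt engineering techniques to ensure maximum effectiveness",
    "  3. Adding clarity, specificity, and structure to your approach",
    "  4. Including relevant context and constraints for comprehensive understanding",
    "  5. Ensuring optimal interaction patterns for complex reasoning tasks",
    "  6. Specifying the most appropriate output format for the task",
    "  7. Defining clear success criteria for high-quality results",
    "",
    "Structure your response with clear headings, detailed explanations, and " +
      "examples where appropriate. Ensure your answer is comprehensive, " +
      "actionable, and directly addresses all aspects of the request.",
  ].join("\n");
}

console.log(buildEnhancedPrompt("Write a function to calculate fibonacci numbers"));
```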


Quickstart

Install and run via npx:

npx -y betterprompt-mcp

Or add to your MCP client configuration:

{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}

Installation

Most MCP clients work with this standard config:

{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}

Pick your client below. Where available, click the install button; otherwise follow the manual steps.

VS Code

Click a button to install:

  • Install in VS Code
  • Install in VS Code Insiders

Fallback (CLI):

code --add-mcp '{"name":"betterprompt","command":"npx","args":["-y","betterprompt-mcp"]}'

Docs: Add an MCP server

Cursor

Click to install:

Install in Cursor

Or add manually: Settings → MCP → Add new MCP Server → Type: command, Command: npx -y betterprompt-mcp.

LM Studio

Click to install:

Add MCP Server betterprompt to LM Studio

Or manually: Program → Install → Edit mcp.json, add the standard config above.

Continue

Install button: TODO – no public deeplink available yet.

Manual setup:

  1. Open Continue Settings → open JSON configuration
  2. Add mcpServers entry:
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}

Restart Continue if needed.

Goose

Click to install:

Install in Goose

Or manually: Advanced settings → Extensions → Add custom extension → Type: STDIO → Command: npx -y betterprompt-mcp.

Claude Code (CLI)

Install via CLI:

claude mcp add betterprompt -- npx -y betterprompt-mcp

Claude Desktop

Add to claude_desktop_config.json using the standard config above, then restart Claude Desktop. See the MCP quickstart:

Model Context Protocol – Quickstart

Windsurf

Follow the Windsurf MCP documentation and use the standard config above.

Docs: Windsurf MCP

Gemini CLI

Follow the Gemini CLI MCP server guide; use the standard config above.

Docs: Configure MCP server in Gemini CLI

Qodo Gen

Open Qodo Gen chat panel → Connect more tools → + Add new MCP → Paste the standard config above → Save.

Qodo Gen documentation

opencode

Create or edit ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "betterprompt": {
      "type": "local",
      "command": ["npx", "-y", "betterprompt-mcp"],
      "enabled": true
    }
  }
}

opencode MCP documentation


Tool

enhance-request

Transforms user requests into world-class AI-enhanced prompts using advanced prompt engineering techniques.

Input:

  • request (string, required): The user request to transform into an enhanced AI prompt

Output: AI-enhanced prompt with structure, context, and clear instructions.

Example Usage:

{
  "name": "enhance-request",
  "arguments": {
    "request": "Write a function to calculate fibonacci numbers"
  }
}
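For illustration, the input contract can be expressed as a TypeScript type with a small validator. This is hypothetical code that mirrors the schema above, not the server's actual source.

```typescript
// Hypothetical sketch of the enhance-request input contract; mirrors
// the schema documented above but is not the server's real code.
interface EnhanceRequestArgs {
  request: string; // the user request to transform (required)
}

// Narrow an untyped arguments object to EnhanceRequestArgs,
// rejecting missing, non-string, or empty requests.
function validateArgs(args: unknown): EnhanceRequestArgs {
  if (typeof args !== "object" || args === null) {
    throw new Error("arguments must be an object");
  }
  const { request } = args as Record<string, unknown>;
  if (typeof request !== "string" || request.trim() === "") {
    throw new Error("'request' must be a non-empty string");
  }
  return { request };
}

console.log(validateArgs({ request: "Write a function to calculate fibonacci numbers" }));
```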

Usage Example

Request:

{
  "name": "enhance-request",
  "arguments": {
    "request": "Explain quantum computing"
  }
}

Enhanced Result:

"You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Explain quantum computing"

Please enhance your response by:

  1. Analyzing the intent and requirements behind this request
  2. Applying appropriate prompt engineering techniques to ensure maximum effectiveness
  3. Adding clarity, specificity, and structure to your approach
  4. Including relevant context and constraints for comprehensive understanding
  5. Ensuring optimal interaction patterns for complex reasoning tasks
  6. Specifying the most appropriate output format for the task
  7. Defining clear success criteria for high-quality results

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request."


How It Works

BetterPrompt MCP leverages the MCP Sampling API to enhance user requests:

  1. When you call the enhance-request tool, the server sends a sampling request to your MCP client
  2. Your client uses its configured LLM to enhance the prompt using advanced prompt engineering techniques
  3. The enhanced prompt is returned to you for use with any AI model
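The three steps above can be sketched with the client's LLM abstracted as a callback. In the real server this callback would be an MCP sampling request (e.g. a createMessage call) to the connected client; the names and meta-prompt wording below are illustrative assumptions, not the server's actual source.

```typescript
// Sketch of the sampling round-trip described above. The client's LLM
// is abstracted as a callback; in the real server this would be an MCP
// sampling request to the connected client.
type Sampler = (prompt: string) => Promise<string>;

async function enhanceRequest(
  request: string,
  sampleFromClient: Sampler,
): Promise<string> {
  // Steps 1-2: ask the client's configured model to rewrite the request
  // using prompt engineering techniques.
  const metaPrompt =
    "Rewrite the following user request as a detailed, well-structured prompt:\n\n" +
    `"${request}"`;
  // Step 3: the enhanced prompt comes back for use with any model.
  return sampleFromClient(metaPrompt);
}

// Usage with a stub sampler (a real client would call its LLM here):
enhanceRequest("Explain quantum computing", async (p) => `ENHANCED: ${p}`).then(
  (result) => console.log(result),
);
```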

This approach has several benefits:

  • No API keys required - uses your client's existing LLM configuration
  • Leverages the most capable model available in your client
  • Works with any MCP-compatible client (Claude Desktop, VS Code, Cursor, etc.)
  • Always up-to-date with the latest prompt engineering techniques

Development

Project Structure

betterprompt-mcp/
├── src/
│   └── index.ts          # Main server implementation
├── tests/                # Test files and verification scripts
├── dist/                 # Compiled output (generated)
├── package.json          # Dependencies and scripts
├── tsconfig.json         # TypeScript configuration
└── README.md             # Documentation

Build & Development

Build:

npm run build

Watch (dev):

npm run watch

Format:

npm run format
npm run format:check

Test:

npm run test:comprehensive

Linting and Formatting

We use ESLint + Prettier to keep the codebase consistent.

  • Run the linter locally: npm run lint
  • Apply autofixes: npm run lint -- --fix or npm run lint:fix
  • Run the CI-oriented lint (JSON output): npm run lint:ci (produces artifacts/lint-report.json)
  • Autofix auto-commit policy: safe, formatting-only autofixes are auto-committed by scripts/lint-autofix-and-commit.sh. The script uses a conservative heuristic (a small-change threshold) and aborts the auto-commit when changes appear large or potentially behavior-affecting; in those cases, open a PR for human review.

License

MIT License


Support

For questions or issues, open an issue on GitHub or contact the author via GitHub profile.


Author

Aung Myo Kyaw (GitHub)