
# BetterPrompt MCP Server

MCP server for AI-enhanced prompt engineering and request conversion.
## Table of Contents
- Overview
- Quickstart
- Installation
- Tool
- Usage Example
- Client Integration
- How It Works
- Development
- License
- Support
## Overview
BetterPrompt MCP is a Model Context Protocol (MCP) server that enhances user requests using advanced prompt engineering techniques. It exposes a single, powerful tool that transforms simple requests into structured, context-rich instructions tailored for optimal AI model performance.
Instead of manually crafting detailed prompts, BetterPrompt MCP converts your requests into expertly engineered prompts that get better results from AI models.
### Before & After Example

Without BetterPrompt:

```text
Write a function to calculate fibonacci numbers
```
With BetterPrompt enhancement:

```text
You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Write a function to calculate fibonacci numbers"

Please enhance your response by:

- Analyzing the intent and requirements behind this request
- Applying appropriate prompt engineering techniques to ensure maximum effectiveness
- Adding clarity, specificity, and structure to your approach
- Including relevant context and constraints for comprehensive understanding
- Ensuring optimal interaction patterns for complex reasoning tasks
- Specifying the most appropriate output format for the task
- Defining clear success criteria for high-quality results

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request.
```
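Conceptually, the enhancement wraps the raw request in a structured template like the one above. The sketch below is a simplified, hypothetical reconstruction of that wrapping — the actual server delegates the rewriting to your client's LLM via MCP sampling, and `buildEnhancedPrompt` is not part of the published API:

```typescript
// Hypothetical sketch: wrap a raw request in the enhancement template
// shown above. The real server asks the client's LLM to do this via
// MCP sampling; this static version only illustrates the shape.
function buildEnhancedPrompt(request: string): string {
  const guidelines = [
    "Analyzing the intent and requirements behind this request",
    "Applying appropriate prompt engineering techniques to ensure maximum effectiveness",
    "Adding clarity, specificity, and structure to your approach",
    "Including relevant context and constraints for comprehensive understanding",
    "Ensuring optimal interaction patterns for complex reasoning tasks",
    "Specifying the most appropriate output format for the task",
    "Defining clear success criteria for high-quality results",
  ];
  return [
    "You are a world-class AI assistant with expertise in advanced prompt engineering techniques.",
    "",
    "Your task is to provide an exceptional response to the following user request:",
    "",
    `"${request}"`,
    "",
    "Please enhance your response by:",
    ...guidelines.map((g) => `- ${g}`),
  ].join("\n");
}
```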
## Quickstart

Install and run via npx:

```shell
npx -y betterprompt-mcp
```

Or add this to your MCP client configuration:

```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```
## Installation

Most MCP clients work with this standard configuration:

```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```
Pick your client below. Where available, click the install button; otherwise follow the manual steps.
### VS Code

Click a button to install. Fallback (CLI):

```shell
code --add-mcp '{"name":"betterprompt","command":"npx","args":["-y","betterprompt-mcp"]}'
```
### Cursor

Click to install, or add manually: Settings → MCP → Add new MCP Server → Type: command, Command: `npx -y betterprompt-mcp`.
### LM Studio

Click to install, or manually: Program → Install → Edit `mcp.json`, then add the standard config above.
### Continue

Install button: TODO – no public deeplink available yet.

Manual setup:

1. Open Continue Settings → open the JSON configuration
2. Add an `mcpServers` entry:

```json
{
  "mcpServers": {
    "betterprompt": {
      "command": "npx",
      "args": ["-y", "betterprompt-mcp"]
    }
  }
}
```

3. Restart Continue if needed.
### Goose

Click to install, or manually: Advanced settings → Extensions → Add custom extension → Type: STDIO → Command: `npx -y betterprompt-mcp`.
### Claude Code (CLI)

Install via the CLI:

```shell
claude mcp add betterprompt npx -y betterprompt-mcp
```
### Claude Desktop

Add the standard config above to `claude_desktop_config.json`, then restart Claude Desktop. See the MCP quickstart guide for details.
### Windsurf

Follow the Windsurf MCP documentation and use the standard config above.
### Gemini CLI

Follow the Gemini CLI MCP server guide and use the standard config above.
### Qodo Gen

Open the Qodo Gen chat panel → Connect more tools → + Add new MCP → paste the standard config above → Save.
### opencode

Create or edit `~/.config/opencode/opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "betterprompt": {
      "type": "local",
      "command": ["npx", "-y", "betterprompt-mcp"],
      "enabled": true
    }
  }
}
```
## Tool

### enhance-request

Transforms user requests into structured, AI-enhanced prompts using advanced prompt engineering techniques.

Input:

- `request` (string, required): the user request to transform into an enhanced AI prompt

Output: an AI-enhanced prompt with structure, context, and clear instructions.

Example usage:

```json
{
  "name": "enhance-request",
  "arguments": {
    "request": "Write a function to calculate fibonacci numbers"
  }
}
```
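On the wire, a tool invocation like the one above travels as a JSON-RPC 2.0 `tools/call` request. The sketch below builds that envelope; the field names (`method`, `params.name`, `params.arguments`) follow the MCP specification, while `makeEnhanceCall` and the `id` value are illustrative, not part of this project:

```typescript
// Build the JSON-RPC 2.0 envelope an MCP client sends for this tool call.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: { request: string } };
}

function makeEnhanceCall(request: string, id = 1): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: "enhance-request", arguments: { request } },
  };
}
```

Your MCP client constructs this message for you; it is shown only to make the `name`/`arguments` pairing in the examples concrete.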
## Usage Example

Request:

```json
{
  "name": "enhance-request",
  "arguments": {
    "request": "Explain quantum computing"
  }
}
```

Enhanced result:

```text
You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.

Your task is to provide an exceptional response to the following user request:

"Explain quantum computing"

Please enhance your response by:

- Analyzing the intent and requirements behind this request
- Applying appropriate prompt engineering techniques to ensure maximum effectiveness
- Adding clarity, specificity, and structure to your approach
- Including relevant context and constraints for comprehensive understanding
- Ensuring optimal interaction patterns for complex reasoning tasks
- Specifying the most appropriate output format for the task
- Defining clear success criteria for high-quality results

Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request.
```
## How It Works

BetterPrompt MCP leverages the MCP sampling API to enhance user requests:

1. When you call the `enhance-request` tool, the server sends a sampling request to your MCP client
2. Your client uses its configured LLM to enhance the prompt with advanced prompt engineering techniques
3. The enhanced prompt is returned to you for use with any AI model

This approach has several benefits:

- No API keys required: uses your client's existing LLM configuration
- Leverages the most capable model available in your client
- Works with any MCP-compatible client (Claude Desktop, VS Code, Cursor, etc.)
- Always up to date with the latest prompt engineering techniques
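The round trip above can be sketched with a stubbed sampling client. `SamplingClient` below is a stand-in for the MCP SDK's sampling request (`sampling/createMessage` in the MCP spec), not the server's actual code, and the instruction text is illustrative:

```typescript
// Sketch of the sampling round trip. `SamplingClient` stands in for the
// connected MCP client; in the real server this is the SDK's
// sampling/createMessage request sent back over the connection.
interface SamplingClient {
  createMessage(prompt: string): Promise<string>;
}

async function enhanceRequest(
  client: SamplingClient,
  request: string,
): Promise<string> {
  // 1. The tool handler forwards the raw request to the client's LLM...
  const instruction =
    "Rewrite the following request as a detailed, well-structured prompt:\n" +
    request;
  // 2. ...the client answers with the enhanced prompt...
  const enhanced = await client.createMessage(instruction);
  // 3. ...which is returned as the tool result.
  return enhanced;
}
```

Because the LLM call is made by the client, the server itself never needs model credentials — which is where the "no API keys required" benefit comes from.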
## Development

### Project Structure

```text
betterprompt-mcp/
├── src/
│   └── index.ts      # Main server implementation
├── tests/            # Test files and verification scripts
├── dist/             # Compiled output (generated)
├── package.json      # Dependencies and scripts
├── tsconfig.json     # TypeScript configuration
└── README.md         # Documentation
```

### Build & Development

```shell
npm run build               # Build
npm run watch               # Watch mode (dev)
npm run format              # Format
npm run format:check        # Check formatting
npm run test:comprehensive  # Run the comprehensive test suite
```
### Linting and Formatting

We use ESLint + Prettier to keep the codebase consistent.

- Run the linter locally: `npm run lint`
- Apply autofixes: `npm run lint -- --fix` or `npm run lint:fix`
- Run the CI-oriented lint (JSON output): `npm run lint:ci` (produces `artifacts/lint-report.json`)
- Autofix auto-commit policy: safe, formatting-only autofixes are auto-committed using `scripts/lint-autofix-and-commit.sh`. The script uses a conservative heuristic (a small-change threshold) and aborts the auto-commit when changes appear large or potentially behavior-affecting; in such cases, open a PR for human review.
## License

MIT License

## Support

For questions or issues, open an issue on GitHub or contact the author via their GitHub profile.
## Author
Aung Myo Kyaw (GitHub)