
# Felix MCP (Smithery)
A tiny Model Context Protocol server with a few useful tools, deployed on Smithery, tested in Claude Desktop, and indexed in NANDA.
## Tools included

- `hello(name)` – quick greeting
- `randomNumber(max?)` – random integer (default 100)
- `weather(city)` – current weather via wttr.in
- `summarize(text, maxSentences?, model?)` – OpenAI-powered summary (requires `OPENAI_API_KEY`)
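For reference, the two simplest tools are easy to sketch as plain handler functions. This is a hypothetical reconstruction, not the code in `index.js`, and the exact range of `randomNumber` (here 1 to `max`, inclusive) is an assumption:

```javascript
// Hypothetical sketches of the two simplest tool handlers.
// Assumption: randomNumber returns an integer in [1, max].
function hello({ name }) {
  return `Hello, ${name}!`;
}

function randomNumber({ max = 100 } = {}) {
  return Math.floor(Math.random() * max) + 1;
}
```

The real handlers additionally wrap their results in the MCP content envelope before returning them to the client.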
## Public server page

https://smithery.ai/server/@FelixYifeiWang/felix-mcp-smithery

## MCP endpoint (streamable HTTP)

https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp

(In Smithery/NANDA, auth is attached via the `api_key` query param and an optional `profile` param, configured in the platform UI; do not hardcode secrets here.)
## Demo

### In Claude Desktop (recommended)

- Open Settings → Developer → mcpServers and add:

  ```json
  {
    "mcpServers": {
      "felix-mcp-smithery": {
        "command": "npx",
        "args": [
          "-y",
          "@smithery/cli@latest",
          "run",
          "@FelixYifeiWang/felix-mcp-smithery",
          "--key",
          "YOUR_SMITHERY_API_KEY",
          "--profile",
          "YOUR_PROFILE_ID"
        ]
      }
    }
  }
  ```

- Start a new chat and run:
  - “List tools from felix-mcp-smithery”
  - “Call hello with `{ "name": "Felix" }`”
  - “Call summarize on this text (2 sentences): …”
## Features

- Streamable HTTP MCP – Express + the MCP SDK’s `StreamableHTTPServerTransport` on `/mcp` (POST/GET/DELETE).
- Session-aware – proper handling of `Mcp-Session-Id` (no close recursion).
- OpenAI summarization – tidy summaries via chat completions (model defaults to `gpt-4o-mini`).
- Zero-friction hosting – packaged as a container and deployed on Smithery.
## Install (local)

Requires Node 18+ (tested on Node 20).

```bash
git clone https://github.com/FelixYifeiWang/felix-mcp-smithery
cd felix-mcp-smithery
npm install
```

Set env (only needed if you’ll call `summarize` locally):

```bash
export OPENAI_API_KEY="sk-..."
```

Run:

```bash
node index.js
# ✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
```

Local curl:

```bash
curl -s -X POST "http://localhost:8081/mcp" \
  -H 'Content-Type: application/json' \
  -H 'Mcp-Protocol-Version: 2025-06-18' \
  --data '{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2025-06-18"}}'
```
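The `initialize` response carries an `Mcp-Session-Id` response header that must be echoed on every later request. A minimal client-side sketch of that handshake (the helper name is mine, not part of the repo, and the commented `fetch` assumes the local server started above):

```javascript
// Build the headers for an MCP Streamable HTTP request.
// sessionId is omitted on the very first `initialize` call and
// required on every request after that.
function mcpHeaders(sessionId) {
  const headers = {
    "Content-Type": "application/json",
    "Mcp-Protocol-Version": "2025-06-18",
  };
  if (sessionId) headers["Mcp-Session-Id"] = sessionId;
  return headers;
}

// Usage sketch against the local server from `node index.js`:
// const res = await fetch("http://localhost:8081/mcp", {
//   method: "POST",
//   headers: mcpHeaders(),
//   body: JSON.stringify({ jsonrpc: "2.0", id: 0, method: "initialize",
//                          params: { protocolVersion: "2025-06-18" } }),
// });
// const sessionId = res.headers.get("Mcp-Session-Id");
```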
## Usage (tools)

hello:

```json
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"hello","arguments":{"name":"Felix"}}}
```

randomNumber:

```json
{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"randomNumber","arguments":{"max":10}}}
```

weather:

```json
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"weather","arguments":{"city":"Boston"}}}
```

summarize (needs `OPENAI_API_KEY` set on the server):

```json
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"summarize","arguments":{"text":"(paste long text)","maxSentences":2}}}
```
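All four calls share the same JSON-RPC envelope, so a tiny builder keeps the payloads consistent. This is a convenience sketch for clients, not code from the repo:

```javascript
// Build a JSON-RPC 2.0 `tools/call` payload for any of the tools above.
// Each call gets a fresh request id.
let nextId = 1;
function toolsCall(name, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id: nextId++,
    method: "tools/call",
    params: { name, arguments: args },
  });
}
```

For example, `toolsCall("weather", { city: "Boston" })` produces the weather payload shown above (modulo the id).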
## How it works

- Server core: `McpServer` from `@modelcontextprotocol/sdk`, with tools registered in `buildServer()`.
- Transport: `StreamableHTTPServerTransport` on `/mcp`, handling:
  - `POST /mcp` — JSON-RPC requests (and first-time `initialize`)
  - `GET /mcp` — server-to-client notifications (SSE)
  - `DELETE /mcp` — end session
- CORS: allows all origins; exposes the `Mcp-Session-Id` header (good for hosted clients).
- OpenAI summarize: a thin `fetch` wrapper around `/v1/chat/completions` with a short “crisp summarizer” system prompt.
## Deployment (Smithery)

- GitHub repo with:
  - `index.js` (Express + MCP)
  - `package.json` (`@modelcontextprotocol/sdk`, `express`, `cors`, `zod`)
  - `Dockerfile`
  - `smithery.yaml`:

    ```yaml
    kind: server
    name: felix-mcp-smithery
    version: 1.0.0
    runtime: container
    startCommand:
      type: http
      transport: streamable-http
      port: 8081
      path: /mcp
      ssePath: /mcp
    health: /
    ```

- In Smithery:
  - Create server from the repo.
  - Add Environment Variables: `OPENAI_API_KEY` (optional, for `summarize`).
  - Deploy → confirm the logs show:

    ```
    ✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
    ```
## NANDA Index

- Go to join39.org → Context Agents → Add
  - Agent Name: Felix MCP (Smithery)
  - MCP Endpoint: `https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp?api_key=YOUR_KEY&profile=YOUR_PROFILE`
  - Description: Streamable-HTTP MCP hosted on Smithery. Tools: hello, randomNumber, weather, summarize (OpenAI).
- Test from NANDA: `initialize` → `tools/list` → call `hello`.
## Project structure

```
.
├─ index.js       # Express + Streamable HTTP + tools
├─ package.json   # sdk/express/cors/zod
├─ Dockerfile     # container build for Smithery
└─ smithery.yaml  # Smithery project config
```
## Assignment rubric mapping
- ✅ Find/Build: Custom MCP server with 4 tools
- ✅ Deploy: Hosted on Smithery (public server page linked)
- ✅ Test in a host: Verified in Claude Desktop (screenshots/recording included)
- ✅ NANDA Index: Added as a Context Agent (screenshot included)
- ✅ Deliverables: Repo link + working endpoint + host screenshots
## What worked

- Streamable HTTP transport with session management is stable once the close-loop gotcha is avoided.
- Smithery makes deployment and auth-key distribution straightforward.
- Claude Desktop connects cleanly via `@smithery/cli run …`.
## AI Acknowledgement
Parts of this project (tool scaffolding, error fixes, and documentation polish) were produced with AI assistance. The final code, deployment, and testing steps were implemented and verified by me.