| Crates.io | turbomcp-proxy |
| lib.rs | turbomcp-proxy |
| version | 3.0.0-beta.3 |
| created_at | 2025-11-06 17:09:19.527864+00 |
| updated_at | 2026-01-22 16:45:24.055526+00 |
| description | Universal MCP adapter/generator - introspection, proxying, and code generation for any MCP server |
| homepage | https://turbomcp.org |
| repository | https://github.com/Epistates/turbomcp |
| max_upload_size | |
| id | 1920021 |
| size | 690,838 |
Universal MCP Adapter/Generator - Introspection, proxying, and code generation for any MCP server
turbomcp-proxy is a universal tool that works with ANY MCP server implementation (TurboMCP, Python SDK, TypeScript SDK, custom implementations). It discovers server capabilities via the MCP protocol and dynamically generates adapters for different transports and protocols.
# Inspect any MCP server
turbomcp-proxy inspect stdio --cmd "python my-server.py"
# Expose STDIO server over HTTP/SSE (development)
turbomcp-proxy serve \
--backend stdio --cmd "python my-server.py" \
--frontend http --bind 127.0.0.1:3000
# Connect to TCP server and expose over HTTP
turbomcp-proxy serve \
--backend tcp --tcp localhost:5000 \
--frontend http --bind 127.0.0.1:3001
# Connect to Unix socket and expose over HTTP
turbomcp-proxy serve \
--backend unix --unix /tmp/mcp.sock \
--frontend http --bind 127.0.0.1:3002
# Expose with JWT authentication (production - symmetric)
turbomcp-proxy serve \
--backend stdio --cmd "python my-server.py" \
--frontend http --bind 0.0.0.0:3000 \
--jwt-secret "your-secret-key" \
--jwt-algorithm HS256
# Expose with JWKS (production - asymmetric, OAuth providers)
turbomcp-proxy serve \
--backend stdio --cmd "python my-server.py" \
--frontend http --bind 0.0.0.0:3000 \
--jwt-jwks-uri "https://accounts.google.com/.well-known/jwks.json" \
--jwt-algorithm RS256 \
--jwt-audience "https://api.example.com" \
--jwt-issuer "https://accounts.google.com"
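To make the `--jwt-secret` flow concrete, here is what HS256 validation amounts to, sketched with only the Python standard library. This mirrors the JWT spec rather than the proxy's internal code, and the token-building helper is purely illustrative:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64url_decode(data: str) -> bytes:
    # Restore the padding that b64url() stripped.
    return base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))

def verify_hs256(token: str, secret: str) -> dict:
    """Return the claims if the HMAC-SHA256 signature matches, else raise."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("invalid JWT signature")
    return json.loads(b64url_decode(payload_b64))

# Build a token the way a client would, then validate it the same way.
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"sub": "client-1"}).encode())
sig = b64url(hmac.new(b"your-secret-key",
                      f"{header}.{payload}".encode(), hashlib.sha256).digest())
claims = verify_hs256(f"{header}.{payload}.{sig}", "your-secret-key")
```

Production setups should additionally check the `exp`, `aud`, and `iss` claims, which is what the `--jwt-audience` and `--jwt-issuer` flags cover.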
# Generate optimized Rust proxy
turbomcp-proxy generate \
--backend stdio --cmd "python my-server.py" \
--frontend http \
--output ./my-proxy \
--build --release
# Export OpenAPI 3.1 schema
turbomcp-proxy schema openapi \
--backend stdio --cmd "python my-server.py" \
--output api-spec.json
# Export GraphQL schema
turbomcp-proxy schema graphql \
--backend tcp --tcp localhost:5000 \
--output schema.graphql
# Export Protobuf definition
turbomcp-proxy schema protobuf \
--backend unix --unix /tmp/mcp.sock \
--output server.proto
Works with any MCP implementation: TurboMCP, the Python SDK, the TypeScript SDK, or custom servers.
Problem: You have a CLI MCP server, but need HTTP clients to access it
# Your CLI server
./my-mcp-server
# Expose it over HTTP (development)
turbomcp-proxy serve \
--backend stdio --cmd "./my-mcp-server" \
--frontend http --bind 127.0.0.1:3000
# Expose with JWT authentication (production)
turbomcp-proxy serve \
--backend stdio --cmd "./my-mcp-server" \
--frontend http --bind 0.0.0.0:3000 \
--jwt-secret "your-secret-key"
# Expose with API key authentication (production)
turbomcp-proxy serve \
--backend stdio --cmd "./my-mcp-server" \
--frontend http --bind 0.0.0.0:3000 \
--require-auth \
--api-key-header x-api-key
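The API-key gate amounts to a header comparison. The sketch below is illustrative only (how the proxy stores and compares its keys is not documented here); it assumes a configured set of valid keys:

```python
def check_api_key(headers: dict, valid_keys: set, header_name: str = "x-api-key") -> bool:
    """Accept the request only if the configured header carries a known key.
    HTTP header names are case-insensitive, so normalize before lookup."""
    lowered = {name.lower(): value for name, value in headers.items()}
    return lowered.get(header_name.lower()) in valid_keys

allowed = check_api_key({"X-Api-Key": "k1"}, {"k1"})    # accepted
denied = check_api_key({"Authorization": "x"}, {"k1"})  # rejected
```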
# Now accessible via HTTP
curl -X POST http://localhost:3000/mcp \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <jwt-token>" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
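The same call can be made from Python with only the standard library; the URL and `<jwt-token>` placeholder are carried over from the curl example above:

```python
import json
import urllib.request

def mcp_request(url, method, params, req_id=1, token=None):
    """Build a JSON-RPC 2.0 request for the proxy's /mcp endpoint."""
    body = json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params}).encode()
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    return urllib.request.Request(url, data=body, headers=headers)

req = mcp_request("http://localhost:3000/mcp", "tools/list", {}, token="<jwt-token>")
sent = json.loads(req.data)
# urllib.request.urlopen(req) sends it once the proxy is running.
```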
Problem: Your tool expects STDIO, but the server speaks HTTP
# Connect to HTTP server, expose as STDIO
turbomcp-proxy serve \
--backend http --server https://api.example.com/mcp \
--frontend stdio \
| my-cli-tool
# With backend authentication
turbomcp-proxy serve \
--backend http --server https://api.example.com/mcp \
--auth-token "your-secret-token" \
--frontend stdio
Problem: You want a REST API with Swagger docs
# Generate and serve REST API
turbomcp-proxy adapter rest \
--backend stdio --cmd "python my-server.py" \
--bind 0.0.0.0:3000 \
--openapi-ui
# Endpoints automatically created:
# POST /tools/{tool_name} → tools/call
# GET /resources/{uri} → resources/read
# GET /openapi.json → Auto-generated spec
# GET /docs → Swagger UI
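The endpoint mapping above can be pictured as a small routing function. This is an illustrative sketch of the mapping only; the real adapter also has to handle URI encoding, query parameters, and the remaining endpoints:

```python
def rest_to_mcp(http_method: str, path: str):
    """Map a REST adapter route onto the MCP method it would invoke."""
    parts = path.strip("/").split("/", 1)
    if http_method == "POST" and parts[0] == "tools" and len(parts) == 2:
        return "tools/call", {"name": parts[1]}
    if http_method == "GET" and parts[0] == "resources" and len(parts) == 2:
        return "resources/read", {"uri": parts[1]}
    raise LookupError(f"no MCP mapping for {http_method} {path}")
```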
Problem: Need optimized binary for production deployment
# Generate standalone Rust project
turbomcp-proxy generate \
--backend stdio --cmd "python my-server.py" \
--frontend http \
--output ./production-proxy \
--build --release
# Deploy the optimized binary (no runtime introspection overhead)
./production-proxy/target/release/proxy
┌─────────────────────────────────────────────────────────┐
│ Introspection Layer │
│ • McpIntrospector: Discovers server capabilities │
│ • ServerSpec: Complete server description │
│ • Backends: STDIO, HTTP, WebSocket │
└─────────────────────────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ Generation Layer │
│ • RuntimeProxyBuilder: Dynamic, no codegen │
│ • RustCodeGenerator: Optimized Rust source │
│ • Schema Generators: OpenAPI, GraphQL, Protobuf │
└─────────────────────────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ Adapter Layer │
│ • Transport Adapters: STDIO ↔ HTTP/SSE ↔ WebSocket │
│ • Protocol Adapters: MCP → REST API / GraphQL │
└─────────────────────────────────────────────────────────┘
See Design Document for complete architecture details.
From source:
cd crates/turbomcp-proxy
cargo install --path .
From crates.io: (coming soon)
cargo install turbomcp-proxy
turbomcp-proxy <COMMAND> [OPTIONS]
Commands:
inspect Discover MCP server capabilities
serve Run runtime proxy (no codegen)
generate Generate optimized proxy source code
schema Export schemas (OpenAPI, GraphQL, Protobuf)
adapter Run protocol adapter (MCP → REST/GraphQL)
help Print help
inspect - Discover Capabilities
turbomcp-proxy inspect [OPTIONS]
Backend Options:
--backend <TYPE> Backend type (stdio, http, tcp, unix, websocket)
--cmd <CMD> Command to run (for stdio backend)
--http <URL> HTTP/SSE server URL
--tcp <ADDR> TCP endpoint (host:port)
--unix <PATH> Unix socket path
--websocket <URL> WebSocket server URL
Output Options:
--output <FILE> Save to file
--format <FORMAT> Output format (human, json, yaml)
Examples:
turbomcp-proxy inspect --backend stdio --cmd "python server.py"
turbomcp-proxy inspect --backend http --http https://api.example.com/mcp
turbomcp-proxy inspect --backend tcp --tcp localhost:5000 --format json
turbomcp-proxy inspect --backend unix --unix /tmp/mcp.sock --output capabilities.json
Note: inspect currently supports only the stdio backend. TCP, Unix, HTTP, and WebSocket are available in the backend connector but are not yet wired into the introspection layer.
serve - Runtime Proxy
turbomcp-proxy serve [OPTIONS]
Backend Options:
--backend <TYPE> Backend type (stdio, http, tcp, unix, websocket)
--cmd <CMD> Command to run (for stdio backend)
--server <URL> Server URL (for http/websocket backend)
--tcp <HOST:PORT> TCP endpoint (for tcp backend)
--unix <PATH> Unix socket path (for unix backend)
--auth-token <TOK> Authentication token for HTTP backend
Frontend Options:
--frontend <TYPE> Frontend type (stdio, http, tcp, unix, websocket)
--bind <ADDR> Bind address (for http/tcp/websocket frontend)
--endpoint <PATH> HTTP endpoint path (default: /mcp)
Authentication Options (Frontend HTTP Server):
--jwt-secret <SECRET> JWT secret for token validation
--api-key-header <HEADER> API key header name (default: x-api-key)
--require-auth Require authentication for all requests
Environment Variables:
TURBOMCP_JWT_SECRET JWT secret (alternative to --jwt-secret)
Examples:
# STDIO → HTTP (development, localhost only)
turbomcp-proxy serve \
--backend stdio --cmd "python server.py" \
--frontend http --bind 127.0.0.1:3000
# STDIO → HTTP with JWT authentication (production)
turbomcp-proxy serve \
--backend stdio --cmd "python server.py" \
--frontend http --bind 0.0.0.0:3000 \
--jwt-secret "your-secret-key"
# STDIO → HTTP with API key authentication (production)
turbomcp-proxy serve \
--backend stdio --cmd "python server.py" \
--frontend http --bind 0.0.0.0:3000 \
--require-auth
# HTTP → STDIO with backend authentication
turbomcp-proxy serve \
--backend http --server https://api.example.com/mcp \
--auth-token "backend-token" \
--frontend stdio
# TCP → HTTP (high-performance network)
turbomcp-proxy serve \
--backend tcp --tcp localhost:5000 \
--frontend http --bind 0.0.0.0:3000
# Unix socket → HTTP (IPC security)
turbomcp-proxy serve \
--backend unix --unix /tmp/mcp.sock \
--frontend http --bind 0.0.0.0:3000
generate - Code Generation
turbomcp-proxy generate [OPTIONS]
Options:
--backend <TYPE> Backend type
--cmd <CMD> Command to run (for stdio)
--server <URL> Server URL (for http/websocket)
--frontend <TYPE> Frontend type
--output <DIR> Output directory
--build Build after generation
--release Build in release mode
--run Run after building
Examples:
# Generate and build
turbomcp-proxy generate \
--backend stdio --cmd "python server.py" \
--frontend http \
--output ./my-proxy \
--build --release
schema - Schema Export
Export MCP server capabilities as standard schema formats.
turbomcp-proxy schema <FORMAT> [OPTIONS]
Formats:
openapi OpenAPI 3.1 specification (REST API schema)
graphql GraphQL Schema Definition Language
protobuf Protocol Buffers 3 definition
Backend Options:
--backend <TYPE> Backend type (stdio, http, tcp, unix, websocket)
--cmd <CMD> Command to run (for stdio backend)
--http <URL> HTTP/SSE server URL
--tcp <ADDR> TCP endpoint (host:port)
--unix <PATH> Unix socket path
Output Options:
--output <FILE> Output file (default: stdout)
--with-examples Include example requests/responses (OpenAPI only)
Examples:
# Export OpenAPI from STDIO server
turbomcp-proxy schema openapi \
--backend stdio --cmd "python server.py" \
--output api-spec.json
# Export GraphQL from TCP server
turbomcp-proxy schema graphql \
--backend tcp --tcp localhost:5000 \
--output schema.graphql
# Export Protobuf from Unix socket
turbomcp-proxy schema protobuf \
--backend unix --unix /tmp/mcp.sock \
--output server.proto
# Export to stdout
turbomcp-proxy schema openapi \
--backend stdio --cmd "npx @mcp/server-fs /tmp"
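As a rough picture of what the OpenAPI export contains: each MCP tool, with its JSON Schema `inputSchema`, becomes a POST operation. The exact shape of the generated spec may differ from this sketch:

```python
def tool_to_openapi_path(tool: dict) -> dict:
    """Turn one MCP tool description into an OpenAPI path item (sketch only)."""
    return {
        f"/tools/{tool['name']}": {
            "post": {
                "operationId": tool["name"],
                "description": tool.get("description", ""),
                "requestBody": {"content": {"application/json": {
                    "schema": tool.get("inputSchema", {"type": "object"})}}},
                "responses": {"200": {"description": "Tool call result"}},
            }
        }
    }

paths = tool_to_openapi_path(
    {"name": "echo", "description": "Echo a message",
     "inputSchema": {"type": "object",
                     "properties": {"message": {"type": "string"}}}})
```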
adapter - Protocol Adapters (Phase 6 - Scaffolded)
Expose MCP servers through standard web protocols. The adapter framework is scaffolded and ready for full implementation.
turbomcp-proxy adapter <PROTOCOL> [OPTIONS]
Protocols:
rest REST API with OpenAPI documentation
graphql GraphQL API with schema explorer
Backend Options:
--backend <TYPE> Backend type (stdio, http, tcp, unix, websocket)
--cmd <CMD> Command to run (for stdio backend)
--http <URL> HTTP/SSE server URL
--tcp <ADDR> TCP endpoint (host:port)
--unix <PATH> Unix socket path
Server Options:
--bind <ADDR> Bind address (default: 127.0.0.1:3001)
REST-Specific:
--openapi-ui Serve Swagger UI at /docs (future)
GraphQL-Specific:
--playground Serve GraphQL Playground at /playground (future)
Examples:
# REST API (framework ready)
turbomcp-proxy adapter rest \
--backend stdio --cmd "python server.py" \
--bind 127.0.0.1:3000
# GraphQL API (framework ready)
turbomcp-proxy adapter graphql \
--backend tcp --tcp localhost:5000 \
--bind 127.0.0.1:4000
Status: Command structure complete. Full implementation of REST and GraphQL adapters coming in next release.
Current Version: 2.2.0
MVP Status: Complete - Production Ready (Phases 1-4)
Latest Release: 2.2.0 - Security Hardening & HTTP Header Propagation
See the Progress Tracker for detailed status.
MVP Target: Phases 1-3 (Complete - October 2025)
Code Generation: Phase 4 (Complete - October 2025)
Authentication: Phase 4.5 (Complete - October 2025)
Schema Export & Transports: Phases 5-5.5 (Complete - November 2025)
Full Release: 6/7 phases complete (86%)
We welcome contributions!
Licensed under either of the project's licenses, at your option.
MCP servers are often CLI tools (STDIO), but clients need network access (HTTP). Manually bridging this gap means writing and maintaining a custom adapter for every server. turbomcp-proxy does this automatically via introspection: it discovers the server's capabilities over the MCP protocol and builds the adapter at runtime.
Result: a zero-configuration, universal MCP adapter that works with any implementation.
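The introspection behind that zero-configuration claim is a short JSON-RPC conversation, sketched below. The protocol version string and the exact set of discovery calls are assumptions based on the MCP specification, not this crate's code:

```python
import json

def introspection_messages(protocol_version: str = "2025-03-26") -> list:
    """The discovery sequence a proxy can send over any transport:
    initialize, then list the server's tools, resources, and prompts."""
    msgs = [
        {"jsonrpc": "2.0", "id": 1, "method": "initialize",
         "params": {"protocolVersion": protocol_version, "capabilities": {},
                    "clientInfo": {"name": "proxy", "version": "0"}}},
        {"jsonrpc": "2.0", "method": "notifications/initialized"},
        {"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}},
        {"jsonrpc": "2.0", "id": 3, "method": "resources/list", "params": {}},
        {"jsonrpc": "2.0", "id": 4, "method": "prompts/list", "params": {}},
    ]
    return [json.dumps(m) for m in msgs]

wire = introspection_messages()
```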
Built by the TurboMCP team