visproj committed on
Commit
1e6d6a3
·
verified ·
1 Parent(s): e78593a

Upload 17 files

README.md CHANGED
@@ -1,13 +1,180 @@
1
- ---
2
- title: Mcp Generator
3
- emoji: 📊
4
- colorFrom: red
5
- colorTo: green
6
- sdk: gradio
7
- sdk_version: 6.0.1
8
- app_file: app.py
9
- pinned: false
10
- license: other
11
- ---
12
-
13
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
1
+ ---
2
+ title: MCP Generator
3
+ emoji: 🤖
4
+ colorFrom: blue
5
+ colorTo: purple
6
+ sdk: gradio
7
+ sdk_version: 4.44.0
8
+ app_file: app.py
9
+ pinned: false
10
+ license: mit
11
+ tags:
12
+ - mcp
13
+ - model-context-protocol
14
+ - code-generation
15
+ - api
16
+ - agents
17
+ - langgraph
18
+ ---
19
+
20
+ # 🤖 MCP Generator
21
+
22
+ **Turn Any API into an MCP Server in Seconds!**
23
+
24
+ Built for the **MCP 1st Birthday Hackathon** - Track 2: MCP in Action 🎉
25
+
26
+ ## 🎯 What is This?
27
+
28
+ A **meta-MCP** that generates MCP servers from any API! This is:
29
+ - ✅ An **MCP server itself** (uses MCP Fetch Server)
30
+ - ✅ A **code generator** (powered by LangGraph agents)
31
+ - ✅ A **one-click deployment tool** (instant MCP hosting)
32
+ - ✅ **Friendly to non-technical users** (no terminal commands needed!)
33
+
34
+ ## ✨ Features
35
+
36
+ - **🔍 Automatic API Analysis** - Just provide a URL and we analyze the API structure
37
+ - **🤖 AI-Powered Code Generation** - Claude generates complete, working MCP servers
38
+ - **📦 Complete Package** - Get server code, README, and config files
39
+ - **🚀 Instant Hosting** - Your MCP runs immediately in our Space
40
+ - **📥 Download Option** - Self-host later if you want
41
+ - **🎨 Beautiful UI** - Built with Gradio for ease of use
42
+
43
+ ## 🏗️ Architecture
44
+
45
+ ```
46
+ ┌────────────────────────────┐
47
+ │      Gradio Frontend       │
48
+ └──────────┬─────────────────┘
49
+            │
50
+ ┌──────────▼─────────────────┐
51
+ │  LangGraph Agent Factory   │
52
+ │  ┌──────────────────────┐  │
53
+ │  │ API Analyzer Agent   │  │  ← Uses Fetch MCP
54
+ │  └──────────────────────┘  │
55
+ │  ┌──────────────────────┐  │
56
+ │  │ Code Generator Agent │  │  ← Uses Claude API
57
+ │  └──────────────────────┘  │
58
+ └──────────┬─────────────────┘
59
+            │
60
+ ┌──────────▼─────────────────┐
61
+ │   Generated MCP Server     │
62
+ │   • stdio transport        │
63
+ │   • Complete documentation │
64
+ │   • Ready to deploy        │
65
+ └────────────────────────────┘
66
+ ```
67
+
68
+ ## 🚀 Quick Start
69
+
70
+ ### Try it on HuggingFace Spaces
71
+
72
+ 👉 **[Launch MCP Generator](https://huggingface.co/spaces/MCP-1st-Birthday/mcp-generator)** 👈
73
+
74
+ ### Run Locally
75
+
76
+ ```bash
77
+ # Clone the repo
78
+ git clone https://github.com/visprogithub/MCP_Generator_Agent.git
79
+ cd MCP_Generator_Agent
80
+
81
+ # Install dependencies
82
+ npm install -g npx # For MCP Fetch Server
83
+ pip install -r requirements.txt
84
+
85
+ # Set up API key
86
+ echo "ANTHROPIC_API_KEY=your_key_here" > .env
87
+
88
+ # Run the app
89
+ python app.py
90
+ ```
91
+
92
+ Then open http://localhost:7860 in your browser!
93
+
94
+ ## 📖 How to Use
95
+
96
+ 1. **Enter an API URL** (e.g., `https://api.github.com`)
97
+ 2. **Click "Generate & Host MCP Server"**
98
+ 3. **Download the ZIP** or copy the connection config
99
+ 4. **Use in Claude Desktop!**
100
+
101
+ That's it! ✨
102
+
103
+ ## 🎓 Example APIs to Try
104
+
105
+ - `https://api.github.com` - GitHub API
106
+ - `https://api.stripe.com` - Stripe Payment API
107
+ - `https://api.openweathermap.org` - Weather Data
108
+ - `https://jsonplaceholder.typicode.com` - Fake REST API (for testing)
109
+
110
+ ## 🛠️ Technology Stack
111
+
112
+ - **Frontend:** [Gradio 4](https://gradio.app) - Beautiful web UI
113
+ - **Agents:** [LangGraph](https://github.com/langchain-ai/langgraph) - Agent orchestration
114
+ - **LLM:** [Anthropic Claude](https://anthropic.com) - Code generation
115
+ - **MCP Client:** [@modelcontextprotocol/server-fetch](https://github.com/modelcontextprotocol/servers) - Fetch API docs
116
+ - **MCP Server:** [MCP Python SDK](https://github.com/modelcontextprotocol/python-sdk) - Generated servers
117
+
118
+ ## 🎯 MCP 1st Birthday Hackathon
119
+
120
+ This project demonstrates:
121
+
122
+ ✅ **Using MCP** - Integrates Fetch MCP for API analysis
123
+ ✅ **Providing MCP** - Generates working MCP servers
124
+ ✅ **Real-world Impact** - Makes MCP development accessible to everyone
125
+ ✅ **Creativity** - Meta-MCP that builds MCPs using MCPs!
126
+ ✅ **Polish** - Beautiful UI, complete docs, one-click experience
127
+
128
+ ### Track 2: MCP in Action
129
+
130
+ Category: **Productivity**
131
+
132
+ This tool makes developers and non-technical users 10x more productive by eliminating the manual work of creating MCP servers.
133
+
134
+ ## 📁 Project Structure
135
+
136
+ ```
137
+ MCP_Generator_Agent/
138
+ ├── app.py                           # Main Gradio application
139
+ ├── src/
140
+ │   ├── agents/                      # LangGraph agents
141
+ │   │   ├── factory.py               # Agent orchestration
142
+ │   │   ├── api_analyzer.py
143
+ │   │   └── code_generator.py
144
+ │   ├── mcp_clients/                 # MCP client wrappers
145
+ │   │   └── fetch_client.py
146
+ │   ├── templates/                   # Code templates
147
+ │   │   ├── mcp_server_template.py.jinja
148
+ │   │   └── readme_template.md.jinja
149
+ │   ├── hosted_mcps/                 # Generated MCPs
150
+ │   ├── mcp_host.py                  # MCP hosting manager
151
+ │   └── config.py                    # Configuration
152
+ ├── requirements.txt
153
+ └── README.md
154
+ ```
155
+
156
+ ## 🤝 Contributing
157
+
158
+ This is a hackathon project, but contributions are welcome! Feel free to:
159
+
160
+ - Report bugs
161
+ - Suggest features
162
+ - Submit pull requests
163
+ - Share your generated MCPs!
164
+
165
+ ## 📄 License
166
+
167
+ MIT License - See LICENSE file for details
168
+
169
+ ## 🙏 Acknowledgments
170
+
171
+ - [Anthropic](https://anthropic.com) for Claude and MCP
172
+ - [Gradio](https://gradio.app) for the amazing UI framework
173
+ - [LangGraph](https://github.com/langchain-ai/langgraph) for agent orchestration
174
+ - The entire MCP community! 🎂
175
+
176
+ ---
177
+
178
+ **Made with ❤️ for the MCP 1st Birthday Hackathon**
179
+
180
+ 🔗 [HuggingFace Space](https://huggingface.co/spaces/MCP-1st-Birthday/mcp-generator) | 🔗 [GitHub](https://github.com/visprogithub/MCP_Generator_Agent)
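The "How to Use" steps above end with adding the connection config to Claude Desktop. As an illustration only (the config path below is the macOS default and the MCP ID is hypothetical), a small helper that merges the generated entry into `claude_desktop_config.json` could look like this:

```python
# Sketch only: merge a generated connection entry into Claude Desktop's config.
# The path is the macOS default (an assumption) and the MCP ID is made up;
# point "args" at the server.py extracted from the downloaded ZIP.
import json
from pathlib import Path

config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
entry = {
    "github-api-1a2b3c4d": {                    # hypothetical MCP ID from the generator
        "command": "python",
        "args": ["/path/to/extracted/server.py"],
    }
}

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {}).update(entry)
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
print(f"Registered MCP server entry in {config_path}")
```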
app.py ADDED
@@ -0,0 +1,365 @@
1
+ """
2
+ MCP Generator - Gradio Frontend
3
+ Turn any API into an MCP server in seconds!
4
+ """
5
+
6
+ import asyncio
7
+ import gradio as gr
8
+ from pathlib import Path
9
+ import zipfile
10
+ import io
11
+ import os
12
+
13
+ from src.agents.factory import AgentFactory
14
+ from src.mcp_host import mcp_host
15
+ from src.mcp_registry import mcp_registry
16
+ from src.config import LLM_PROVIDER, ANTHROPIC_API_KEY, OPENAI_API_KEY, HOSTED_MCPS_DIR
17
+
18
+
19
+ # Initialize agent factory
20
+ try:
21
+ agent_factory = AgentFactory()
22
+ print(f"✅ Using {LLM_PROVIDER.upper()} for code generation")
23
+ except ValueError as e:
24
+ print(f"❌ Error: {e}")
25
+ print(f"Please set {'ANTHROPIC_API_KEY' if LLM_PROVIDER == 'anthropic' else 'OPENAI_API_KEY'} in .env file")
26
+ agent_factory = None
27
+
28
+
29
+ async def generate_and_host_mcp(api_url: str, api_key: str = None, force_regenerate: bool = False, progress=gr.Progress()):
30
+ """Generate and host an MCP server
31
+
32
+ Args:
33
+ api_url: The API URL to analyze
34
+ api_key: Optional API key for the target API
35
+ force_regenerate: If True, regenerate even if exists
36
+ progress: Gradio progress tracker
37
+
38
+ Returns:
39
+ Tuple of (status_text, code, download_file, readme, connection_config)
40
+ """
41
+ if not agent_factory:
42
+ api_key_name = "ANTHROPIC_API_KEY" if LLM_PROVIDER == "anthropic" else "OPENAI_API_KEY"
43
+ return (
44
+ f"❌ Error: {api_key_name} not configured. Please set it in your .env file.",
45
+ "",
46
+ None,
47
+ "",
48
+ ""
49
+ )
50
+
51
+ try:
52
+ # Check if MCP already exists for this URL
53
+ existing_mcp = mcp_registry.find_by_url(api_url)
54
+
55
+ if existing_mcp and not force_regenerate:
56
+ progress(0.5, desc="Found existing MCP, reusing...")
57
+
58
+ # Reuse existing MCP
59
+ mcp_id = existing_mcp['mcp_id']
60
+ mcp_path = HOSTED_MCPS_DIR / mcp_id
61
+
62
+ # Update last used timestamp
63
+ mcp_registry.update_last_used(api_url)
64
+
65
+ # Load existing files
66
+ server_code = (mcp_path / "server.py").read_text()
67
+ readme = (mcp_path / "README.md").read_text()
68
+
69
+ status_text = f"""♻️ **Reusing Existing MCP!**
70
+
71
+ **MCP ID:** `{mcp_id}`
72
+ **Originally Created:** {existing_mcp['created_at']}
73
+ **Last Used:** {existing_mcp['last_used']}
74
+
75
+ This MCP was already generated for this API URL. Using existing version to save time and API calls!
76
+
77
+ 💡 **Tip:** To regenerate from scratch, check "Force Regenerate" below.
78
+ """
79
+
80
+ # Create ZIP file
81
+ zip_buffer = io.BytesIO()
82
+ with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zipf:
83
+ for file_path in mcp_path.rglob('*'):
84
+ if file_path.is_file():
85
+ arcname = file_path.relative_to(mcp_path.parent)
86
+ zipf.write(file_path, arcname)
87
+
88
+ zip_buffer.seek(0)
89
+ temp_zip = f"/tmp/{mcp_id}.zip"
90
+ with open(temp_zip, 'wb') as f:
91
+ f.write(zip_buffer.read())
92
+
93
+ connection_config = f"""{{
94
+ "mcpServers": {{
95
+ "{mcp_id}": {{
96
+ "command": "python",
97
+ "args": ["server.py"]
98
+ }}
99
+ }}
100
+ }}"""
101
+
102
+ return (
103
+ status_text,
104
+ server_code,
105
+ temp_zip,
106
+ readme,
107
+ connection_config
108
+ )
109
+
110
+ # Generate new MCP
111
+ if existing_mcp:
112
+ progress(0.1, desc="Regenerating MCP (forced)...")
113
+ else:
114
+ progress(0.1, desc="Analyzing API...")
115
+
116
+ # Generate the MCP
117
+ result = await agent_factory.generate_mcp(api_url, api_key)
118
+
119
+ if result["status"] == "error":
120
+ return (
121
+ f"❌ Error: {result['error']}",
122
+ "",
123
+ None,
124
+ "",
125
+ ""
126
+ )
127
+
128
+ progress(0.6, desc="Generating code...")
129
+
130
+ # Get the generated files
131
+ mcp_id = result["mcp_id"]
132
+ server_code = result["server_code"]
133
+ readme = result["readme_content"]
134
+
135
+ progress(0.8, desc="Starting MCP server...")
136
+
137
+ # Start the MCP server
138
+ start_result = await mcp_host.start_mcp(mcp_id)
139
+
140
+ if not start_result["success"]:
141
+ status_text = f"⚠️ MCP generated but failed to start: {start_result.get('error')}"
142
+ else:
143
+ status_text = f"""✅ **MCP Server Running!**
144
+
145
+ **MCP ID:** `{mcp_id}`
146
+ **Status:** {start_result['status']}
147
+ **Connection:** stdio (local)
148
+
149
+ Your MCP server is generated and ready to use!
150
+ """
151
+
152
+ progress(0.9, desc="Creating download package...")
153
+
154
+ # Create ZIP file for download
155
+ zip_buffer = io.BytesIO()
156
+ mcp_path = Path(result["download_path"])
157
+
158
+ with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zipf:
159
+ for file_path in mcp_path.rglob('*'):
160
+ if file_path.is_file():
161
+ arcname = file_path.relative_to(mcp_path.parent)
162
+ zipf.write(file_path, arcname)
163
+
164
+ zip_buffer.seek(0)
165
+
166
+ # Save to temp file for Gradio
167
+ temp_zip = f"/tmp/{mcp_id}.zip"
168
+ with open(temp_zip, 'wb') as f:
169
+ f.write(zip_buffer.read())
170
+
171
+ # Create connection config
172
+ connection_config = f"""{{
173
+ "mcpServers": {{
174
+ "{mcp_id}": {{
175
+ "command": "python",
176
+ "args": ["server.py"]
177
+ }}
178
+ }}
179
+ }}"""
180
+
181
+ progress(1.0, desc="Done!")
182
+
183
+ return (
184
+ status_text,
185
+ server_code,
186
+ temp_zip,
187
+ readme,
188
+ connection_config
189
+ )
190
+
191
+ except Exception as e:
192
+ return (
193
+ f"❌ Unexpected error: {str(e)}",
194
+ "",
195
+ None,
196
+ "",
197
+ ""
198
+ )
199
+
200
+
201
+ # Build Gradio Interface
202
+ with gr.Blocks(
203
+ title="MCP Generator",
204
+ theme=gr.themes.Soft(),
205
+ css="""
206
+ .gradio-container {max-width: 1200px !important}
207
+ .output-box {border: 2px solid #4CAF50; border-radius: 8px; padding: 16px;}
208
+ """
209
+ ) as app:
210
+
211
+ gr.Markdown("""
212
+ # 🤖 MCP Generator
213
+ ## Turn Any API into an MCP Server in Seconds!
214
+
215
+ Simply enter an API URL and we'll generate a complete, working MCP server with:
216
+ - ✅ Automatically analyzed endpoints
217
+ - ✅ Generated MCP tools
218
+ - ✅ Complete documentation
219
+ - ✅ Ready to use immediately!
220
+
221
+ **Built for the MCP 1st Birthday Hackathon** 🎉
222
+ """)
223
+
224
+ with gr.Row():
225
+ with gr.Column(scale=2):
226
+ gr.Markdown("### 📝 Input")
227
+
228
+ api_url = gr.Textbox(
229
+ label="API URL or Documentation URL",
230
+ placeholder="https://api.example.com",
231
+ info="Enter the base URL or documentation URL of the API"
232
+ )
233
+
234
+ api_key = gr.Textbox(
235
+ label="API Key (Optional)",
236
+ placeholder="sk-...",
237
+ type="password",
238
+ info="If the API requires authentication"
239
+ )
240
+
241
+ force_regenerate = gr.Checkbox(
242
+ label="Force Regenerate",
243
+ value=False,
244
+ info="Regenerate even if MCP already exists for this URL (saves API calls when unchecked)"
245
+ )
246
+
247
+ generate_btn = gr.Button(
248
+ "🚀 Generate & Host MCP Server",
249
+ variant="primary",
250
+ size="lg"
251
+ )
252
+
253
+ with gr.Accordion("📚 Examples & Tips", open=False):
254
+ gr.Markdown("""
255
+ **Try these APIs:**
256
+ - `https://jsonplaceholder.typicode.com` - Fake REST API (great for testing!)
257
+ - `https://api.github.com` - GitHub API
258
+ - `https://api.stripe.com` - Stripe API
259
+ - `https://api.openweathermap.org` - Weather API
260
+
261
+ **💡 Tips:**
262
+ - MCPs are cached by URL to save API calls
263
+ - Check "Force Regenerate" to create a fresh version
264
+ - Generated MCPs use stdio transport (works locally)
265
+ """)
266
+
267
+ gr.Markdown("---")
268
+
269
+ with gr.Row():
270
+ with gr.Column():
271
+ gr.Markdown("### 📊 Results")
272
+
273
+ status_output = gr.Markdown(label="Status")
274
+
275
+ with gr.Tab("Generated Code"):
276
+ code_output = gr.Code(
277
+ label="server.py",
278
+ language="python",
279
+ lines=20
280
+ )
281
+
282
+ with gr.Tab("README"):
283
+ readme_output = gr.Markdown()
284
+
285
+ with gr.Tab("Connection Config"):
286
+ connection_output = gr.Code(
287
+ label="Claude Desktop Config",
288
+ language="json"
289
+ )
290
+
291
+ download_output = gr.File(
292
+ label="📦 Download Complete Package (ZIP)"
293
+ )
294
+
295
+ # Wire up the button
296
+ generate_btn.click(
297
+ fn=generate_and_host_mcp,
298
+ inputs=[api_url, api_key, force_regenerate],
299
+ outputs=[
300
+ status_output,
301
+ code_output,
302
+ download_output,
303
+ readme_output,
304
+ connection_output
305
+ ]
306
+ )
307
+
308
+ with gr.Accordion("📋 Previously Generated MCPs", open=False):
309
+ def get_existing_mcps():
310
+ """Get list of existing MCPs for display"""
311
+ mcps = mcp_registry.list_all()
312
+ if not mcps:
313
+ return "No MCPs generated yet. Generate your first one above! 👆"
314
+
315
+ output = "| API Name | URL | Created | Last Used |\n"
316
+ output += "|----------|-----|---------|----------|\n"
317
+ for mcp in mcps[:10]: # Show last 10
318
+ api_name = mcp['api_name']
319
+ api_url = mcp['api_url'][:40] + "..." if len(mcp['api_url']) > 40 else mcp['api_url']
320
+ created = mcp['created_at'].split('T')[0]
321
+ last_used = mcp['last_used'].split('T')[0]
322
+ output += f"| {api_name} | {api_url} | {created} | {last_used} |\n"
323
+
324
+ return output
325
+
326
+ existing_mcps_display = gr.Markdown(get_existing_mcps())
327
+ refresh_btn = gr.Button("🔄 Refresh List", size="sm")
328
+ refresh_btn.click(fn=get_existing_mcps, outputs=existing_mcps_display)
329
+
330
+ gr.Markdown("""
331
+ ---
332
+ ### 🎯 How to Use Your Generated MCP
333
+
334
+ 1. **Download** the ZIP file above
335
+ 2. **Extract** it to a folder
336
+ 3. **Add** the connection config to your Claude Desktop settings
337
+ 4. **Restart** Claude Desktop
338
+
339
+ Your MCP server is ready to use! 🎉
340
+
341
+ ### 🚀 About This Project
342
+
343
+ This is a meta-MCP: an MCP server that generates other MCP servers!
344
+
345
+ - Built with [Gradio](https://gradio.app)
346
+ - Powered by [LangGraph](https://github.com/langchain-ai/langgraph) agents
347
+ - Uses [Anthropic's Claude](https://anthropic.com) for code generation
348
+ - Integrates with [MCP Fetch Server](https://github.com/modelcontextprotocol/servers)
349
+
350
+ **For MCP 1st Birthday Hackathon - Track 2: MCP in Action** 🎂
351
+ """)
352
+
353
+
354
+ if __name__ == "__main__":
355
+ # Check for API key
356
+ if not ANTHROPIC_API_KEY:
357
+ print("⚠️ WARNING: ANTHROPIC_API_KEY not set!")
358
+ print("Please create a .env file with your API key")
359
+ print("Example: echo 'ANTHROPIC_API_KEY=your_key_here' > .env")
360
+
361
+ app.launch(
362
+ server_name="0.0.0.0",
363
+ server_port=7860,
364
+ share=False
365
+ )
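For reference, the packaging step that `generate_and_host_mcp()` performs above (for both cached and freshly generated MCPs) boils down to zipping the generated directory; a standalone sketch with placeholder paths:

```python
# Sketch of the ZIP packaging used by generate_and_host_mcp(); paths are placeholders.
import zipfile
from pathlib import Path

def zip_mcp_dir(mcp_dir: Path, out_zip: Path) -> Path:
    """Write every file under mcp_dir into out_zip, keeping the folder name as a prefix."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zipf:
        for file_path in mcp_dir.rglob("*"):
            if file_path.is_file():
                # relative_to(parent) keeps "<mcp_id>/..." paths inside the archive
                zipf.write(file_path, file_path.relative_to(mcp_dir.parent))
    return out_zip

# zip_mcp_dir(Path("src/hosted_mcps/github-api-1a2b3c4d"), Path("/tmp/github-api-1a2b3c4d.zip"))
```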
requirements.txt ADDED
@@ -0,0 +1,11 @@
1
+ gradio==4.44.0
2
+ anthropic==0.39.0
3
+ openai==1.54.3
4
+ mcp==1.1.2
5
+ langgraph==0.2.45
6
+ jinja2==3.1.4
7
+ pydantic==2.9.2
8
+ python-dotenv==1.0.1
9
+ aiohttp==3.10.10
10
+ uvicorn==0.32.0
11
+ fastapi==0.115.4
src/__init__.py ADDED
@@ -0,0 +1,3 @@
1
+ """MCP Generator - Generate MCP servers from any API"""
2
+
3
+ __version__ = "0.1.0"
src/agents/__init__.py ADDED
@@ -0,0 +1,6 @@
1
+ """Agent factory for MCP generation"""
2
+
3
+ from .factory import AgentFactory
4
+ from .state import AgentState
5
+
6
+ __all__ = ["AgentFactory", "AgentState"]
src/agents/api_analyzer.py ADDED
@@ -0,0 +1,99 @@
1
+ """API Analyzer Agent - Analyzes API documentation and structure"""
2
+
3
+ import asyncio
4
+ import json
5
+ import re
6
+ from .state import AgentState
7
+ from ..mcp_clients.fetch_client import fetch_url
8
+ from ..llm_client import get_llm_client
9
+
10
+
11
+ async def analyze_api(state: AgentState) -> AgentState:
12
+ """Analyze the API and extract structure
13
+
14
+ Args:
15
+ state: Current agent state
16
+
17
+ Returns:
18
+ Updated state with API analysis
19
+ """
20
+ state["status"] = "analyzing_api"
21
+
22
+ try:
23
+ # Try to fetch API documentation
24
+ api_url = state["api_url"]
25
+
26
+ # Common patterns for API doc URLs
27
+ doc_urls = [
28
+ api_url,
29
+ f"{api_url}/docs",
30
+ f"{api_url}/api-docs",
31
+ f"{api_url}/swagger.json",
32
+ f"{api_url}/openapi.json",
33
+ ]
34
+
35
+ api_content = None
36
+ successful_url = None
37
+
38
+ # Try to fetch from potential doc URLs
39
+ for doc_url in doc_urls:
40
+ try:
41
+ result = await fetch_url(doc_url)
42
+ if result.get("success"):
43
+ api_content = result["content"]
44
+ successful_url = doc_url
45
+ break
46
+ except Exception:
47
+ continue
48
+
49
+ if not api_content:
50
+ # If we can't fetch docs, use Claude to infer structure
51
+ api_content = f"No documentation found for {api_url}"
52
+
53
+ state["api_docs"] = api_content
54
+ state["api_docs_url"] = successful_url
55
+
56
+ # Use LLM to analyze the API
57
+ llm = get_llm_client()
58
+
59
+ analysis_prompt = f"""Analyze this API and extract information:
60
+
61
+ API URL: {api_url}
62
+ Documentation: {api_content[:5000]} # Limit to avoid token limits
63
+
64
+ Please provide a JSON response with:
65
+ 1. api_name: A short name for this API
66
+ 2. api_description: Brief description of what the API does
67
+ 3. endpoints: List of 3-5 most useful endpoints with structure:
68
+ - path: endpoint path
69
+ - method: HTTP method
70
+ - description: what it does
71
+ - parameters: dict of parameter names and types
72
+ 4. auth_type: "bearer", "api_key", "basic", or "none"
73
+
74
+ Format as valid JSON only, no other text."""
75
+
76
+ try:
77
+ analysis = await llm.generate_json(analysis_prompt, max_tokens=2000)
78
+ except Exception:
79
+ # Fallback analysis if JSON parsing fails
80
+ analysis = {
81
+ "api_name": api_url.split("//")[-1].split("/")[0].replace(".", "-"),
82
+ "api_description": f"API at {api_url}",
83
+ "endpoints": [],
84
+ "auth_type": "api_key"
85
+ }
86
+
87
+ # Update state
88
+ state["api_name"] = analysis.get("api_name", "unknown-api")
89
+ state["api_description"] = analysis.get("api_description", "")
90
+ state["endpoints"] = analysis.get("endpoints", [])
91
+ state["auth_type"] = analysis.get("auth_type", "none")
92
+ state["status"] = "analyzed"
93
+
94
+ return state
95
+
96
+ except Exception as e:
97
+ state["status"] = "error"
98
+ state["error"] = f"Analysis failed: {str(e)}"
99
+ return state
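For orientation, this is roughly the shape of the analysis dict that `analyze_api()` expects back from `llm.generate_json()`, based on the prompt above; the values are illustrative only:

```python
# Illustrative only: the analysis dict analyze_api() expects back from generate_json(),
# matching the prompt above. Values are made up.
example_analysis = {
    "api_name": "jsonplaceholder",
    "api_description": "Fake REST API for testing and prototyping",
    "endpoints": [
        {
            "path": "/posts",
            "method": "GET",
            "description": "List all posts",
            "parameters": {"userId": "integer"},
        },
    ],
    "auth_type": "none",
}

# These fields are copied into the shared AgentState:
# state["api_name"], state["api_description"], state["endpoints"], state["auth_type"]
```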
src/agents/code_generator.py ADDED
@@ -0,0 +1,142 @@
1
+ """Code Generator Agent - Generates MCP server code"""
2
+
3
+ import asyncio
4
+ import json
5
+ import uuid
6
+ from datetime import datetime
7
+ from pathlib import Path
8
+ from jinja2 import Environment, FileSystemLoader
9
+ import re
10
+
11
+ from .state import AgentState
12
+ from ..config import TEMPLATES_DIR, HOSTED_MCPS_DIR
13
+ from ..mcp_registry import mcp_registry
14
+ from ..llm_client import get_llm_client
15
+
16
+
17
+ async def generate_code(state: AgentState) -> AgentState:
18
+ """Generate MCP server code from API analysis
19
+
20
+ Args:
21
+ state: Current agent state
22
+
23
+ Returns:
24
+ Updated state with generated code
25
+ """
26
+ state["status"] = "generating_code"
27
+
28
+ try:
29
+ # Generate unique MCP ID
30
+ mcp_id = f"{state['api_name']}-{uuid.uuid4().hex[:8]}"
31
+ state["mcp_id"] = mcp_id
32
+
33
+ # Use LLM to generate detailed tool specifications
34
+ llm = get_llm_client()
35
+
36
+ code_gen_prompt = f"""Generate Python MCP tools for this API:
37
+
38
+ API Name: {state['api_name']}
39
+ Description: {state['api_description']}
40
+ Base URL: {state['api_url']}
41
+ Auth Type: {state['auth_type']}
42
+ Endpoints: {json.dumps(state['endpoints'], indent=2)}
43
+
44
+ For each endpoint, create a tool specification with:
45
+ - name: snake_case function name
46
+ - description: what the tool does
47
+ - function_name: Python function name
48
+ - endpoint: full URL path
49
+ - method: HTTP method (GET/POST)
50
+ - parameters: dict of param names to schema (type, description)
51
+ - required: list of required param names
52
+ - headers: dict of headers needed (include auth if needed)
53
+
54
+ Provide response as valid JSON array of tool objects.
55
+ Only include JSON, no other text."""
56
+
57
+ try:
58
+ # Try to get JSON response
59
+ tools_text = await llm.generate(code_gen_prompt, max_tokens=3000)
60
+ json_match = re.search(r'\[.*\]', tools_text, re.DOTALL)
61
+
62
+ if json_match:
63
+ tools = json.loads(json_match.group())
64
+ else:
65
+ raise ValueError("No JSON array found")
66
+ except Exception:
67
+ # Fallback to basic tools
68
+ tools = [{
69
+ "name": f"{state['api_name']}_api_call",
70
+ "description": f"Make a call to {state['api_name']} API",
71
+ "function_name": "api_call",
72
+ "endpoint": state['api_url'],
73
+ "method": "GET",
74
+ "parameters": {},
75
+ "required": [],
76
+ "headers": {}
77
+ }]
78
+
79
+ # Setup Jinja2 environment
80
+ env = Environment(loader=FileSystemLoader(TEMPLATES_DIR))
81
+
82
+ # Generate server code
83
+ server_template = env.get_template("mcp_server_template.py.jinja")
84
+ server_code = server_template.render(
85
+ api_name=state['api_name'],
86
+ api_url=state['api_url'],
87
+ server_name=mcp_id,
88
+ timestamp=datetime.now().isoformat(),
89
+ tools=tools
90
+ )
91
+
92
+ # Generate README
93
+ readme_template = env.get_template("readme_template.md.jinja")
94
+ readme_content = readme_template.render(
95
+ api_name=state['api_name'],
96
+ api_url=state['api_url'],
97
+ server_name=mcp_id,
98
+ timestamp=datetime.now().isoformat(),
99
+ tools=tools,
100
+ hosted_url=f"http://localhost:8000/mcps/{mcp_id}", # Will be updated when hosted
101
+ auth_type=state['auth_type'],
102
+ rate_limits="Check API documentation"
103
+ )
104
+
105
+ # Generate requirements.txt
106
+ requirements = """mcp>=1.1.0
107
+ httpx>=0.27.0
109
+ """
110
+
111
+ # Save to hosted_mcps directory
112
+ mcp_dir = HOSTED_MCPS_DIR / mcp_id
113
+ mcp_dir.mkdir(exist_ok=True)
114
+
115
+ (mcp_dir / "server.py").write_text(server_code)
116
+ (mcp_dir / "README.md").write_text(readme_content)
117
+ (mcp_dir / "requirements.txt").write_text(requirements)
118
+
119
+ # Register in the MCP registry
120
+ mcp_registry.register(
121
+ api_url=state['api_url'],
122
+ mcp_id=mcp_id,
123
+ api_name=state['api_name'],
124
+ metadata={
125
+ 'auth_type': state['auth_type'],
126
+ 'endpoints_count': len(state['endpoints'])
127
+ }
128
+ )
129
+
130
+ # Update state
131
+ state["server_code"] = server_code
132
+ state["readme_content"] = readme_content
133
+ state["requirements"] = requirements
134
+ state["download_path"] = str(mcp_dir)
135
+ state["status"] = "completed"
136
+
137
+ return state
138
+
139
+ except Exception as e:
140
+ state["status"] = "error"
141
+ state["error"] = f"Code generation failed: {str(e)}"
142
+ return state
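For orientation, one tool specification in the format `generate_code()` asks the LLM for and then feeds into `mcp_server_template.py.jinja` might look like this (field values are made up):

```python
# Illustrative only: one tool specification in the shape generate_code() requests
# from the LLM and then renders through the Jinja template. Values are made up.
example_tool = {
    "name": "get_posts",
    "description": "List posts from the API",
    "function_name": "get_posts",
    "endpoint": "https://jsonplaceholder.typicode.com/posts",
    "method": "GET",
    "parameters": {
        "userId": {"type": "integer", "description": "Filter posts by user ID"},
    },
    "required": [],
    "headers": {},
}
```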
src/agents/factory.py ADDED
@@ -0,0 +1,81 @@
1
+ """Agent Factory - Orchestrates MCP generation using LangGraph"""
2
+
3
+ import asyncio
4
+ from typing import Dict, Any
5
+ from langgraph.graph import StateGraph, END
6
+
7
+ from .state import AgentState
8
+ from .api_analyzer import analyze_api
9
+ from .code_generator import generate_code
10
+ from ..config import LLM_PROVIDER, ANTHROPIC_API_KEY, OPENAI_API_KEY
11
+
12
+
13
+ class AgentFactory:
14
+ """Factory for creating and running MCP generation agents"""
15
+
16
+ def __init__(self):
17
+ """Initialize the agent factory"""
18
+ # Check that we have at least one API key configured
19
+ if LLM_PROVIDER == "anthropic" and not ANTHROPIC_API_KEY:
20
+ raise ValueError("ANTHROPIC_API_KEY is required when LLM_PROVIDER=anthropic")
21
+ elif LLM_PROVIDER == "openai" and not OPENAI_API_KEY:
22
+ raise ValueError("OPENAI_API_KEY is required when LLM_PROVIDER=openai")
23
+
24
+ self.graph = self._build_graph()
25
+
26
+ def _build_graph(self) -> StateGraph:
27
+ """Build the LangGraph workflow"""
28
+
29
+ # Create the graph
30
+ workflow = StateGraph(AgentState)
31
+
32
+ # Add nodes (agents)
33
+ workflow.add_node("analyze_api", analyze_api)
34
+ workflow.add_node("generate_code", generate_code)
35
+
36
+ # Define edges (workflow)
37
+ workflow.set_entry_point("analyze_api")
38
+ workflow.add_edge("analyze_api", "generate_code")
39
+ workflow.add_edge("generate_code", END)
40
+
41
+ return workflow.compile()
42
+
43
+ async def generate_mcp(self, api_url: str, api_key: str = None) -> AgentState:
44
+ """Generate an MCP server from an API URL
45
+
46
+ Args:
47
+ api_url: The API URL to analyze
48
+ api_key: Optional API key for the target API
49
+
50
+ Returns:
51
+ Final agent state with generated MCP
52
+ """
53
+
54
+ # Initial state
55
+ initial_state: AgentState = {
56
+ "api_url": api_url,
57
+ "api_key": api_key,
58
+ "api_docs": None,
59
+ "api_docs_url": None,
60
+ "api_name": "",
61
+ "api_description": "",
62
+ "endpoints": [],
63
+ "auth_type": "none",
64
+ "server_code": "",
65
+ "readme_content": "",
66
+ "requirements": "",
67
+ "mcp_id": "",
68
+ "hosted_url": "",
69
+ "download_path": "",
70
+ "status": "initializing",
71
+ "error": None
72
+ }
73
+
74
+ # Run the graph
75
+ try:
76
+ final_state = await self.graph.ainvoke(initial_state)
77
+ return final_state
78
+ except Exception as e:
79
+ initial_state["status"] = "failed"
80
+ initial_state["error"] = str(e)
81
+ return initial_state
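A minimal usage sketch for `AgentFactory`, assuming the API key for the configured `LLM_PROVIDER` is set in `.env`:

```python
# Minimal usage sketch; run from the project root so the src package imports resolve.
import asyncio
from src.agents.factory import AgentFactory

async def main():
    factory = AgentFactory()
    state = await factory.generate_mcp("https://jsonplaceholder.typicode.com")
    if state["status"] == "completed":
        print("Generated", state["mcp_id"], "->", state["download_path"])
    else:
        print("Failed:", state["error"])

asyncio.run(main())
```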
src/agents/state.py ADDED
@@ -0,0 +1,35 @@
1
+ """Agent state definitions"""
2
+
3
+ from typing import TypedDict, Optional, List, Dict, Any
4
+
5
+
6
+ class AgentState(TypedDict):
7
+ """State shared across all agents"""
8
+
9
+ # Input
10
+ api_url: str
11
+ api_key: Optional[str]
12
+
13
+ # Fetched data
14
+ api_docs: Optional[str]
15
+ api_docs_url: Optional[str]
16
+
17
+ # Analysis
18
+ api_name: str
19
+ api_description: str
20
+ endpoints: List[Dict[str, Any]]
21
+ auth_type: str
22
+
23
+ # Generated code
24
+ server_code: str
25
+ readme_content: str
26
+ requirements: str
27
+
28
+ # Output
29
+ mcp_id: str
30
+ hosted_url: str
31
+ download_path: str
32
+
33
+ # Status
34
+ status: str
35
+ error: Optional[str]
src/config.py ADDED
@@ -0,0 +1,28 @@
1
+ """Configuration management"""
2
+ import os
3
+ from pathlib import Path
4
+ from dotenv import load_dotenv
5
+
6
+ # Load environment variables
7
+ load_dotenv()
8
+
9
+ # Project paths
10
+ PROJECT_ROOT = Path(__file__).parent.parent
11
+ TEMPLATES_DIR = PROJECT_ROOT / "src" / "templates"
12
+ HOSTED_MCPS_DIR = PROJECT_ROOT / "src" / "hosted_mcps"
13
+
14
+ # LLM Configuration
15
+ LLM_PROVIDER = os.getenv("LLM_PROVIDER", "openai").lower() # "anthropic" or "openai"
16
+
17
+ # API Keys
18
+ ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY")
19
+ OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
20
+ GITHUB_TOKEN = os.getenv("GITHUB_TOKEN")
21
+
22
+ # MCP Server settings
23
+ MCP_HOST = "0.0.0.0"
24
+ MCP_PORT_RANGE_START = 8100
25
+ MCP_PORT_RANGE_END = 8200
26
+
27
+ # Ensure directories exist
28
+ HOSTED_MCPS_DIR.mkdir(exist_ok=True)
src/llm_client.py ADDED
@@ -0,0 +1,93 @@
1
+ """Unified LLM client supporting multiple providers"""
2
+
3
+ import json
4
+ import re
5
+ from typing import Optional
6
+ from anthropic import Anthropic
7
+ from openai import OpenAI
8
+
9
+ from .config import LLM_PROVIDER, ANTHROPIC_API_KEY, OPENAI_API_KEY
10
+
11
+
12
+ class LLMClient:
13
+ """Unified client for different LLM providers"""
14
+
15
+ def __init__(self, provider: str = None):
16
+ """Initialize LLM client
17
+
18
+ Args:
19
+ provider: "anthropic" or "openai". If None, uses config default.
20
+ """
21
+ self.provider = provider or LLM_PROVIDER
22
+
23
+ if self.provider == "anthropic":
24
+ if not ANTHROPIC_API_KEY:
25
+ raise ValueError("ANTHROPIC_API_KEY not set")
26
+ self.client = Anthropic(api_key=ANTHROPIC_API_KEY)
27
+ self.model = "claude-3-5-sonnet-20241022"
28
+
29
+ elif self.provider == "openai":
30
+ if not OPENAI_API_KEY:
31
+ raise ValueError("OPENAI_API_KEY not set")
32
+ self.client = OpenAI(api_key=OPENAI_API_KEY)
33
+ self.model = "gpt-4o" # or "gpt-4o-mini" for cheaper
34
+
35
+ else:
36
+ raise ValueError(f"Unknown provider: {self.provider}")
37
+
38
+ async def generate(self, prompt: str, max_tokens: int = 2000) -> str:
39
+ """Generate text from prompt
40
+
41
+ Args:
42
+ prompt: The prompt to send
43
+ max_tokens: Maximum tokens to generate
44
+
45
+ Returns:
46
+ Generated text
47
+ """
48
+ if self.provider == "anthropic":
49
+ response = self.client.messages.create(
50
+ model=self.model,
51
+ max_tokens=max_tokens,
52
+ messages=[{
53
+ "role": "user",
54
+ "content": prompt
55
+ }]
56
+ )
57
+ return response.content[0].text
58
+
59
+ elif self.provider == "openai":
60
+ response = self.client.chat.completions.create(
61
+ model=self.model,
62
+ max_tokens=max_tokens,
63
+ messages=[{
64
+ "role": "user",
65
+ "content": prompt
66
+ }]
67
+ )
68
+ return response.choices[0].message.content
69
+
70
+ async def generate_json(self, prompt: str, max_tokens: int = 3000) -> dict:
71
+ """Generate JSON from prompt
72
+
73
+ Args:
74
+ prompt: The prompt to send
75
+ max_tokens: Maximum tokens to generate
76
+
77
+ Returns:
78
+ Parsed JSON dict
79
+ """
80
+ text = await self.generate(prompt, max_tokens)
81
+
82
+ # Extract JSON from response (might be wrapped in markdown)
83
+ json_match = re.search(r'\{.*\}|\[.*\]', text, re.DOTALL)
84
+ if json_match:
85
+ return json.loads(json_match.group())
86
+ else:
87
+ raise ValueError(f"No valid JSON found in response: {text[:200]}")
88
+
89
+
90
+ # Global instance
91
+ def get_llm_client() -> LLMClient:
92
+ """Get configured LLM client"""
93
+ return LLMClient()
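A minimal usage sketch for the unified client; `generate_json()` pulls the first `{...}` or `[...]` block out of the model reply and raises if none is found:

```python
# Minimal usage sketch, assuming ANTHROPIC_API_KEY or OPENAI_API_KEY is configured.
import asyncio
from src.llm_client import get_llm_client

async def main():
    llm = get_llm_client()
    data = await llm.generate_json(
        "Return a JSON object with keys 'name' and 'creator' describing Python. JSON only."
    )
    print(data)

asyncio.run(main())
```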
src/mcp_clients/__init__.py ADDED
@@ -0,0 +1,5 @@
1
+ """MCP client implementations"""
2
+
3
+ from .fetch_client import FetchMCPClient, fetch_url
4
+
5
+ __all__ = ["FetchMCPClient", "fetch_url"]
src/mcp_clients/fetch_client.py ADDED
@@ -0,0 +1,86 @@
1
+ """Fetch MCP Client - Wrapper for @modelcontextprotocol/server-fetch"""
2
+
3
+ import asyncio
4
+ import json
5
+ from typing import Optional
6
+ from mcp import ClientSession, StdioServerParameters
7
+ from mcp.client.stdio import stdio_client
8
+
9
+
10
+ class FetchMCPClient:
11
+ """Client for interacting with the Fetch MCP server"""
12
+
13
+ def __init__(self):
14
+ self.session: Optional[ClientSession] = None
15
+ self.read_stream = None
16
+ self.write_stream = None
17
+
18
+ async def connect(self):
19
+ """Connect to the Fetch MCP server"""
20
+ server_params = StdioServerParameters(
21
+ command="npx",
22
+ args=["-y", "@modelcontextprotocol/server-fetch"],
23
+ env=None
24
+ )
25
+
26
+ self.read_stream, self.write_stream = await stdio_client(server_params)
27
+ self.session = ClientSession(self.read_stream, self.write_stream)
28
+ await self.session.initialize()
29
+
30
+ async def disconnect(self):
31
+ """Disconnect from the MCP server"""
32
+ if self.session:
33
+ await self.session.__aexit__(None, None, None)
34
+
35
+ async def fetch_url(self, url: str) -> dict:
36
+ """Fetch content from a URL
37
+
38
+ Args:
39
+ url: The URL to fetch
40
+
41
+ Returns:
42
+ dict with 'content' and 'metadata'
43
+ """
44
+ if not self.session:
45
+ await self.connect()
46
+
47
+ result = await self.session.call_tool("fetch", arguments={"url": url})
48
+
49
+ # Parse the result
50
+ if result and result.content:
51
+ content = result.content[0].text if result.content else ""
52
+ return {
53
+ "content": content,
54
+ "url": url,
55
+ "success": True
56
+ }
57
+
58
+ return {
59
+ "content": "",
60
+ "url": url,
61
+ "success": False,
62
+ "error": "Failed to fetch content"
63
+ }
64
+
65
+ async def __aenter__(self):
66
+ """Async context manager entry"""
67
+ await self.connect()
68
+ return self
69
+
70
+ async def __aexit__(self, exc_type, exc_val, exc_tb):
71
+ """Async context manager exit"""
72
+ await self.disconnect()
73
+
74
+
75
+ # Convenience function
76
+ async def fetch_url(url: str) -> dict:
77
+ """Fetch content from a URL using Fetch MCP
78
+
79
+ Args:
80
+ url: The URL to fetch
81
+
82
+ Returns:
83
+ dict with content and metadata
84
+ """
85
+ async with FetchMCPClient() as client:
86
+ return await client.fetch_url(url)
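Note that in the MCP Python SDK documentation, `stdio_client()` and `ClientSession` are used as async context managers rather than awaited directly, so `connect()` above may need adapting to that pattern. A minimal sketch of the documented usage, reusing the same server parameters:

```python
# Sketch of the context-manager pattern from the MCP Python SDK docs,
# using the same Fetch MCP server parameters as FetchMCPClient above.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def fetch_once(url: str) -> str:
    server_params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-fetch"],
    )
    async with stdio_client(server_params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.call_tool("fetch", arguments={"url": url})
            return result.content[0].text if result.content else ""

print(asyncio.run(fetch_once("https://jsonplaceholder.typicode.com"))[:200])
```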
src/mcp_host.py ADDED
@@ -0,0 +1,175 @@
1
+ """MCP Host Manager - Runs generated MCP servers in our Space"""
2
+
3
+ import asyncio
4
+ import subprocess
5
+ from pathlib import Path
6
+ from typing import Dict, Optional
9
+
10
+ from .config import HOSTED_MCPS_DIR
11
+
12
+
13
+ class MCPHostManager:
14
+ """Manages running MCP server processes"""
15
+
16
+ def __init__(self):
17
+ self.running_mcps: Dict[str, subprocess.Popen] = {}
18
+ self.mcp_ports: Dict[str, int] = {}
19
+
20
+ async def start_mcp(self, mcp_id: str) -> dict:
21
+ """Start a hosted MCP server
22
+
23
+ Args:
24
+ mcp_id: The MCP server ID
25
+
26
+ Returns:
27
+ dict with status and connection info
28
+ """
29
+ mcp_path = HOSTED_MCPS_DIR / mcp_id / "server.py"
30
+
31
+ if not mcp_path.exists():
32
+ return {
33
+ "success": False,
34
+ "error": f"MCP {mcp_id} not found"
35
+ }
36
+
37
+ # Check if already running
38
+ if mcp_id in self.running_mcps:
39
+ proc = self.running_mcps[mcp_id]
40
+ if proc.poll() is None: # Still running
41
+ return {
42
+ "success": True,
43
+ "status": "already_running",
44
+ "mcp_id": mcp_id,
45
+ "connection": "stdio"
46
+ }
47
+
48
+ try:
49
+ # Start the MCP server as a subprocess
50
+ proc = subprocess.Popen(
51
+ ["python", str(mcp_path)],
52
+ stdin=subprocess.PIPE,
53
+ stdout=subprocess.PIPE,
54
+ stderr=subprocess.PIPE,
55
+ cwd=str(mcp_path.parent)
56
+ )
57
+
58
+ self.running_mcps[mcp_id] = proc
59
+
60
+ return {
61
+ "success": True,
62
+ "status": "started",
63
+ "mcp_id": mcp_id,
64
+ "pid": proc.pid,
65
+ "connection": "stdio"
66
+ }
67
+
68
+ except Exception as e:
69
+ return {
70
+ "success": False,
71
+ "error": str(e)
72
+ }
73
+
74
+ async def stop_mcp(self, mcp_id: str) -> dict:
75
+ """Stop a running MCP server
76
+
77
+ Args:
78
+ mcp_id: The MCP server ID
79
+
80
+ Returns:
81
+ dict with status
82
+ """
83
+ if mcp_id not in self.running_mcps:
84
+ return {
85
+ "success": False,
86
+ "error": "MCP not running"
87
+ }
88
+
89
+ proc = self.running_mcps[mcp_id]
90
+
91
+ try:
92
+ # Try graceful shutdown first
93
+ proc.terminate()
94
+ try:
95
+ proc.wait(timeout=5)
96
+ except subprocess.TimeoutExpired:
97
+ # Force kill if necessary
98
+ proc.kill()
99
+ proc.wait()
100
+
101
+ del self.running_mcps[mcp_id]
102
+
103
+ return {
104
+ "success": True,
105
+ "status": "stopped",
106
+ "mcp_id": mcp_id
107
+ }
108
+
109
+ except Exception as e:
110
+ return {
111
+ "success": False,
112
+ "error": str(e)
113
+ }
114
+
115
+ def get_status(self, mcp_id: str) -> dict:
116
+ """Get status of an MCP server
117
+
118
+ Args:
119
+ mcp_id: The MCP server ID
120
+
121
+ Returns:
122
+ dict with status info
123
+ """
124
+ if mcp_id not in self.running_mcps:
125
+ return {
126
+ "running": False,
127
+ "mcp_id": mcp_id
128
+ }
129
+
130
+ proc = self.running_mcps[mcp_id]
131
+
132
+ if proc.poll() is not None:
133
+ # Process ended
134
+ del self.running_mcps[mcp_id]
135
+ return {
136
+ "running": False,
137
+ "mcp_id": mcp_id,
138
+ "exit_code": proc.returncode
139
+ }
140
+
141
+ return {
142
+ "running": True,
143
+ "mcp_id": mcp_id,
144
+ "pid": proc.pid
145
+ }
146
+
147
+ def list_all(self) -> list:
148
+ """List all hosted MCPs
149
+
150
+ Returns:
151
+ List of MCP info dicts
152
+ """
153
+ mcps = []
154
+
155
+ for mcp_dir in HOSTED_MCPS_DIR.iterdir():
156
+ if mcp_dir.is_dir():
157
+ mcp_id = mcp_dir.name
158
+ status = self.get_status(mcp_id)
159
+
160
+ mcps.append({
161
+ "mcp_id": mcp_id,
162
+ "path": str(mcp_dir),
163
+ "running": status["running"]
164
+ })
165
+
166
+ return mcps
167
+
168
+ async def cleanup(self):
169
+ """Stop all running MCPs"""
170
+ for mcp_id in list(self.running_mcps.keys()):
171
+ await self.stop_mcp(mcp_id)
172
+
173
+
174
+ # Global instance
175
+ mcp_host = MCPHostManager()
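A minimal sketch of driving the host manager directly; the MCP ID is hypothetical and must correspond to a directory under `src/hosted_mcps/` containing a `server.py`:

```python
# Minimal usage sketch; the MCP ID is made up for illustration.
import asyncio
from src.mcp_host import mcp_host

async def main():
    print(await mcp_host.start_mcp("github-api-1a2b3c4d"))   # {"success": True, "status": "started", ...}
    print(mcp_host.get_status("github-api-1a2b3c4d"))        # {"running": True, "pid": ...}
    print(await mcp_host.stop_mcp("github-api-1a2b3c4d"))

asyncio.run(main())
```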
src/mcp_registry.py ADDED
@@ -0,0 +1,167 @@
1
+ """MCP Registry - Track generated MCPs to avoid duplicates"""
2
+
3
+ import json
4
+ from pathlib import Path
5
+ from typing import Optional, Dict, List
6
+ from datetime import datetime
7
+ import hashlib
8
+
9
+ from .config import HOSTED_MCPS_DIR
10
+
11
+
12
+ class MCPRegistry:
13
+ """Registry for tracking generated MCPs"""
14
+
15
+ def __init__(self):
16
+ self.registry_file = HOSTED_MCPS_DIR / "registry.json"
17
+ self.registry: Dict[str, dict] = {}
18
+ self._load()
19
+
20
+ def _load(self):
21
+ """Load registry from disk"""
22
+ if self.registry_file.exists():
23
+ try:
24
+ with open(self.registry_file, 'r') as f:
25
+ self.registry = json.load(f)
26
+ except Exception:
27
+ self.registry = {}
28
+ else:
29
+ self.registry = {}
30
+
31
+ def _save(self):
32
+ """Save registry to disk"""
33
+ self.registry_file.parent.mkdir(exist_ok=True)
34
+ with open(self.registry_file, 'w') as f:
35
+ json.dump(self.registry, indent=2, fp=f)
36
+
37
+ def _url_hash(self, api_url: str) -> str:
38
+ """Generate a consistent hash for an API URL"""
39
+ # Normalize URL (remove trailing slashes, lowercase)
40
+ normalized = api_url.lower().rstrip('/')
41
+ return hashlib.md5(normalized.encode()).hexdigest()
42
+
43
+ def find_by_url(self, api_url: str) -> Optional[dict]:
44
+ """Find an existing MCP for a given API URL
45
+
46
+ Args:
47
+ api_url: The API URL to search for
48
+
49
+ Returns:
50
+ MCP info dict if found, None otherwise
51
+ """
52
+ url_hash = self._url_hash(api_url)
53
+
54
+ if url_hash in self.registry:
55
+ mcp_info = self.registry[url_hash]
56
+
57
+ # Check if the MCP directory still exists
58
+ mcp_path = HOSTED_MCPS_DIR / mcp_info['mcp_id']
59
+ if mcp_path.exists():
60
+ return mcp_info
61
+ else:
62
+ # Clean up stale entry
63
+ del self.registry[url_hash]
64
+ self._save()
65
+
66
+ return None
67
+
68
+ def register(self, api_url: str, mcp_id: str, api_name: str, metadata: dict = None) -> dict:
69
+ """Register a new MCP
70
+
71
+ Args:
72
+ api_url: The API URL
73
+ mcp_id: The generated MCP ID
74
+ api_name: The API name
75
+ metadata: Optional metadata dict
76
+
77
+ Returns:
78
+ The registered MCP info
79
+ """
80
+ url_hash = self._url_hash(api_url)
81
+
82
+ mcp_info = {
83
+ 'mcp_id': mcp_id,
84
+ 'api_url': api_url,
85
+ 'api_name': api_name,
86
+ 'created_at': datetime.now().isoformat(),
87
+ 'last_used': datetime.now().isoformat(),
88
+ 'metadata': metadata or {}
89
+ }
90
+
91
+ self.registry[url_hash] = mcp_info
92
+ self._save()
93
+
94
+ return mcp_info
95
+
96
+ def update_last_used(self, api_url: str):
97
+ """Update the last_used timestamp for an MCP
98
+
99
+ Args:
100
+ api_url: The API URL
101
+ """
102
+ url_hash = self._url_hash(api_url)
103
+
104
+ if url_hash in self.registry:
105
+ self.registry[url_hash]['last_used'] = datetime.now().isoformat()
106
+ self._save()
107
+
108
+ def list_all(self) -> List[dict]:
109
+ """List all registered MCPs
110
+
111
+ Returns:
112
+ List of MCP info dicts
113
+ """
114
+ # Filter out MCPs whose directories no longer exist
115
+ valid_mcps = []
116
+
117
+ for url_hash, mcp_info in list(self.registry.items()):
118
+ mcp_path = HOSTED_MCPS_DIR / mcp_info['mcp_id']
119
+ if mcp_path.exists():
120
+ valid_mcps.append(mcp_info)
121
+ else:
122
+ # Clean up stale entry
123
+ del self.registry[url_hash]
124
+
125
+ if len(valid_mcps) != len(self.registry):
126
+ self._save()
127
+
128
+ # Sort by last_used, most recent first
129
+ valid_mcps.sort(key=lambda x: x['last_used'], reverse=True)
130
+
131
+ return valid_mcps
132
+
133
+ def delete(self, api_url: str) -> bool:
134
+ """Delete an MCP from registry
135
+
136
+ Args:
137
+ api_url: The API URL
138
+
139
+ Returns:
140
+ True if deleted, False if not found
141
+ """
142
+ url_hash = self._url_hash(api_url)
143
+
144
+ if url_hash in self.registry:
145
+ del self.registry[url_hash]
146
+ self._save()
147
+ return True
148
+
149
+ return False
150
+
151
+ def get_stats(self) -> dict:
152
+ """Get registry statistics
153
+
154
+ Returns:
155
+ Stats dict
156
+ """
157
+ mcps = self.list_all()
158
+
159
+ return {
160
+ 'total_mcps': len(mcps),
161
+ 'oldest': mcps[-1]['created_at'] if mcps else None,
162
+ 'newest': mcps[0]['created_at'] if mcps else None,
163
+ }
164
+
165
+
166
+ # Global registry instance
167
+ mcp_registry = MCPRegistry()
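A sketch of a registry round trip: entries are keyed by an MD5 hash of the lower-cased URL with trailing slashes stripped, and `find_by_url()` only returns entries whose directory still exists:

```python
# Sketch of register/find_by_url; the MCP ID is hypothetical.
from src.config import HOSTED_MCPS_DIR
from src.mcp_registry import mcp_registry

mcp_id = "jsonplaceholder-deadbeef"
(HOSTED_MCPS_DIR / mcp_id).mkdir(parents=True, exist_ok=True)  # find_by_url() requires the directory to exist

mcp_registry.register(
    api_url="https://jsonplaceholder.typicode.com",
    mcp_id=mcp_id,
    api_name="jsonplaceholder",
    metadata={"auth_type": "none", "endpoints_count": 3},
)

# The same URL with a trailing slash hashes to the same key, so the cache hits:
hit = mcp_registry.find_by_url("https://jsonplaceholder.typicode.com/")
print(hit["mcp_id"] if hit else "not found")
```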
src/templates/mcp_server_template.py.jinja ADDED
@@ -0,0 +1,82 @@
1
+ """
2
+ Generated MCP Server for {{ api_name }}
3
+ Auto-generated by MCP Generator
4
+
5
+ API URL: {{ api_url }}
6
+ Generated: {{ timestamp }}
7
+ """
8
+
9
+ import asyncio
10
+ import json
11
+ from typing import Any
12
+ from mcp.server import Server
13
+ from mcp.server.stdio import stdio_server
14
+ from mcp.types import Tool, TextContent
15
+ import httpx
16
+
17
+ # Initialize MCP server
18
+ app = Server("{{ server_name }}")
19
+
20
+ @app.list_tools()
21
+ async def list_tools() -> list[Tool]:
22
+ """List available tools"""
23
+ return [
24
+ {% for tool in tools %}
25
+ Tool(
26
+ name="{{ tool.name }}",
27
+ description="{{ tool.description }}",
28
+ inputSchema={
29
+ "type": "object",
30
+ "properties": {{ tool.parameters | tojson }},
31
+ "required": {{ tool.required | tojson }}
32
+ }
33
+ ){% if not loop.last %},{% endif %}
34
+ {% endfor %}
35
+ ]
36
+
37
+ @app.call_tool()
38
+ async def call_tool(name: str, arguments: Any) -> list[TextContent]:
39
+ """Handle tool calls"""
40
+ {% for tool in tools %}
41
+ {% if loop.first %}if{% else %}elif{% endif %} name == "{{ tool.name }}":
42
+ return await {{ tool.function_name }}(arguments)
43
+ {% endfor %}
44
+ else:
45
+ raise ValueError(f"Unknown tool: {name}")
46
+
47
+ {% for tool in tools %}
48
+ async def {{ tool.function_name }}(args: dict) -> list[TextContent]:
49
+ """{{ tool.description }}"""
50
+ async with httpx.AsyncClient() as client:
51
+ {% if tool.method == "GET" %}
52
+ response = await client.get(
53
+ "{{ tool.endpoint }}",
54
+ params=args,
55
+ headers={{ tool.headers | tojson }}
56
+ )
57
+ {% elif tool.method == "POST" %}
58
+ response = await client.post(
59
+ "{{ tool.endpoint }}",
60
+ json=args,
61
+ headers={{ tool.headers | tojson }}
62
+ )
63
+ {% endif %}
64
+
65
+ return [TextContent(
66
+ type="text",
67
+ text=json.dumps(response.json(), indent=2)
68
+ )]
69
+
70
+ {% endfor %}
71
+
72
+ async def main():
73
+ """Run the MCP server"""
74
+ async with stdio_server() as (read_stream, write_stream):
75
+ await app.run(
76
+ read_stream,
77
+ write_stream,
78
+ app.create_initialization_options()
79
+ )
80
+
81
+ if __name__ == "__main__":
82
+ asyncio.run(main())
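For orientation, this is roughly what the template above renders for a single hypothetical GET tool named `get_posts` (whitespace tidied for readability):

```python
# Illustrative only: an approximate rendering of the template for one made-up GET tool.
import asyncio
import json
from typing import Any
from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import Tool, TextContent
import httpx

app = Server("jsonplaceholder-deadbeef")

@app.list_tools()
async def list_tools() -> list[Tool]:
    return [
        Tool(
            name="get_posts",
            description="List posts from the API",
            inputSchema={
                "type": "object",
                "properties": {"userId": {"type": "integer", "description": "Filter posts by user ID"}},
                "required": [],
            },
        )
    ]

@app.call_tool()
async def call_tool(name: str, arguments: Any) -> list[TextContent]:
    if name == "get_posts":
        return await get_posts(arguments)
    raise ValueError(f"Unknown tool: {name}")

async def get_posts(args: dict) -> list[TextContent]:
    """List posts from the API"""
    async with httpx.AsyncClient() as client:
        response = await client.get("https://jsonplaceholder.typicode.com/posts", params=args, headers={})
    return [TextContent(type="text", text=json.dumps(response.json(), indent=2))]

async def main():
    async with stdio_server() as (read_stream, write_stream):
        await app.run(read_stream, write_stream, app.create_initialization_options())

if __name__ == "__main__":
    asyncio.run(main())
```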
src/templates/readme_template.md.jinja ADDED
@@ -0,0 +1,70 @@
1
+ # {{ api_name }} MCP Server
2
+
3
+ Auto-generated MCP server for {{ api_name }} API.
4
+
5
+ **Generated:** {{ timestamp }}
6
+ **API URL:** {{ api_url }}
7
+
8
+ ## 🚀 Quick Start
9
+
10
+ ### Option 1: Use Hosted Version (Easiest!)
11
+
12
+ This MCP is already running! Just add to your Claude Desktop config:
13
+
14
+ ```json
15
+ {
16
+ "mcpServers": {
17
+ "{{ server_name }}": {
18
+ "url": "{{ hosted_url }}"
19
+ }
20
+ }
21
+ }
22
+ ```
23
+
24
+ ### Option 2: Run Locally
25
+
26
+ ```bash
27
+ # Install dependencies
28
+ pip install -r requirements.txt
29
+
30
+ # Run the server
31
+ python server.py
32
+ ```
33
+
34
+ ## 🛠️ Available Tools
35
+
36
+ {% for tool in tools %}
37
+ ### `{{ tool.name }}`
38
+
39
+ {{ tool.description }}
40
+
41
+ **Parameters:**
42
+ {% for param, schema in tool.parameters.items() %}
43
+ - `{{ param }}` ({{ schema.type }}): {{ schema.description }}
44
+ {% endfor %}
45
+
46
+ **Example:**
47
+ ```json
48
+ {
49
+ {% for param in tool.required %}
50
+ "{{ param }}": "example_value"{% if not loop.last %},{% endif %}
51
+ {% endfor %}
52
+ }
53
+ ```
54
+
55
+ ---
56
+ {% endfor %}
57
+
58
+ ## 📝 API Information
59
+
60
+ - **Base URL:** {{ api_url }}
61
+ - **Authentication:** {{ auth_type }}
62
+ - **Rate Limits:** {{ rate_limits }}
63
+
64
+ ## 🤖 Generated by
65
+
66
+ [MCP Generator](https://huggingface.co/spaces/MCP-1st-Birthday/mcp-generator) - Turn any API into an MCP server in seconds!
67
+
68
+ ---
69
+
70
+ **Note:** This is an auto-generated MCP server. You may need to customize authentication and error handling for production use.