# Hugging Face Spaces Deployment Guide
This guide walks you through deploying the LangGraph Multi-Agent MCTS demo to Hugging Face Spaces.
## Prerequisites
- [Hugging Face Account](https://huggingface.co/join)
- Git installed locally
- Python 3.10+ (for local testing)
## Step 1: Create a New Space
1. Go to [Hugging Face Spaces](https://huggingface.co/spaces)
2. Click **"Create new Space"**
3. Fill in the form:
- **Owner**: Your username or organization
- **Space name**: `langgraph-mcts-demo` (or your choice)
- **License**: MIT
- **SDK**: Gradio
- **Hardware**: CPU Basic (Free tier - sufficient for demo)
- **Visibility**: Public (or Private)
4. Click **"Create Space"**
## Step 2: Clone and Deploy
### Option A: Git-based Deployment (Recommended)
```bash
# 1. Clone your new empty Space
git clone https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo
cd langgraph-mcts-demo
# 2. Copy demo files from this directory
cp -r /path/to/huggingface_space/* .
cp /path/to/huggingface_space/.gitignore .  # dotfiles are not matched by *
# 3. Verify structure
ls -la
# Should show:
# - app.py
# - requirements.txt
# - README.md
# - .gitignore
# - demo_src/
# - __init__.py
# - agents_demo.py
# - llm_mock.py
# - mcts_demo.py
# - wandb_tracker.py
# 4. Commit and push
git add -A
git commit -m "Initial deployment of LangGraph Multi-Agent MCTS demo"
git push
# 5. Space will automatically build and deploy (takes 2-5 minutes)
```
### Option B: Direct Upload via Web UI
1. Navigate to your Space on Hugging Face
2. Click **"Files"** tab
3. Click **"Add file"** → **"Upload files"**
4. Upload all files maintaining the directory structure:
- `app.py`
- `requirements.txt`
- `README.md`
- `.gitignore`
- `demo_src/__init__.py`
- `demo_src/agents_demo.py`
- `demo_src/llm_mock.py`
- `demo_src/mcts_demo.py`
- `demo_src/wandb_tracker.py`
5. Commit changes
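### Option C: Programmatic Upload via `huggingface_hub`

As a third option, the `huggingface_hub` library can push the whole folder in one commit. A minimal sketch, assuming `huggingface_hub` is installed and you replace the placeholder folder path and repo id:

```python
def upload_demo(local_dir: str = "huggingface_space",
                repo_id: str = "YOUR_USERNAME/langgraph-mcts-demo") -> None:
    """Push all demo files to the Space in a single commit."""
    from huggingface_hub import HfApi  # pip install huggingface_hub

    api = HfApi()  # picks up HF_TOKEN from the environment or a cached CLI login
    api.upload_folder(
        folder_path=local_dir,
        repo_id=repo_id,
        repo_type="space",
        commit_message="Deploy LangGraph MCTS demo",
    )
```

Calling `upload_demo()` triggers the same automatic rebuild as a `git push`.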
## Step 3: Monitor Deployment
1. Go to your Space URL: `https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo`
2. Click **"Logs"** tab to monitor build progress
3. Wait for "Running on" message
4. Your demo is now live!
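If you prefer polling from a script over watching the Logs tab, `huggingface_hub` exposes the Space's runtime state. A sketch (the repo id is a placeholder):

```python
def space_stage(repo_id: str = "YOUR_USERNAME/langgraph-mcts-demo") -> str:
    """Return the Space's current build stage, e.g. "BUILDING" or "RUNNING"."""
    from huggingface_hub import HfApi  # pip install huggingface_hub

    return HfApi().get_space_runtime(repo_id).stage
```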
## Step 4: Test the Demo
1. Enter a query or select an example
2. Enable/disable different agents
3. Adjust MCTS parameters
4. Click "Process Query"
5. Review results and consensus scores
## Optional: Enable Real LLM Responses
To use Hugging Face Inference API instead of mock responses:
### 1. Update requirements.txt
```txt
gradio>=4.0.0,<5.0.0
numpy>=1.24.0,<2.0.0
huggingface_hub>=0.20.0
```
### 2. Add Secret Token
1. Go to Space Settings → **Repository secrets**
2. Add new secret:
- Name: `HF_TOKEN`
- Value: Your Hugging Face token (from [Settings → Access Tokens](https://huggingface.co/settings/tokens))
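At runtime, Spaces injects each repository secret as an environment variable of the same name, so the app can detect the token and fall back to mock responses when it is absent. A minimal sketch:

```python
import os

# Repository secrets are injected as environment variables on Spaces.
hf_token = os.environ.get("HF_TOKEN")
use_inference = hf_token is not None  # fall back to the mock LLM without a token

print(f"Inference API enabled: {use_inference}")
```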
### 3. Update app.py Initialization
Change line ~290 in `app.py`:
```python
# From:
framework = MultiAgentFrameworkDemo(use_hf_inference=False)
# To:
framework = MultiAgentFrameworkDemo(
    use_hf_inference=True,
    hf_model="mistralai/Mistral-7B-Instruct-v0.2",
)
```
### 4. Commit and Push
```bash
git add -A
git commit -m "Enable Hugging Face Inference API"
git push
```
## Optional: Enable Weights & Biases Tracking
Track experiments and visualize metrics with W&B integration.
### 1. Get W&B API Key
1. Sign up at [wandb.ai](https://wandb.ai)
2. Go to Settings → API Keys
3. Copy your API key
### 2. Add W&B Secret to Space
1. Go to Space Settings → **Repository secrets**
2. Add new secret:
- Name: `WANDB_API_KEY`
- Value: Your W&B API key
### 3. Use W&B in the Demo
1. Expand "Weights & Biases Tracking" accordion in the UI
2. Check "Enable W&B Tracking"
3. Optionally set:
- **Project Name**: Your W&B project (default: `langgraph-mcts-demo`)
- **Run Name**: Custom name for this run (auto-generated if empty)
4. Process your query
5. View the W&B run URL in the results
### 4. What Gets Logged
- **Agent Metrics**: Confidence scores, execution times, response lengths
- **MCTS Metrics**: Best value, visits, tree depth, exploration paths
- **Consensus Metrics**: Agreement scores, agent combinations
- **Performance**: Total processing time
- **Artifacts**: Full JSON results as artifacts
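Under the hood, the tracker presumably wraps the standard `wandb.init` / `wandb.log` flow. A hedged sketch of what logging one run could look like — the metric keys here are illustrative, not the demo's actual schema:

```python
def log_run(results: dict, project: str = "langgraph-mcts-demo") -> str:
    """Log one demo run to W&B and return the run URL (illustrative keys)."""
    import wandb  # optional dependency; only imported when tracking is on

    run = wandb.init(project=project)
    run.log({
        "consensus/agreement": results.get("consensus_score", 0.0),
        "mcts/best_value": results.get("mcts_best_value", 0.0),
        "perf/total_time_s": results.get("total_time_s", 0.0),
    })
    url = run.url
    run.finish()
    return url
```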
### 5. View Your Dashboard
After runs, visit your W&B project dashboard to:
- Compare different agent configurations
- Visualize consensus patterns
- Analyze MCTS exploration strategies
- Track performance over time
## Customization Options
### Change Gradio Theme
In `app.py`, modify:
```python
with gr.Blocks(
    theme=gr.themes.Soft(),  # Try: Default(), Monochrome(), Glass()
    ...
) as demo:
```
### Add Custom Examples
Update `EXAMPLE_QUERIES` list in `app.py`:
```python
EXAMPLE_QUERIES = [
    "Your custom query 1",
    "Your custom query 2",
    ...
]
```
### Adjust MCTS Parameters
Modify sliders in `app.py`:
```python
mcts_iterations = gr.Slider(
    minimum=10,
    maximum=200,  # Increase for more thorough search
    value=50,     # Change default
    ...
)
```
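For intuition on why more iterations give a more thorough search: MCTS variants typically select children by a UCT-style score, and rarely visited branches need extra samples before their value estimates can be trusted. A generic sketch — the demo's exact selection rule may differ (see `demo_src/mcts_demo.py`):

```python
import math

def uct_score(value_sum: float, visits: int, parent_visits: int,
              c: float = 1.414) -> float:
    """UCB1 selection score: exploitation term plus exploration bonus."""
    if visits == 0:
        return float("inf")  # unvisited children are always tried first
    return value_sum / visits + c * math.sqrt(math.log(parent_visits) / visits)
```

The bonus term shrinks as `visits` grows, so extra iterations gradually shift effort from exploration to exploiting the best-known branch.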
### Add More Agent Types
1. Create new agent in `demo_src/agents_demo.py`
2. Add to `MultiAgentFrameworkDemo` in `app.py`
3. Add UI controls in Gradio interface
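The base-class details live in `demo_src/agents_demo.py`; purely as an illustration (the names and interface below are hypothetical, not the demo's real API), a new agent might look like:

```python
from dataclasses import dataclass

@dataclass
class AgentResponse:
    answer: str
    confidence: float  # in [0, 1], used for consensus weighting

class PlannerAgent:
    """Hypothetical domain agent that splits a query into sub-questions."""
    name = "planner"

    def run(self, query: str) -> AgentResponse:
        steps = [s.strip() for s in query.split("?") if s.strip()]
        confidence = min(1.0, 0.5 + 0.1 * len(steps))
        return AgentResponse(f"Plan with {len(steps)} step(s)", confidence)
```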
## Troubleshooting
### Build Fails
- Check **Logs** tab for error details
- Verify `requirements.txt` has compatible versions
- Ensure all imports in `app.py` are satisfied
### Slow Performance
- Reduce default MCTS iterations
- Use mock LLM (no API calls)
- Simplify tree visualization
### Memory Issues (Free Tier)
- Limit max MCTS iterations to 100
- Reduce tree depth in `demo_src/mcts_demo.py`
- Simplify response generation
### Missing Files
Ensure directory structure:
```
your-space/
├── app.py
├── requirements.txt
├── README.md
├── .gitignore
└── demo_src/
    ├── __init__.py
    ├── agents_demo.py
    ├── llm_mock.py
    ├── mcts_demo.py
    └── wandb_tracker.py
```
## Upgrading Hardware
For better performance:
1. Go to Space Settings
2. Under **Hardware**, select:
- **CPU Upgrade** ($0.03/hr) - Faster processing
- **T4 Small** ($0.60/hr) - GPU for neural models
3. Save changes
## Sharing Your Space
### Embed in Website
```html
<iframe
  src="https://YOUR_USERNAME-langgraph-mcts-demo.hf.space"
  frameborder="0"
  width="100%"
  height="600"
></iframe>
```
### Direct Link
Share: `https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo`
### API Access
Gradio automatically provides an API endpoint:
```
https://YOUR_USERNAME-langgraph-mcts-demo.hf.space/api/predict
```
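The raw route and payload shape vary by Gradio version, so the `gradio_client` package is the safer way to call it programmatically. A sketch — the `api_name` and argument list are assumptions; run `Client(...).view_api()` to see the real signature:

```python
def call_demo(query: str, space: str = "YOUR_USERNAME/langgraph-mcts-demo"):
    """Call the Space's Gradio API programmatically."""
    from gradio_client import Client  # pip install gradio_client

    client = Client(space)
    return client.predict(query, api_name="/predict")  # api_name is a guess
```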
## Next Steps
1. **Collect Feedback**: Enable flagging for user feedback
2. **Add Analytics**: Track usage patterns
3. **Extend Agents**: Add domain-specific reasoning modules
4. **Integrate RAG**: Connect to vector databases for real context
5. **Add Visualization**: Enhanced tree and consensus displays
## Support
- **Hugging Face Docs**: https://huggingface.co/docs/hub/spaces
- **Gradio Docs**: https://www.gradio.app/docs
- **Full Framework**: https://github.com/ianshank/langgraph_multi_agent_mcts
---
**Happy Deploying!**