Secure Proxy Configuration
Using a server-side proxy to keep your API key secure
Security Best Practice
This example routes every chat request through a server-side proxy that attaches the API key, so the key never reaches the client browser.
Security Checklist
- Never commit API keys to version control
- Use environment variables for sensitive data
- Implement rate limiting on your proxy
- Add authentication if needed
- Use HTTPS in production
- Restrict CORS to your domain only
- Monitor API usage and costs
- Log requests for debugging and security audits
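Rate limiting is the checklist item most often skipped. A minimal in-memory fixed-window limiter might look like the sketch below (`createRateLimiter` is a hypothetical helper, not part of the bundled proxies; production setups often use a library such as express-rate-limit instead):

```javascript
// Minimal in-memory fixed-window rate limiter (hypothetical helper).
// Tracks request counts per client IP and resets each window.
function createRateLimiter(maxRequests, windowMs) {
  const hits = new Map(); // ip -> { count, windowStart }
  return function allow(ip, now = Date.now()) {
    const entry = hits.get(ip);
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(ip, { count: 1, windowStart: now }); // new window
      return true;
    }
    entry.count += 1;
    return entry.count <= maxRequests;
  };
}

// Example: allow at most 3 requests per minute per IP
const allow = createRateLimiter(3, 60_000);
console.log(allow('1.2.3.4')); // true
console.log(allow('1.2.3.4')); // true
console.log(allow('1.2.3.4')); // true
console.log(allow('1.2.3.4')); // false -- over the limit
```

In your proxy handler, call `allow(clientIp)` before forwarding a request and return HTTP 429 when it reports false.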
🔐 Secure Chatbot
✅ API key stored on server
✅ Secure communication
✅ Production-ready
📚 Setup Instructions
1. Choose Your Proxy
- Node.js: proxy-nodejs.js
- PHP: proxy-php.php
- Cloudflare: proxy-cloudflare-worker.js
2. Set API Key
- Use environment variables
- Never commit keys to git
- Keep keys secure on server
3. Configure Chatbot
- Point to your proxy URL
- Remove API key from HTML
- Test the connection
Node.js/Express Proxy
1. Install Dependencies:
npm install express node-fetch
2. Set Environment Variable:
# Windows (cmd)
set OPENROUTER_API_KEY=your-api-key-here
# Windows (PowerShell)
$env:OPENROUTER_API_KEY="your-api-key-here"
# Linux/macOS
export OPENROUTER_API_KEY=your-api-key-here
3. Run Server:
node proxy-nodejs.js
4. Configure Chatbot:
<div data-swc
data-swc-api-mode="ai-only"
data-swc-api-base-url="http://localhost:3000/api/chat"
data-swc-api-model="openai/gpt-3.5-turbo">
</div>
PHP Proxy
1. Upload proxy-php.php to your server
2. Set API Key in .env/environment variable (preferred) or in code:
// In proxy-php.php -- prefer reading the key from the environment
define('OPENROUTER_API_KEY', getenv('OPENROUTER_API_KEY'));
// Hard-coding works for a quick test, but never commit the key:
// define('OPENROUTER_API_KEY', 'your-api-key-here');
3. Configure Chatbot:
<div data-swc
data-swc-api-mode="ai-only"
data-swc-api-base-url="https://yourdomain.com/api/chat.php"
data-swc-api-model="openai/gpt-3.5-turbo">
</div>
Cloudflare Workers Proxy
1. Install Wrangler CLI:
npm install -g wrangler
2. Set API Key Secret:
wrangler secret put OPENROUTER_API_KEY
3. Deploy Worker:
wrangler deploy
4. Configure Chatbot:
<div data-swc
data-swc-api-mode="ai-only"
data-swc-api-base-url="https://your-worker.workers.dev/chat"
data-swc-api-model="openai/gpt-3.5-turbo">
</div>
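The bundled proxy-cloudflare-worker.js may differ, but a minimal Worker handling the /chat route could be sketched as follows (written as a plain object here so the sketch is self-contained; in the deployed module it is the default export):

```javascript
// Hypothetical Worker sketch. The key arrives on `env` from the secret set
// with `wrangler secret put OPENROUTER_API_KEY` -- it never appears in the
// deployed bundle or in client-visible code.
const worker = {
  async fetch(request, env) {
    const url = new URL(request.url);
    if (url.pathname !== '/chat') {
      return new Response('Not found', { status: 404 });
    }
    if (request.method !== 'POST') {
      return new Response('Method not allowed', { status: 405 });
    }
    // Forward the chat request to OpenRouter with the secret attached
    return fetch('https://openrouter.ai/api/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${env.OPENROUTER_API_KEY}`,
      },
      body: await request.text(),
    });
  },
};

// In the deployed Worker module this object is the default export:
// export default worker;
```

Because Workers run at the edge, this variant needs no server of your own; the chatbot simply points at the workers.dev URL shown above.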
Testing Your Proxy
1. Test Health Endpoint:
curl http://localhost:3000/api/health
Expected: {"status":"ok","timestamp":"..."}
2. Test Chat Endpoint:
curl -X POST http://localhost:3000/api/chat \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Hello!"}],
"stream": false
}'
3. Check Browser Console:
Open DevTools (F12) and verify that your API key does not appear in any request headers or payloads in the Network tab