# Factory Droid CLI Setup Guide

Configure Factory's Droid CLI to use your AI subscriptions through KorProxy.

## Overview

Factory's Droid is a powerful AI coding assistant that enables end-to-end feature development directly from your terminal. It integrates with your codebase, engineering systems, and workflows.

Using KorProxy with Droid's BYOK (Bring Your Own Key) feature, you can route AI requests through your authenticated OAuth accounts instead of using direct API keys, maximizing the value of subscriptions you already pay for.
## Prerequisites

- KorProxy app installed and running on port 1337 (a quick check is sketched below this list)
- Factory Droid CLI installed (`npm install -g @anthropic/factory` or via the Factory website)
- At least one provider authenticated in KorProxy (Gemini, Claude, or Codex)
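
Before touching any configuration, you can confirm the proxy is reachable. This is a minimal sketch, assuming KorProxy answers on the OpenAI-compatible `/v1/models` path implied by the `/v1` base URL used later in this guide; any endpoint that responds confirms the proxy is up.

```bash
# Check that something is answering on port 1337.
# NOTE: /v1/models is an assumption based on the OpenAI-compatible /v1
# base URL used in the config below; adjust if KorProxy documents a
# different health or models endpoint.
curl -s -o /dev/null -w "HTTP %{http_code}\n" http://localhost:1337/v1/models \
  || echo "KorProxy does not appear to be running on port 1337"
```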
## Configuration

### Create or edit config.json

Edit `~/.factory/config.json` to add custom models pointing to KorProxy.
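
If you have never defined custom models before, the file may not exist yet. An optional shell sketch for creating the directory and backing up any existing config before editing:

```bash
# Create the config directory if needed and back up any existing config.
mkdir -p ~/.factory
[ -f ~/.factory/config.json ] && cp ~/.factory/config.json ~/.factory/config.json.bak
"${EDITOR:-nano}" ~/.factory/config.json
```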
### For Claude Models (Anthropic)

```json
{
  "custom_models": [
    {
      "model_display_name": "Claude Opus 4.5 Thinking High [KorProxy]",
      "model": "claude-opus-4-5-thinking-high",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 16384
    },
    {
      "model_display_name": "Claude Sonnet 4.5 Thinking [KorProxy]",
      "model": "claude-sonnet-4-5-thinking",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 16384
    },
    {
      "model_display_name": "Claude Haiku 4.5 [KorProxy]",
      "model": "claude-haiku-4-5-20251001",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 8192
    }
  ]
}
```

### For OpenAI/Codex Models

```json
{
  "custom_models": [
    {
      "model_display_name": "GPT 5.1 Codex Max XHigh [KorProxy]",
      "model": "gpt-5.1-codex-max-xhigh",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "openai",
      "max_tokens": 128000
    },
    {
      "model_display_name": "GPT 5.1 Codex High [KorProxy]",
      "model": "gpt-5.1-codex-high",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "openai",
      "max_tokens": 128000
    },
    {
      "model_display_name": "GPT 5.1 Codex [KorProxy]",
      "model": "gpt-5.1-codex",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "openai",
      "max_tokens": 128000
    }
  ]
}
```

### For Gemini Models

```json
{
  "custom_models": [
    {
      "model_display_name": "Gemini 3 Pro Image [KorProxy]",
      "model": "gemini-3-pro-image-preview",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "generic-chat-completion-api",
      "max_tokens": 8192
    },
    {
      "model_display_name": "Gemini 3 Pro [KorProxy]",
      "model": "gemini-3-pro-preview",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "generic-chat-completion-api",
      "max_tokens": 8192
    },
    {
      "model_display_name": "Gemini 2.5 Pro [KorProxy]",
      "model": "gemini-2.5-pro",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "generic-chat-completion-api",
      "max_tokens": 8192
    }
  ]
}
```

### Complete Example (All Providers)

```json
{
  "custom_models": [
    {
      "model_display_name": "Claude Opus 4.5 Thinking High [KorProxy]",
      "model": "claude-opus-4-5-thinking-high",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 16384
    },
    {
      "model_display_name": "Claude Sonnet 4.5 Thinking [KorProxy]",
      "model": "claude-sonnet-4-5-thinking",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 16384
    },
    {
      "model_display_name": "Claude Haiku 4.5 [KorProxy]",
      "model": "claude-haiku-4-5-20251001",
      "base_url": "http://localhost:1337",
      "api_key": "not-needed",
      "provider": "anthropic",
      "max_tokens": 8192
    },
    {
      "model_display_name": "GPT 5.1 Codex Max XHigh [KorProxy]",
      "model": "gpt-5.1-codex-max-xhigh",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "openai",
      "max_tokens": 128000
    },
    {
      "model_display_name": "GPT 5.1 Codex High [KorProxy]",
      "model": "gpt-5.1-codex-high",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "openai",
      "max_tokens": 128000
    },
    {
      "model_display_name": "Gemini 3 Pro Image [KorProxy]",
      "model": "gemini-3-pro-image-preview",
      "base_url": "http://localhost:1337/v1",
      "api_key": "not-needed",
      "provider": "generic-chat-completion-api",
      "max_tokens": 8192
    }
  ]
}
```

## Using Custom Models
After saving your configuration, your KorProxy models will appear in Droid:

- Start Droid CLI: `droid`
- Use the `/model` command
- Look for your models in the "Custom models" section
- Select a KorProxy model to start using your OAuth subscription
## How It Works

- **No API keys needed**: KorProxy handles authentication using your OAuth sessions, so you can set `api_key` to any non-empty value.
- **Local processing**: All requests are routed through `localhost:1337`, keeping your data private and using your subscription quotas (a request sketch follows this list).
- **Multi-account support**: KorProxy load-balances across multiple authenticated accounts to avoid rate limits.
- **Cost savings**: Use the AI models included in your ChatGPT Plus, Claude Pro, or Google subscriptions instead of paying for API credits.
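
Because everything flows through the local proxy, you can also exercise it outside of Droid. The sketch below assumes KorProxy serves the standard OpenAI-compatible `/v1/chat/completions` route behind the `/v1` base URL used above (check KorProxy's docs for the exact path); the model name is taken from the config and the bearer token only needs to be non-empty.

```bash
# Send a test request directly to KorProxy's OpenAI-compatible endpoint.
# ASSUMPTION: /v1/chat/completions is the standard path behind the /v1
# base URL shown in the config above; KorProxy injects the real OAuth
# credentials, so the Authorization value is a placeholder.
curl -s http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer not-needed" \
  -d '{
        "model": "gemini-2.5-pro",
        "messages": [{"role": "user", "content": "Reply with OK if you can hear me."}]
      }'
```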
## Provider Settings

| Provider | base_url | provider type |
|---|---|---|
| Claude (Anthropic) | `http://localhost:1337` | `anthropic` |
| OpenAI / Codex | `http://localhost:1337/v1` | `openai` |
| Gemini / Other | `http://localhost:1337/v1` | `generic-chat-completion-api` |
## Troubleshooting

### Model not appearing in the /model selector

Check the JSON syntax in `~/.factory/config.json` and restart Droid.
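
To rule out a syntax error quickly, validate the file with a tool you already have (the second option assumes `jq` is installed):

```bash
# Both commands exit non-zero if config.json is not valid JSON.
python3 -m json.tool ~/.factory/config.json > /dev/null && echo "config.json is valid JSON"
# or, with jq:
jq empty ~/.factory/config.json && echo "config.json is valid JSON"
```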
"Connection refused" error
Ensure KorProxy is running on port 1337
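
One way to confirm that something is actually listening on that port (assumes `lsof` is available, as it is on most macOS and Linux systems):

```bash
# List any process listening on TCP port 1337; no output means nothing is listening.
lsof -nP -iTCP:1337 -sTCP:LISTEN
```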
"401 Unauthorized" or auth errors
Re-authenticate the provider in KorProxy (Providers tab)
"Invalid provider" error
Provider must be exactly: anthropic, openai, or generic-chat-completion-api
Check cost and usage
Use the /cost command in Droid to view cost breakdowns:
/cost