Emacs Setup Guide
Configure Emacs AI packages to use your own AI subscriptions through KorProxy
1. Overview
Emacs has several excellent packages for AI-assisted coding and writing, including gptel, chatgpt-shell, and org-ai.
KorProxy provides an OpenAI-compatible endpoint at localhost:1337/v1 that these packages can use with minimal configuration.
This guide covers Elisp configuration for the most popular Emacs AI packages.
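To confirm the endpoint is reachable before configuring any packages, you can query it straight from Emacs. A minimal sketch, assuming KorProxy exposes the standard OpenAI-compatible /v1/models listing:

(require 'url)
;; Fetch the model list from KorProxy and echo the raw JSON response.
;; Evaluate with M-x eval-buffer, or C-x C-e after the closing paren.
(with-current-buffer (url-retrieve-synchronously "http://localhost:1337/v1/models" t)
  (goto-char (point-min))
  (when (search-forward "\n\n" nil t) ; skip the HTTP response headers
    (message "KorProxy responded: %s" (buffer-substring (point) (point-max)))))

If this prints a JSON list of models, the proxy is up and the package configurations below should work unchanged.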
2. Prerequisites
- KorProxy app installed and running
- Emacs 28+ installed (29+ recommended)
- Package manager configured (use-package, straight.el, etc.)
- At least one provider authenticated in KorProxy
3. Package Configurations
gptel
A versatile LLM client for Emacs with streaming support. Add to your config:
(use-package gptel
  :config
  ;; Define KorProxy as a custom backend
  (setq gptel-backend
        (gptel-make-openai "KorProxy"
          :host "localhost:1337"
          :protocol "http"
          :key "korproxy"
          :models '("claude-sonnet-4-5-20250929"
                    "gpt-5.1-codex"
                    "gemini-2.5-pro")))
  ;; Set default model
  (setq gptel-model "claude-sonnet-4-5-20250929"))
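With the backend defined, M-x gptel opens a dedicated chat buffer and gptel-send submits the region or buffer up to point. Optional keybindings, with key choices that are only an example (not gptel defaults):

;; Example global bindings for the two main gptel entry points.
(global-set-key (kbd "C-c g") #'gptel)        ; open a chat buffer
(global-set-key (kbd "C-c RET") #'gptel-send) ; send region/buffer to the model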
chatgpt-shell
Interactive shell for ChatGPT conversations:
(use-package chatgpt-shell
  :custom
  (chatgpt-shell-api-url-base "http://localhost:1337")
  (chatgpt-shell-openai-key "korproxy")
  (chatgpt-shell-model-version "gpt-5.1-codex"))
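Start a session with M-x chatgpt-shell. If you prefer not to keep even a placeholder key as a literal string, chatgpt-shell's documentation notes that the key may also be a function; a sketch:

;; Supply the (arbitrary) KorProxy key via a function instead of a literal string.
(setq chatgpt-shell-openai-key (lambda () "korproxy"))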
org-ai
AI integration for Org-mode documents:
(use-package org-ai
  :after org
  :custom
  (org-ai-openai-api-token "korproxy")
  :config
  ;; Override the API URL
  (setq org-ai-openai-chat-endpoint
        "http://localhost:1337/v1/chat/completions")
  (setq org-ai-default-chat-model "claude-sonnet-4-5-20250929"))
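Inside Org documents, prompts go in #+begin_ai ... #+end_ai special blocks. Following org-ai's own setup instructions, you can also enable its minor mode in Org buffers alongside the block above:

;; Enable org-ai's keybindings and #+begin_ai block handling in Org buffers.
(add-hook 'org-mode-hook #'org-ai-mode)
;; Optionally expose org-ai commands outside Org buffers as well.
(org-ai-global-mode)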
ellama
An LLM tool with chat and code assistance. Configure it to use the OpenAI-compatible API:
(use-package ellama
  :init
  (require 'llm-openai)
  (setopt ellama-provider
          (make-llm-openai-compatible
           :url "http://localhost:1337/v1"
           :key "korproxy"
           :chat-model "claude-sonnet-4-5-20250929")))
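Once the provider is set, the usual entry points are ellama-chat for an interactive session and ellama-code-review for the selected region. Example bindings, with arbitrary key choices:

;; Example bindings for common ellama commands.
(global-set-key (kbd "C-c l c") #'ellama-chat)
(global-set-key (kbd "C-c l r") #'ellama-code-review)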
Alternative: Environment Variables
Some packages respect standard environment variables. Add to your shell config:
# ~/.zshrc or ~/.bashrc
export OPENAI_API_BASE="http://localhost:1337/v1"
export OPENAI_API_KEY="korproxy"
# Or set in Emacs early-init.el
(setenv "OPENAI_API_BASE" "http://localhost:1337/v1")
(setenv "OPENAI_API_KEY" "korproxy")4Selecting Models
4. Selecting Models
Available Models
Use any supported model name—KorProxy routes to the appropriate provider. See the full model list.
- claude-sonnet-4-5-20250929 (Anthropic, balanced)
- gpt-5.1-codex (OpenAI, standard)
- gemini-2.5-pro (Google, flagship)
- claude-opus-4-5-20251101 (Anthropic, premium)
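Model names are passed through to KorProxy as-is, so switching models is just a matter of changing the configured value. For example, to use the premium Claude model in a single buffer with the gptel backend defined earlier:

;; Override the gptel model for the current buffer only.
(setq-local gptel-model "claude-opus-4-5-20251101")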
5. Troubleshooting
Common Issues
1. Connection refused: Check that KorProxy is running; look for the status indicator in the menu bar.
2. Authentication errors: Verify the provider is authenticated in KorProxy for the model you're using.
3. Package not loading config: Ensure :config runs after package load. Try M-x eval-buffer after changes.
4. SSL/TLS errors: KorProxy uses HTTP locally. Ensure you're using http://, not https://.
5. Missing dependencies: Some packages need curl or plz.el. Check package requirements (a quick availability check follows this list).
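For the last item, you can check from inside Emacs whether curl is on your PATH and whether plz.el is installed:

;; Report whether the common HTTP dependencies are available.
(message "curl: %s, plz.el: %s"
         (if (executable-find "curl") "found" "missing")
         (if (locate-library "plz") "installed" "not installed"))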
Next Steps
- Supported Models — Full list of available models and capabilities
- Troubleshooting — Common issues and solutions