Use CodeGate with Aider
Aider is an open source AI coding assistant that lets you pair program with LLMs in your terminal.
CodeGate works with the following AI model providers through Aider:

- OpenAI
- Ollama

This guide assumes you have already installed Aider using their installation instructions.

Configure Aider to use CodeGate

To configure Aider to send requests through CodeGate, follow the steps for your provider below.

OpenAI

You need an OpenAI API account to use this provider.
Before you run Aider, set environment variables for your OpenAI API key and for the API base URL, pointing the base URL at CodeGate's API port. Alternatively, use one of Aider's other supported configuration methods to set the corresponding values.
macOS / Linux:

```shell
export OPENAI_API_KEY=<YOUR_API_KEY>
export OPENAI_API_BASE=http://localhost:8989/openai
```

To persist these variables, add them to your shell profile (e.g., `~/.bashrc` or `~/.zshrc`).

Windows:

```shell
setx OPENAI_API_KEY <YOUR_API_KEY>
setx OPENAI_API_BASE http://localhost:8989/openai
```

Restart your shell after running `setx`.
Replace `<YOUR_API_KEY>` with your OpenAI API key.

Then run `aider` as normal. For more information, see the Aider docs for connecting to OpenAI.
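If you prefer not to export variables in your profile, you can also pass the same values on the command line when you launch Aider. A minimal sketch, assuming Aider's `--openai-api-key` and `--openai-api-base` flags (verify the flag names against the Aider configuration docs for your version):

```shell
# Sketch: point a single Aider session at CodeGate without touching
# your shell profile. Flag names are assumed to mirror the
# OPENAI_API_KEY / OPENAI_API_BASE environment variables.
aider --openai-api-key "<YOUR_API_KEY>" \
      --openai-api-base "http://localhost:8989/openai"
```

This scopes the CodeGate routing to one session, which is handy for trying CodeGate out before persisting the settings.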
Ollama

You need Ollama installed on your local system with the server running (`ollama serve`) to use this provider.
CodeGate connects to `http://host.docker.internal:11434` by default. If you changed the default Ollama server port or want to connect to a remote Ollama instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable set to the correct URL. See Configure CodeGate.
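As a sketch, overriding the Ollama URL when launching the CodeGate container might look like the following. The image name, port mapping, and example Ollama address are assumptions; use the values from your CodeGate installation instructions:

```shell
# Hypothetical launch command; adjust the image, ports, and volume to
# match your CodeGate installation. CODEGATE_OLLAMA_URL points CodeGate
# at a remote Ollama instance instead of the default
# host.docker.internal:11434.
docker run -d --name codegate \
  -p 8989:8989 \
  -e CODEGATE_OLLAMA_URL=http://192.168.1.50:11434 \
  ghcr.io/stacklok/codegate:latest
```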
Before you run Aider, set the Ollama base URL to CodeGate's API port using an environment variable. Alternatively, use one of Aider's other supported configuration methods to set the corresponding value.
macOS / Linux:

```shell
export OLLAMA_API_BASE=http://localhost:8989/ollama
```

To persist this setting, add it to your shell profile (e.g., `~/.bashrc` or `~/.zshrc`) or use one of Aider's other supported configuration methods.

Windows:

```shell
setx OLLAMA_API_BASE http://localhost:8989/ollama
```

Restart your shell after running `setx`.
Then run Aider:

```shell
aider --model ollama/<MODEL_NAME>
```

Replace `<MODEL_NAME>` with the name of a coding model you have installed locally using `ollama pull`.

We recommend the Qwen2.5-Coder series of models. Our minimum recommendation for quality results is the 7 billion parameter (7B) version, `qwen2.5-coder:7b`.
This model balances performance and quality for typical systems with at least 4 CPU cores and 16GB of RAM. If you have more compute resources available, our experimentation shows that larger models do yield better results.
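Putting the steps together, pulling the recommended model and starting Aider against it might look like this (assuming a local Ollama install and the `OLLAMA_API_BASE` variable set above):

```shell
# One-time download of the recommended coding model
ollama pull qwen2.5-coder:7b

# Start Aider using that model, routed through CodeGate
aider --model ollama/qwen2.5-coder:7b
```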
For more information, see the Aider docs for connecting to Ollama.
Verify configuration
To verify that you've successfully connected Aider to CodeGate, type `/ask codegate-version` into the Aider chat in your terminal. You should receive a response like "CodeGate version 0.1.0".
Next steps
Learn more about CodeGate's features.
Remove CodeGate
If you decide to stop using CodeGate, follow these steps to remove it and revert your environment.
1. Stop Aider and unset the environment variables you set during the configuration process:

   - OpenAI: `unset OPENAI_API_BASE` (macOS/Linux) or `setx OPENAI_API_BASE ""` (Windows)
   - Ollama: `unset OLLAMA_API_BASE` (macOS/Linux) or `setx OLLAMA_API_BASE ""` (Windows)

2. Re-launch Aider.

3. Stop and remove the CodeGate container:

   ```shell
   docker stop codegate && docker rm codegate
   ```

4. If you launched CodeGate with a persistent volume, delete it to remove the CodeGate database and other files:

   ```shell
   docker volume rm codegate_volume
   ```