Choose how you want to connect to an LLM.
Easiest setup. Uses our cloud models.
Sign in to use cloud models
Connect to llama.cpp, LM Studio, Ollama, or any local server.
Free and unlimited, running on your hardware.
Connect to OpenRouter, Together AI, or any OpenAI-compatible endpoint.
Bring your own API key.
No connections yet.
Add your first connection to get started.
This will permanently remove the connection. Any model stacks that reference it will be updated.