The ultimate power tool for operators. Run headless on servers, orchestrate Docker containers, and pipe CLI output directly into your agent's context.
Run Luca as a background service on your servers or a Raspberry Pi. Access the agent via the WebSocket API or the CLI.
First-class integration with the Docker Engine. Luca can spin up containers, inspect logs, and manage deployments.
Swap out the underlying LLM backend. Support for Ollama, llama.cpp, or vLLM running locally.
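As a sketch of what backend selection could look like, here is a hypothetical config fragment. The file layout, section, and key names (`backend`, `model`, `base_url`) are illustrative assumptions, not Luca's documented schema; only the Ollama default port (11434) is a real convention.

```toml
# Hypothetical Luca config — key names are illustrative, not a documented schema.
[llm]
backend  = "ollama"                   # assumed values: "ollama", "llamacpp", "vllm"
model    = "llama3.1:8b"              # model tag as known to the local backend
base_url = "http://localhost:11434"   # Ollama's default local API endpoint
```

Pointing `base_url` at a different local server (e.g. a llama.cpp or vLLM instance) would be the natural way to swap backends without touching the rest of the setup.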
Install the latest release via the install script:
curl -fsSL https://get.luca.os/install.sh | sh