Run Cogitator on your own hardware. Your data stays with you.
Download the macOS app and drag it to your Applications folder.
Open Cogitator, go to the Models page, and configure your AI provider (OpenAI, Anthropic, Google, or others). You can also use local models through Ollama for fully offline operation.
Start a conversation. The app and your data stay on your Mac; if you use a cloud provider, prompts go to that provider, while an Ollama model keeps inference fully local.
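If you want to try the fully local route, you can pull a model with Ollama and confirm its API is reachable before pointing Cogitator at it. A minimal sketch, assuming Ollama's default setup; the model name `llama3.2` is only an example:

```shell
# Pull a model to run locally (model name is an example)
ollama pull llama3.2

# Verify the Ollama API is up on its default port (11434)
curl http://localhost:11434/api/tags
```

If the `curl` call returns a JSON list of models, Cogitator should be able to connect to Ollama at that address.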
Pull the image:
docker pull deiu/cogitator
Run the container:
docker run -d \
--name cogitator \
-p 8484:8484 \
-v cogitator-data:/data \
deiu/cogitator
Open http://localhost:8484 in your browser.
Go to the Models page and configure your AI provider.
You can also use local models through Ollama. Note that when Cogitator runs in a container, "localhost" refers to the container itself, so point it at your host's Ollama instance (for example via host.docker.internal on Docker Desktop).
For full configuration options, environment variables, and Docker Compose examples, see the Docker Hub documentation.
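If you prefer Docker Compose, a minimal sketch equivalent to the `docker run` command above might look like the following; the `restart` policy is an assumption, not part of the original command, and the Docker Hub documentation remains the authoritative source:

```yaml
# Hypothetical docker-compose.yml mirroring the `docker run` flags above
services:
  cogitator:
    image: deiu/cogitator
    container_name: cogitator
    ports:
      - "8484:8484"
    volumes:
      - cogitator-data:/data
    restart: unless-stopped   # assumption: keep the container running across reboots

volumes:
  cogitator-data:
```

Start it with `docker compose up -d` from the directory containing this file.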