Aichat with local docs

Want to chat with your proprietary documentation on your own machine with completely free tools? Here’s how. You will need aichat, an AI CLI tool, and Ollama, a local model server. First, let’s get Ollama set up.
Install
Download and install Ollama from https://ollama.com/download. Then pull a chat model and an embedding model:
ollama pull deepseek-r1:7b
ollama pull nomic-embed-text:latest
That’s it for Ollama. Now let’s get aichat set up. Install it with:
brew install aichat
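Before moving on to configuration, it can help to confirm that both binaries actually landed on your PATH. A quick sketch (works in any POSIX shell):

```shell
# Check that ollama and aichat are on PATH; prints one status line per tool
for cmd in ollama aichat; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: not installed"
  fi
done
```

If either tool reports "not installed", revisit the corresponding install step before continuing.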
Configure
Configure ~/.config/aichat/config.yaml with:
# see https://github.com/sigoden/aichat/blob/main/config.example.yaml
model: ollama:deepseek-r1:7b
clients:
- type: openai-compatible
  name: ollama
  api_base: http://localhost:11434/v1
  models:
  - name: deepseek-r1:7b
    max_input_tokens: 131072
    supports_reasoning: true
  - name: nomic-embed-text
    type: embedding
    max_tokens_per_chunk: 8192
    default_chunk_size: 1000
    max_batch_size: 50
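If you prefer to script the setup, the same YAML can be written from the shell with a heredoc. A sketch (note: this overwrites any existing config, so back yours up first):

```shell
# Write the aichat config non-interactively.
# WARNING: this replaces any existing ~/.config/aichat/config.yaml.
mkdir -p "$HOME/.config/aichat"
cat > "$HOME/.config/aichat/config.yaml" <<'EOF'
model: ollama:deepseek-r1:7b
clients:
- type: openai-compatible
  name: ollama
  api_base: http://localhost:11434/v1
  models:
  - name: deepseek-r1:7b
    max_input_tokens: 131072
    supports_reasoning: true
  - name: nomic-embed-text
    type: embedding
    max_tokens_per_chunk: 8192
    default_chunk_size: 1000
    max_batch_size: 50
EOF
grep -q 'api_base' "$HOME/.config/aichat/config.yaml" && echo "config written"
```

The api_base points at Ollama’s default OpenAI-compatible endpoint on port 11434; if you run Ollama on a different host or port, adjust it accordingly.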
Run
Finally, run aichat in REPL mode and start a new RAG instance for your docs called mydocs:
aichat
.rag mydocs
You will now be prompted to add documents. These can be any text you want to chat with. You can even provide a URL to pull content from. Here’s an example of using the aichat wiki as RAG content. To add more documents to this instance, just type
.edit rag-docs
and add more docs to the file.
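Put together, a first session might look roughly like this (the exact prompt wording varies by aichat version, and the document path here is a placeholder):

```
> .rag mydocs
...aichat prompts you to pick an embedding model and add documents...
/path/to/your/docs
> How do I deploy the service?
```

If you’d rather not open the REPL, aichat can also query a saved RAG instance directly, e.g. aichat --rag mydocs "your question".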
You’re done!
🚀 That’s it! You’re now chatting with your own proprietary docs. To exit, type .exit. To get started again, just run aichat and type .rag mydocs. Enjoy!