A Zsh-based CLI tool providing direct terminal access to local Ollama models with context management, session persistence, and customizable system behavior.
- 🗣️ Streaming Output: Continuous token-by-token response display from your local Ollama models.
- 🧠 Global Session Management: Maintains a single conversation context across all terminal instances, with automatic token limit handling regardless of which terminal window you use.
- 📝 Global Persistent Memory: System-wide notes file (`~/.config/ollama/ai_persistent_notes.txt`) accessible to the model across all sessions.
- 🤖 Customizable System Prompt: Define model behavior and parameters via `~/.config/ollama/ai_system_prompt.txt`.
- ⚙️ Configurable Settings: Set the Ollama model, API endpoint, and context limits in `~/.config/ollama/ai_settings.conf`.
- ✍️ Multi-line Input: Support for complex multi-line queries.
- 🛠️ Session Management:
  - Edit notes, system prompts, and settings with `ai` subcommands (e.g., `ai --edit-notes`).
  - Reset context (`ai --reset`).
  - View context info (`ai --info context`).
- 🚀 Simple Installation: Automated setup via installer script.
Make sure you have these installed:

- Ollama: Running with a model pulled (e.g., `ollama pull gemma3:4b-it-qat`). See ollama.com.
- jq: JSON processor (e.g., `brew install jq` or `apt-get install jq`).
- curl: Data transfer tool (usually pre-installed).
- zsh: The Z shell.
1. Clone or Download:

   ```
   # If you have git
   git clone <your_repository_url>   # Replace <your_repository_url> with the actual URL
   cd personal_ollama_cli
   # If downloaded, navigate to the personal_ollama_cli directory
   ```

2. Run the Installer: From the `personal_ollama_cli` directory:

   ```
   ./install.sh
   ```

   The installer will:

   - Create config and cache directories (`~/.config/ollama`, `~/.config/zsh`, `~/.cache`).
   - Copy `ollama_ai.zsh` and default configs.
   - Add a source line to `~/.zshrc`.

3. Apply Changes: Open a new terminal or run:

   ```
   source ~/.zshrc
   ```
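The installer's change to `~/.zshrc` is a single sourcing line; assuming the install location above (`~/.config/zsh`), it looks something like:

```shell
# Added by install.sh (exact path assumed from the install steps above)
source ~/.config/zsh/ollama_ai.zsh
```

Knowing this line makes it easy to find and remove by hand later if needed.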
Interact with your Ollama model using the `ai` command:

- Basic Prompt:

  ```
  ai Tell me a joke
  ```

- Multi-line Input (end with `"""` on a new line):

  ```
  ai """
  What are the best practices
  for writing a good README file?
  """
  ```

- Settings:
  - `ai --show-settings`: View current settings.
  - `ai --edit-settings`: Open the settings file in your editor.
- Persistent Notes:
  - `ai --view-notes`: Show your persistent notes.
  - `ai --edit-notes`: Edit your notes. Psst! Customize this with your info!
- System Prompt:
  - `ai --view-system`: Display the system prompt.
  - `ai --edit-system`: Customize the AI's base instructions.
- Conversation Context:
  - `ai --info context`: Show context token count and limit.
  - `ai --reset`: Clear the conversation context.
  - `ai -r "New prompt"`: Reset the context and send a new prompt.
- Temporary Overrides:
  - `ai -m <model_name> "Your prompt"`: Use a different model for this query (e.g., `ai -m gemma3:12b-it-qat "Hi"`).
  - `ai -s "New system prompt" "Your prompt"`: Use a different system prompt (resets context).
- Help:
  - `ai --help` or `ai -h`: Show all commands and options.
- `~/.config/ollama/ai_settings.conf`: Change the default model (`AI_OLLAMA_MODEL`), API URL (`AI_OLLAMA_API_URL`), and max context tokens (`AI_MAX_CONTEXT_TOKENS`).
- `~/.config/ollama/ai_persistent_notes.txt`: Global information store for the AI across all sessions.
- `~/.config/ollama/ai_system_prompt.txt`: Define the AI's personality and default instructions.
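As a sketch, a minimal `ai_settings.conf` could look like this — the values shown are illustrative assumptions (the port is Ollama's default), not the shipped defaults:

```shell
# ~/.config/ollama/ai_settings.conf — illustrative values
AI_OLLAMA_MODEL="gemma3:4b-it-qat"           # any model you have pulled
AI_OLLAMA_API_URL="http://localhost:11434"   # Ollama's default endpoint
AI_MAX_CONTEXT_TOKENS=8192                   # context is trimmed beyond this
```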
- Remove `~/.config/ollama/`, `~/.config/zsh/ollama_ai.zsh`, and `~/.cache/ollama_ai_context.json`.
- Remove the sourcing line from `~/.zshrc`.
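A minimal sketch of those steps as shell commands, assuming the default install paths (`sed -i.bak` keeps a backup of `~/.zshrc` before deleting the sourcing line):

```shell
# Remove the tool's config, script, and cached context
rm -rf ~/.config/ollama
rm -f ~/.config/zsh/ollama_ai.zsh ~/.cache/ollama_ai_context.json

# Drop the sourcing line from ~/.zshrc (a ~/.zshrc.bak backup is kept)
if [ -f ~/.zshrc ]; then
  sed -i.bak '/ollama_ai.zsh/d' ~/.zshrc
fi
```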
© All rights reserved.
