Ollama
Run Commands
Start interactive chat sessions with LLMs using Ollama. Run models with custom parameters, set system prompts, and control generation settings.
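A typical session might look like the sketch below. The model name `llama3.2` is just an example; substitute any model you have pulled.

```shell
# Start an interactive chat (downloads the model on first use)
ollama run llama3.2

# One-shot prompt instead of an interactive session
ollama run llama3.2 "Summarize the plot of Hamlet in two sentences."

# Inside the interactive session, generation settings can be changed:
# >>> /set parameter temperature 0.3
# >>> /set system "You are a terse assistant."
```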
7 commands
Pro Tips
Press Ctrl+D to exit chat, or type '/bye' to end the session gracefully.
Use '/set system <prompt>' in chat to change the system prompt mid-conversation.
Pass '--verbose' to see generation speed and token statistics.
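The tips above combine into a workflow like this sketch (model name is an example):

```shell
# Show timing and tokens-per-second statistics after each response
ollama run llama3.2 --verbose

# Inside the chat:
# >>> /set system You are a pirate.   # change the system prompt mid-conversation
# >>> /bye                            # end the session gracefully (or press Ctrl+D)
```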
Common Mistakes
Expecting the first prompt to be fast: the first run loads the model into memory, so subsequent prompts are much quicker.
Running multiple models simultaneously without checking available memory: each loaded model multiplies RAM/VRAM requirements.
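To avoid the memory pitfalls above, you can inspect and unload models explicitly. A sketch, assuming a recent Ollama version (the `stop` subcommand is not present in very old releases):

```shell
# List models currently loaded in memory and how much RAM/VRAM each uses
ollama ps

# Unload a model immediately instead of waiting for its keep-alive timeout
ollama stop llama3.2
```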