# Kowalski
> "AI agents are like pets - they're cute but they make a mess." - Anonymous AI Developer
>
> "Programming is like writing a love letter to a computer that doesn't love you back." - Unknown

A Rust-based agent for interacting with Ollama models. Because apparently, we need another way to talk to AI.
## Project Overview
This project implements a basic agent that can communicate with Ollama's API, supporting both regular chat and streaming responses. It's built as a learning exercise and foundation for more complex agent implementations.
> "Simplicity is prerequisite for reliability." - Edsger W. Dijkstra
## Features

> "Features are like promises - they're great until you try to use them." - A Disappointed User
- 🤖 **Multiple Model Support**: Because one AI model is never enough
- 💬 **Conversation Management**: Keep track of your AI's ramblings
- 🎭 **Role-Based Interactions**: Give your AI a personality (or at least pretend to)
- 📝 **PDF and Text File Support**: Read files because typing is too mainstream
- 🔄 **Streaming Responses**: Watch your AI think in real time (it's more exciting than it sounds)
- ⚙️ **Configurable Settings**: Customize everything until it breaks
## Installation

> "Installation is like cooking - it's easy until you burn something." - A Frustrated Developer
1. Clone the repository (because copying files manually is so last year)
2. Build the project (and pray it works)
3. Run the agent (and hope for the best)
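Assuming the standard Cargo workflow, the three steps look roughly like this (the repository URL below is a placeholder - substitute the project's actual URL):

```shell
# Clone the repository (placeholder URL - use the real one)
git clone https://github.com/<your-org>/kowalski.git
cd kowalski

# Build the project in release mode
cargo build --release

# Run the agent
cargo run --release
```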
## Usage

> "Usage instructions are like recipes - nobody reads them until something goes wrong." - A Support Agent

### Basic Usage
```rust
// Note: argument and type names below are illustrative - check the crate docs for exact signatures.
use kowalski::{AcademicAgent, Config, ToolingAgent};

// Load configuration
let config = Config::load()?;

// Create agents (because one agent is never enough)
let academic_agent = AcademicAgent::new(config.clone())?;
let tooling_agent = ToolingAgent::new(config)?;

// Start conversations (double the fun, double the existential crisis)
let model_name = "llama2";
let academic_conv_id = academic_agent.start_conversation(model_name);
let tooling_conv_id = tooling_agent.start_conversation(model_name);
```
### General Chat
```rust
use kowalski::GeneralAgent;

// Create a general-purpose chat agent
let general_agent = GeneralAgent::new(config)?;

// Optionally customize the system prompt
let general_agent = general_agent.with_system_prompt("You are a helpful assistant.");

// Start a conversation
let conv_id = general_agent.start_conversation("llama2");

// Simple chat interaction (prompt text is illustrative)
let mut response = general_agent
    .chat_with_history(&conv_id, "Hello! What can you do?")
    .await?;

// Process streaming response
while let Some(chunk) = response.chunk().await? {
    print!("{}", chunk);
}

// Continue the conversation with context
let mut response = general_agent
    .chat_with_history(&conv_id, "Tell me more about that.")
    .await?;
```
### Academic Research
```rust
use kowalski::Role;

// Create a role for academic translation
let role = Role::translator();

// Process a research paper (path and argument shape are illustrative)
let mut response = academic_agent
    .chat_with_history(&conv_id, "path/to/paper.pdf", Some(role))
    .await?;

// Process streaming response
while let Some(chunk) = response.chunk().await? {
    print!("{}", chunk);
}
```
### Web Research
```rust
// Perform web search
let query = "Latest developments in Rust programming";
let search_results = tooling_agent.search(query).await?;

// Process search results (field names are illustrative)
for result in &search_results {
    println!("{} - {}", result.title, result.url);
}

// Fetch and analyze a webpage (method name is illustrative)
if let Some(result) = search_results.first() {
    let page = tooling_agent.fetch_page(&result.url).await?;
    println!("{}", page);
}
```
## Configuration

> "Configuration is like a relationship - it's complicated until you give up." - A System Administrator

The agent can be configured using a TOML file or environment variables, for example:
```toml
# Section and key names are illustrative; the values match the defaults shown above.
[ollama]
base_url = "http://localhost:11434"
default_model = "mistral-small"

[chat]
temperature = 0.7
max_tokens = 512
stream = true
```
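Since environment variables are mentioned as an alternative, the overrides might look like this - the variable names here are assumptions, not documented names:

```shell
# Hypothetical environment-variable overrides (names are illustrative)
export KOWALSKI_OLLAMA_BASE_URL="http://localhost:11434"
export KOWALSKI_DEFAULT_MODEL="mistral-small"
```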
## Contributing

> "Contributing is like dating - it's fun until someone suggests changes." - An Open Source Maintainer
Contributions are welcome! Please feel free to submit a Pull Request. Just remember:
- Keep it clean (unlike my code)
- Add tests (because we all love writing tests)
- Update documentation (because reading code is so last year)
## License

> "Licenses are like prenuptial agreements - they're boring until you need them." - A Lawyer
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

> "Acknowledgments are like thank you notes - they're nice but nobody reads them." - A Grateful Developer
- Thanks to the Ollama team for making this possible
- Thanks to all contributors who helped make this project better
- Thanks to my coffee machine for keeping me awake during development
## Vision

@see features

## Roadmap

@see roadmap