mini_ollama_client

Crates.io: mini_ollama_client
lib.rs: mini_ollama_client
version: 0.1.0
created_at: 2025-01-10 21:09:10.694548+00
updated_at: 2025-01-10 21:09:10.694548+00
description: Simple Ollama client with minimal dependencies in Rust
repository: https://github.com/roquess/mini_ollama_client
id: 1511733
size: 7,453
owner: Roques Steve (roquess)
README

mini_ollama_client

Simple Ollama client with minimal dependencies in Rust

Ollama Client Library

This is a small Rust library for talking to an Ollama server. It keeps dependencies to a minimum and exposes just enough functionality to send a prompt to the server and receive the response.
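
Under the hood this amounts to a single HTTP round trip: Ollama's standard API accepts a POST to /api/generate and returns the generated text in a JSON response. The crate's wire code is not reproduced here, but as a rough sketch, the request body it needs to send looks like this (build_generate_body is a hypothetical helper; real code would also JSON-escape the prompt):

// Ollama's documented generate endpoint:
//   POST http://localhost:11434/api/generate
// A non-streaming request body, assembled by hand to stay dependency-free.
// Note: quotes and backslashes in `prompt` would need escaping; this
// sketch skips that for brevity.
fn build_generate_body(model: &str, prompt: &str) -> String {
    format!(r#"{{"model":"{model}","prompt":"{prompt}","stream":false}}"#)
}

// build_generate_body("phi3", "Hello ollama.") produces:
// {"model":"phi3","prompt":"Hello ollama.","stream":false}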

License

This project is licensed under the MIT License. See the LICENSE file for details.

Features

  • Minimal dependencies
  • Simple API to send requests to the Ollama server
  • Default model "phi3" when none is specified (see the sketch after this list)
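
The default-model fallback is the usual Option pattern; here is a minimal sketch (an illustration, not the crate's actual source) of how that kind of default is applied:

fn resolve_model(model: Option<&str>) -> &str {
    // Fall back to "phi3" when the caller does not name a model.
    model.unwrap_or("phi3")
}

fn main() {
    assert_eq!(resolve_model(None), "phi3");
    // "llama3" stands in for any explicitly chosen model.
    assert_eq!(resolve_model(Some("llama3")), "llama3");
}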

Usage

Add the following to your Cargo.toml:

[dependencies]
mini_ollama_client = "0.1.0"
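
Before running the example below, make sure an Ollama server is reachable (it listens on localhost:11434 by default) and that the model you plan to use has been pulled, e.g. with ollama pull phi3.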

Example

Here is an example of how to use the mini_ollama_client library:

use mini_ollama_client::send_request;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    println!("Starting the Ollama client...");

    let server = "localhost:11434";
    let prompt = "Hello ollama.";
    let model = None; // You can specify a model here or leave None to use "phi3" by default

    match send_request(server, prompt, model) {
        Ok(response) => println!("Response: {}", response),
        Err(e) => eprintln!("Error: {}", e),
    }

    Ok(())
}
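
To target a specific model instead of the default, pass it as Some(...). Here is a variant of the example above; "llama3" is only an illustrative name and can be any model already available on your server:

use mini_ollama_client::send_request;

fn main() {
    // "llama3" is an illustrative model name, not a crate default.
    let model = Some("llama3");

    match send_request("localhost:11434", "Summarize Rust's ownership rules.", model) {
        Ok(response) => println!("Response: {}", response),
        Err(e) => eprintln!("Error: {}", e),
    }
}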

Contributing

Contributions are welcome! Please open an issue or submit a pull request.
