sermo 0.1.0

A Rust client library for interacting with various LLM provider APIs

  • Repository: https://github.com/mdizak/rust-sermo
  • Documentation: https://docs.rs/sermo
  • Published: 2025-03-23
  • Crate size: 40,451 bytes
  • Owner: mdizak

README

Sermo

A Rust client library for interacting with various Large Language Model (LLM) provider APIs.

Crates.io · Docs.rs · License: MIT

Features

  • Supports multiple LLM providers (Ollama, OpenAI, Anthropic, Google, X.ai, Mistral, DeepSeek, Groq, TogetherAI, and more)
  • Simple API for sending chat messages and receiving responses
  • Configurable model settings (temperature, max tokens)
  • Flexible JSON extraction from responses (see the sketch after this list)
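
The crate advertises flexible JSON extraction, but this README does not show the helper's API. As a generic illustration only (not sermo's own interface), the sketch below pulls the first JSON object out of a model reply with serde_json; the `extract_json` helper is hypothetical.

```rust
// Hypothetical helper: sermo's own extraction API is not documented here.
// Requires `serde_json = "1"` in Cargo.toml.
use serde_json::Value;

/// Find the outermost `{ ... }` span in an LLM reply and try to parse it.
fn extract_json(reply: &str) -> Option<Value> {
    let start = reply.find('{')?;
    let end = reply.rfind('}')?;
    serde_json::from_str(reply.get(start..=end)?).ok()
}

fn main() {
    let reply = "Sure! Here you go:\n{\"language\": \"Rust\", \"year\": 2015}";
    if let Some(json) = extract_json(reply) {
        println!("language = {}", json["language"]);
    }
}
```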

Installation

Add this to your Cargo.toml:

```toml
[dependencies]
sermo = "0.1.0"
```

Usage
Here's a quick example of using Sermo with Ollama:
```rust
use sermo::{LlmProfile, LlmProvider};

fn main() -> Result<(), std::io::Error> {
    // Configure a profile for a local Ollama server (no API key required).
    let profile = LlmProfile {
        provider: LlmProvider::ollama,
        api_key: String::new(),
        model_name: "llama2".to_string(),
        temperature: Some(0.7),
        max_tokens: Some(100),
        api_url: "http://localhost:11434/api/chat".to_string(),
    };

    // Send a single prompt and print the model's reply.
    let response = profile.send_single("Hello! Tell me something about Rust.")?;
    println!("Response: {}", response);

    Ok(())
}
```
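
The same LlmProfile shape should apply to hosted providers; the sketch below is an assumed OpenAI configuration. The LlmProvider::openai variant name and the chat-completions URL are guesses (the variant spelling follows the lowercase ollama above), so check the crate docs before relying on them.

```rust
use sermo::{LlmProfile, LlmProvider};

fn main() -> Result<(), std::io::Error> {
    // Assumed hosted-provider setup: the `openai` variant name is inferred
    // from the lowercase `ollama` variant above, not confirmed by the docs.
    let profile = LlmProfile {
        provider: LlmProvider::openai,
        api_key: std::env::var("OPENAI_API_KEY").unwrap_or_default(),
        model_name: "gpt-4o-mini".to_string(),
        temperature: Some(0.2),
        max_tokens: Some(256),
        api_url: "https://api.openai.com/v1/chat/completions".to_string(),
    };

    let response = profile.send_single("Summarize Rust's ownership model in one sentence.")?;
    println!("Response: {}", response);
    Ok(())
}
```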

Run the example with:
```bash
cargo run --example ollama
```

Supported Providers

  • Ollama
  • OpenAI
  • Anthropic
  • Google Gemini
  • X.ai
  • Mistral
  • DeepSeek
  • Groq
  • TogetherAI
  • Custom (via the other provider; see the sketch below)
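
The Custom entry suggests any OpenAI-compatible endpoint can be targeted by overriding api_url. The fragment below assumes an LlmProvider::other variant (name inferred from the list item, not verified) pointing at a locally hosted server:

```rust
use sermo::{LlmProfile, LlmProvider};

fn main() -> Result<(), std::io::Error> {
    // `other` is a guessed variant name based on "Custom (via other provider)".
    let profile = LlmProfile {
        provider: LlmProvider::other,
        api_key: String::new(),
        model_name: "my-local-model".to_string(),
        temperature: Some(0.7),
        max_tokens: Some(100),
        // Point api_url at any self-hosted, OpenAI-compatible server.
        api_url: "http://localhost:8000/v1/chat/completions".to_string(),
    };

    println!("{}", profile.send_single("ping")?);
    Ok(())
}
```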

Documentation

Full documentation is available on Docs.rs: https://docs.rs/sermo

Contributing

Contributions are welcome! Please open an issue or submit a pull request on GitHub: https://github.com/mdizak/rust-sermo

License

This project is licensed under the MIT License; see the LICENSE file for details.

Author

Matt Dizak <matt@cicero.sh>

