aidale-provider

Crates.io: aidale-provider
lib.rs: aidale-provider
version: 0.1.0
created_at: 2025-11-01 04:13:55.702239+00
updated_at: 2025-11-01 04:13:55.702239+00
description: AI provider implementations for Aidale (OpenAI, Anthropic, etc.)
homepage: https://github.com/hanxuanliang/aidale
repository: https://github.com/hanxuanliang/aidale
size: 84,086
owner: hanhotfox (hanxuanliang)
documentation: https://docs.rs/aidale

README

aidale-provider

AI provider implementations for Aidale (OpenAI, DeepSeek, etc.).


Overview

aidale-provider contains concrete implementations of the Provider trait from aidale-core:

  • OpenAI: GPT-3.5, GPT-4, and compatible APIs
  • DeepSeek: DeepSeek Chat with automatic configuration
  • Extensible for custom providers

Supported Providers

OpenAI

```rust
use aidale_provider::OpenAiProvider;

// Default OpenAI API
let provider = OpenAiProvider::new("your-api-key");

// Custom base URL (for compatible APIs)
let provider = OpenAiProvider::builder()
    .api_key("your-api-key")
    .api_base("https://api.custom.com/v1")
    .build_with_id("custom", "Custom API")?;
```

DeepSeek

```rust
use aidale_provider::deepseek;

// One-liner setup with automatic configuration
let provider = deepseek("your-api-key")?;
```

The deepseek() helper automatically configures:

  • Base URL: https://api.deepseek.com/v1
  • Provider ID: deepseek
  • Provider name: DeepSeek
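Given the configuration listed above, the one-liner behaves like spelling out the OpenAI-compatible builder yourself. A sketch of the equivalent explicit setup (the helper's actual internals are an assumption):

```rust
use aidale_provider::OpenAiProvider;

// Hypothetical expansion of deepseek("your-api-key")?, assuming the
// helper reuses the builder shown in the OpenAI section: it points the
// OpenAI-compatible client at DeepSeek's base URL and sets the ID/name.
let provider = OpenAiProvider::builder()
    .api_key("your-api-key")
    .api_base("https://api.deepseek.com/v1")
    .build_with_id("deepseek", "DeepSeek")?;
```

This works because DeepSeek exposes an OpenAI-compatible API, so only the base URL and identity differ.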

Features

  • OpenAI-compatible: Works with any OpenAI-compatible API
  • Streaming support: Full support for streaming responses
  • Type-safe: Strongly-typed request/response models
  • Async-first: Built on tokio and reqwest

Usage

Via the main aidale crate:

```toml
[dependencies]
aidale = { version = "0.1", features = ["openai"] }
```

Directly:

```toml
[dependencies]
aidale-provider = "0.1"
aidale-core = "0.1"
```

Examples

Basic Usage

```rust
use aidale_core::{ChatCompletionParams, Message, Provider};
use aidale_provider::OpenAiProvider;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let provider = OpenAiProvider::new("sk-...");

    let params = ChatCompletionParams {
        messages: vec![
            Message::system("You are a helpful assistant."),
            Message::user("Hello!"),
        ],
        ..Default::default()
    };

    let response = provider
        .chat_completion("gpt-3.5-turbo", params)
        .await?;

    println!("{}", response.choices[0].message.content);
    Ok(())
}
```

Streaming

```rust
use futures::StreamExt;

let mut stream = provider
    .stream_chat_completion("gpt-3.5-turbo", params)
    .await?;

while let Some(chunk) = stream.next().await {
    let chunk = chunk?;
    // Borrow the delta content rather than moving it out of the chunk.
    if let Some(content) = &chunk.choices[0].delta.content {
        print!("{}", content);
    }
}
```
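When the complete reply is needed once streaming finishes, the deltas can be accumulated into a single string. A minimal sketch, assuming the same chunk shape as the loop above:

```rust
use futures::StreamExt;

let mut stream = provider
    .stream_chat_completion("gpt-3.5-turbo", params)
    .await?;

// Collect every delta into the full assistant message.
let mut full_text = String::new();
while let Some(chunk) = stream.next().await {
    let chunk = chunk?;
    if let Some(content) = &chunk.choices[0].delta.content {
        full_text.push_str(content);
    }
}
println!("{}", full_text);
```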

Custom Providers

Implement the Provider trait from aidale-core:

```rust
use aidale_core::{ChatCompletionParams, ChatCompletionResponse, Provider, Result};
use async_trait::async_trait;

pub struct MyCustomProvider {
    // Your fields
}

#[async_trait]
impl Provider for MyCustomProvider {
    fn id(&self) -> &str { "custom" }
    fn name(&self) -> &str { "My Custom Provider" }

    async fn chat_completion(
        &self,
        model: &str,
        params: ChatCompletionParams,
    ) -> Result<ChatCompletionResponse> {
        // Your implementation
        todo!()
    }

    // ... other methods
}
```
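Once implemented, a custom provider is usable anywhere the trait is expected, e.g. behind a trait object so providers are interchangeable at runtime. A sketch; the helper function and model name here are hypothetical, not part of the crate's API:

```rust
use aidale_core::{ChatCompletionParams, Message, Provider};

// Hypothetical helper: accepts any Provider implementation via dynamic
// dispatch (async_trait-based traits remain object-safe), so OpenAiProvider,
// the deepseek() provider, or MyCustomProvider can all be passed in.
async fn greet(provider: &dyn Provider) -> Result<(), Box<dyn std::error::Error>> {
    let params = ChatCompletionParams {
        messages: vec![Message::user("Hello!")],
        ..Default::default()
    };
    let response = provider.chat_completion("my-model", params).await?;
    println!("[{}] {}", provider.name(), response.choices[0].message.content);
    Ok(())
}
```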

Related Crates

  • aidale: the main crate; enables providers via feature flags (e.g. openai)
  • aidale-core: the Provider trait and core request/response types

License

MIT OR Apache-2.0
