| Crates.io | aisdk |
| lib.rs | aisdk |
| version | 0.4.0 |
| created_at | 2025-09-07 11:36:39.212704+00 |
| updated_at | 2026-01-24 10:30:28.039727+00 |
| description | An open-source Rust library for building AI-powered applications, inspired by the Vercel AI SDK. It provides a robust, type-safe, and easy-to-use interface for interacting with various Large Language Models (LLMs). |
| homepage | https://github.com/lazy-hq/aisdk |
| repository | https://github.com/lazy-hq/aisdk |
| max_upload_size | |
| id | 1828054 |
| size | 674,469 |
An open-source, provider-agnostic Rust library for building AI-powered applications, inspired by the Vercel AI SDK. It provides a type-safe interface for interacting with Large Language Models (LLMs) and offers seamless support for Rust backend frameworks as well as popular UI frameworks like React, Solid, Vue, Svelte, and more.
To learn more about how to use the AI SDK, check out our Documentation and API Reference.
cargo add aisdk
Enable the providers of your choice, such as OpenAI, Anthropic, Google, and more:
cargo add aisdk --features openai
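Equivalently, the dependency and feature can be declared directly in Cargo.toml (a sketch; pin the version to the release you target):

```toml
[dependencies]
aisdk = { version = "0.4.0", features = ["openai"] }
```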
use aisdk::core::LanguageModelRequest;
use aisdk::providers::OpenAI;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let openai = OpenAI::gpt_5();
    let result = LanguageModelRequest::builder()
        .model(openai)
        .prompt("What is the meaning of life?")
        .build()
        .generate_text() // or stream_text() for streaming
        .await?;
    println!("Response: {:?}", result.text());
    Ok(())
}
Use the #[tool] macro to expose a Rust function as a callable tool.
use aisdk::core::Tool;
use aisdk::macros::tool;
/// Get the weather information given a location
#[tool]
// The #[tool] macro wraps this function so that calling get_weather()
// yields a registrable Tool. The body returns a Result, so the return
// type must be a Result as well (the original `-> Tool` did not match
// the `Ok(...)` below).
pub fn get_weather(location: String) -> Result<String, Box<dyn std::error::Error>> {
    let weather = match location.as_str() {
        "New York" => 75,
        "Tokyo" => 80,
        _ => 70,
    };
    Ok(weather.to_string())
}
Register tools with an agent so the model can call them during its reasoning loop.
use aisdk::core::{LanguageModelRequest, utils::step_count_is};
use aisdk::providers::OpenAI;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let result = LanguageModelRequest::builder()
        .model(OpenAI::gpt_4o())
        .system("You are a helpful assistant.")
        .prompt("What is the weather in New York?")
        .with_tool(get_weather())
        .stop_when(step_count_is(3)) // Limit the agent loop to 3 steps
        .build()
        .generate_text()
        .await?;
    println!("Response: {:?}", result.text());
    Ok(())
}
Define your target output format.
use serde::Deserialize;
use schemars::JsonSchema;
#[derive(JsonSchema, Deserialize, Debug)]
struct User {
    name: String,
    age: u32,
    email: Option<String>,
}
Use the schema builder method to constrain the model's output to the structure defined above.
use aisdk::core::LanguageModelRequest;
use aisdk::providers::OpenAI;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let user: User = LanguageModelRequest::builder()
        .model(OpenAI::gpt_5())
        .prompt("Generate a random user")
        .schema::<User>()
        .build()
        .generate_text()
        .await?
        .into_schema()?;
    println!("Name: {}", user.name);
    println!("Age: {}", user.age);
    println!("Email: {}", user.email.unwrap_or_default());
    Ok(())
}
The AISDK prompt feature provides a file-based template system for managing AI prompts, built on the Tera template engine. It lets you create reusable prompt templates with variable substitution, conditionals, loops, and template inclusion. See Examples for more template examples. Enable it with cargo add aisdk --features prompt.
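To illustrate the Tera syntax the feature builds on (the file name and variable names here are hypothetical, not part of the AISDK API), a prompt template might look like:

```tera
{# prompts/support.txt — hypothetical template file #}
You are a support assistant for {{ product }}.
{% if history %}
Previous conversation:
{% for msg in history %}
- {{ msg }}
{% endfor %}
{% endif %}
Answer the user's question: {{ question }}
```

Variables like {{ product }} are substituted at render time, while {% if %} and {% for %} blocks let a single template adapt to optional context such as conversation history.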
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Licensed under the MIT License. See LICENSE for details.