Crates.io | rig-core
lib.rs | rig-core
description | An opinionated library for building LLM powered applications.
repository | https://github.com/0xPlaygrounds/rig
Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.
More information about this crate can be found in the crate documentation.
Add Rig to your project with:

```bash
cargo add rig-core
```
Then prompt a model from your application:

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.model("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```
Note: using `#[tokio::main]` requires enabling tokio's `macros` and `rt-multi-thread` features, or simply `full` to enable all features (`cargo add tokio --features macros,rt-multi-thread`).
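For reference, the resulting `[dependencies]` section of your `Cargo.toml` would look roughly like the sketch below. The version numbers are illustrative placeholders; check crates.io for the current releases.

```toml
[dependencies]
# Versions shown are placeholders; use the latest releases from crates.io.
rig-core = "0.5"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```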
Rig supports multiple LLM providers natively; see the crate documentation for the full, up-to-date list.
Additionally, Rig currently has the following integration sub-libraries:
- `rig-mongodb`: vector store implementation for MongoDB
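As a next step beyond the one-shot prompt above, provider clients also expose an agent builder. The sketch below assumes the `agent` and `preamble` builder methods available in recent rig-core releases, so treat it as illustrative rather than canonical.

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Assumes `OPENAI_API_KEY` is set, as in the example above.
    let openai_client = openai::Client::from_env();

    // Sketch: build an agent with a system preamble on top of a base model.
    // `agent`/`preamble` are assumed from the rig-core agent builder API.
    let comedian = openai_client
        .agent("gpt-4")
        .preamble("You are a comedian here to entertain the user with humour and jokes.")
        .build();

    let response = comedian
        .prompt("Entertain me!")
        .await
        .expect("Failed to prompt the agent");

    println!("{response}");
}
```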