| Crates.io | bep-core |
| lib.rs | bep-core |
| version | 0.5.0 |
| created_at | 2024-12-19 04:31:51.905625+00 |
| updated_at | 2024-12-19 04:31:51.905625+00 |
| description | An opinionated library for building LLM powered applications. |
| homepage | |
| repository | https://github.com/bepdotai/bep |
| max_upload_size | |
| id | 1488789 |
| size | 471,493 |
Bep is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.
More information about this crate can be found in the crate documentation.
cargo add bep-core
use bep::{completion::Prompt, providers::openai};
#[tokio::main]
async fn main() {
// Create OpenAI client and model
// This requires the `OPENAI_API_KEY` environment variable to be set.
let openai_client = openai::Client::from_env();
let gpt4 = openai_client.model("gpt-4").build();
// Prompt the model and print its response
let response = gpt4
.prompt("Who are you?")
.await
.expect("Failed to prompt GPT-4");
println!("GPT-4: {response}");
}
Note: using `#[tokio::main]` requires enabling tokio's `macros` and `rt-multi-thread` features, or simply `full` to enable everything (`cargo add tokio --features macros,rt-multi-thread`).
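For reference, a minimal `Cargo.toml` dependencies section for the quickstart above might look like the following sketch (the exact tokio version constraint is up to you):
[dependencies]
bep-core = "0.5.0"
# `macros` and `rt-multi-thread` are enough for `#[tokio::main]`;
# `features = ["full"]` also works if you prefer to enable everything.
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }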
Bep supports the following LLM providers natively:
Additionally, Bep currently has the following integration sub-libraries:
- bep-mongodb