| Field | Value |
|---|---|
| Crates.io | vex-llm |
| lib.rs | vex-llm |
| version | 0.1.4 |
| created_at | 2025-12-14 21:53:30.752806+00 |
| updated_at | 2025-12-20 03:46:51.33019+00 |
| description | LLM provider integrations for VEX |
| homepage | |
| repository | https://github.com/provnai/vex |
| max_upload_size | |
| id | 1985158 |
| size | 262,432 |
LLM provider integrations for the VEX Protocol.
Add the crate to your `Cargo.toml`:

```toml
[dependencies]
vex-llm = "0.1"

# Or, with OpenAI support enabled:
# vex-llm = { version = "0.1", features = ["openai"] }
```
Quick start against a locally running Ollama server:

```rust
use vex_llm::{LlmProvider, OllamaProvider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the provider at a local Ollama instance.
    let provider = OllamaProvider::new("http://localhost:11434");

    // Send a completion request and print the model's response.
    let response = provider.complete("Hello, world!").await?;
    println!("{}", response);
    Ok(())
}
```
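With the `openai` feature enabled, a provider backed by the OpenAI API can be swapped in behind the same `LlmProvider` trait. The sketch below is illustrative only: the `OpenAiProvider` type name and its `new` signature are assumptions, not confirmed API; check the crate documentation for the actual types.

```rust
// Sketch only: `OpenAiProvider` and its constructor signature are
// assumptions, not confirmed API. Requires the "openai" feature.
use vex_llm::{LlmProvider, OpenAiProvider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical constructor; the real crate may accept the key differently.
    let provider = OpenAiProvider::new(std::env::var("OPENAI_API_KEY")?);

    // Same trait method as the Ollama example above, so callers can
    // switch providers without changing the rest of their code.
    let response = provider.complete("Hello, world!").await?;
    println!("{}", response);
    Ok(())
}
```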
MIT License - see LICENSE for details.