| Field | Value |
|---|---|
| Crates.io | ask_ai |
| lib.rs | ask_ai |
| version | 0.1.4 |
| created_at | 2025-02-03 15:58:09.777474+00 |
| updated_at | 2025-06-02 23:04:46.795517+00 |
| description | A library for interacting with various AI frameworks |
| homepage | |
| repository | https://github.com/EduardoNeville/ask_ai |
| max_upload_size | |
| id | 1540736 |
| size | 111,905 |
This Rust crate provides a unified way to call different Large Language Model (LLM) providers, referred to as `Framework`s: OpenAI, Anthropic, and Ollama. It abstracts away the differences between the providers' APIs and exposes a single interface for asking questions and holding conversations with these models.
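To follow the examples below, add the crate to your `Cargo.toml` together with Tokio, since the API is async. The version comes from the header above; the Tokio feature list is an assumption, so enable whatever your setup needs:

```toml
[dependencies]
ask_ai = "0.1.4"
# Assumed features: "macros" and "rt-multi-thread" are what #[tokio::main] needs.
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```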
Before you can use the crate, you need to configure it through the `AiConfig` structure. This configuration tells the system:

- Which `Framework` provider to use (`Framework::OpenAI`, `Framework::Anthropic`, or `Framework::Ollama`).
- Which model to query, e.g. `"chatgpt-4o-latest"` for OpenAI or `"claude-2"` for Anthropic.
- Optionally, the maximum number of tokens allowed in the response.

Example `AiConfig`:

```rust
use ask_ai::config::{AiConfig, Framework};

let ai_config = AiConfig {
    llm: Framework::OpenAI,                 // Specify the Framework provider
    model: "chatgpt-4o-latest".to_string(), // Specify the model
    max_token: Some(1000),                  // Optional: limit max tokens in the response
};
```
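The same structure covers the other providers. For instance, here is a minimal sketch of a local Ollama configuration (the model name is an assumption; use whichever model you have pulled locally):

```rust
// Sketch: local Ollama setup. "llama3" is a hypothetical model name;
// substitute a model you have pulled with `ollama pull`.
let ollama_config = AiConfig {
    llm: Framework::Ollama,
    model: "llama3".to_string(),
    max_token: None, // no cap on response length
};
```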
You can ask a one-off question using the following example:
```rust
use ask_ai::{config::{AiConfig, Framework, Question}, model::ask_question};

#[tokio::main]
async fn main() {
    let ai_config = AiConfig {
        llm: Framework::OpenAI,
        model: "chatgpt-4o-latest".to_string(),
        max_token: Some(1000),
    };

    let question = Question {
        system_prompt: None,                     // Optional system prompt
        messages: None,                          // No previous history
        new_prompt: "What is Rust?".to_string(),
    };

    match ask_question(&ai_config, question).await {
        Ok(answer) => println!("Answer: {}", answer),
        Err(e) => eprintln!("Error: {}", e),
    }
}
```
A system-level prompt modifies the assistant's behavior. For example, you might instruct the assistant to answer concisely or role-play as an expert.
```rust
let question = Question {
    system_prompt: Some("You are an expert Rust programmer. Answer concisely.".to_string()), // Custom system prompt
    messages: None,
    new_prompt: "How do closures work in Rust?".to_string(),
};
```
To maintain a conversation, you can include previous messages and their respective responses.
```rust
use ask_ai::config::AiPrompt;

let previous_messages = vec![
    AiPrompt {
        content: "What is Rust?".to_string(),
        output: "Rust is a systems programming language focused on safety, speed, and concurrency.".to_string(),
    },
    AiPrompt {
        content: "Why is Rust popular?".to_string(),
        output: "Rust is popular because of features like memory safety, modern tooling, and high performance.".to_string(),
    },
];

let question = Question {
    system_prompt: None,
    messages: Some(previous_messages), // Include chat history
    new_prompt: "What are Rust's main drawbacks?".to_string(),
};
```
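Putting the pieces together, here is a minimal sketch of a multi-turn loop that feeds each answer back into the history. It assumes the types shown above and that `AiPrompt` implements `Clone`; if it does not, rebuild the history for each turn instead:

```rust
use ask_ai::{config::{AiConfig, AiPrompt, Framework, Question}, model::ask_question};

#[tokio::main]
async fn main() {
    let ai_config = AiConfig {
        llm: Framework::OpenAI,
        model: "chatgpt-4o-latest".to_string(),
        max_token: Some(1000),
    };

    let mut history: Vec<AiPrompt> = Vec::new();
    let prompts = ["What is Rust?", "Why is it popular?"];

    for prompt in prompts {
        let question = Question {
            system_prompt: None,
            // Assumes AiPrompt: Clone; pass None on the first turn.
            messages: if history.is_empty() { None } else { Some(history.clone()) },
            new_prompt: prompt.to_string(),
        };

        match ask_question(&ai_config, question).await {
            Ok(answer) => {
                println!("Q: {prompt}\nA: {answer}\n");
                // Record the exchange so the next turn sees it as context.
                history.push(AiPrompt {
                    content: prompt.to_string(),
                    output: answer,
                });
            }
            Err(e) => {
                eprintln!("Error: {e}");
                break;
            }
        }
    }
}
```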
This crate requires API keys to interface with the Framework providers. Store these keys as environment variables to keep them secure. Below is a list of required variables:
| Provider | Environment Variable |
|---|---|
| OpenAI | OPENAI_API_KEY |
| Anthropic | ANTHROPIC_API_KEY |
| Ollama | No key required currently |
For security, avoid hardcoding API keys into your application code. Use a `.env` file or a secret storage mechanism.
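For local development, one common approach is to export the keys in your shell before running the application (the values below are placeholders):

```sh
# Placeholders only; substitute your real keys or load them from a .env file.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```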
All interactions with a Framework return a `Result<String>`. Errors are encapsulated in the `AppError` enum, which defines three variants: `ModelError`, `ApiError`, and `UnexpectedError`.
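For reference, here is a sketch of the shape these variants appear to have, inferred from the match arms in the example below; the crate's actual definition may differ:

```rust
// Inferred sketch only: variant and field names are taken from the
// example below, but the real definition in the crate may differ.
pub enum AppError {
    ModelError { model_name: String, failure_str: String },
    ApiError { model_name: String, failure_str: String },
    UnexpectedError(String),
}
```

You can match on these variants to handle each failure mode explicitly: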
```rust
match ask_question(&ai_config, question).await {
    Ok(answer) => println!("Answer: {}", answer),
    Err(e) => match e {
        AppError::ModelError { model_name, failure_str } => {
            eprintln!("Model Error: {} - {}", model_name, failure_str);
        },
        AppError::ApiError { model_name, failure_str } => {
            eprintln!("API Error: {:?} - {}", model_name, failure_str);
        },
        AppError::UnexpectedError(msg) => {
            eprintln!("Unexpected Error: {}", msg);
        },
    },
}
```
Contributions, bug reports, and feature requests are welcome! Feel free to open an issue or submit a pull request on GitHub.
To get started:

```sh
# Clone the repository and create a feature branch
git clone https://github.com/EduardoNeville/ask_ai
cd ask_ai
git checkout -b feature-name
```

This project is licensed under the MIT License.