| Crates.io | banana_prompts |
| lib.rs | banana_prompts |
| version | 67.0.90 |
| created_at | 2026-01-04 08:04:57.130488+00 |
| updated_at | 2026-01-04 08:04:57.130488+00 |
| description | High-quality integration for https://bananaproai.com/banana-prompts/ |
| homepage | https://bananaproai.com/banana-prompts/ |
| repository | https://github.com/qy-upup/banana-prompts |
| max_upload_size | |
| id | 2021539 |
| size | 10,809 |
A Rust crate designed to simplify and streamline prompt engineering for large language models. It provides utilities for building, managing, and executing prompts with ease.
Add the following to your Cargo.toml file:

```toml
[dependencies]
banana-prompts = "67.0.90"
```
Here are a few examples demonstrating how to use banana-prompts in different scenarios:
1. Simple Text Generation:

```rust
use banana_prompts::Prompt;

fn main() {
    let prompt_string = "Write a short poem about autumn.";
    let prompt = Prompt::new(prompt_string);

    // In a real application, you would send this prompt to a language model API.
    println!("Prompt: {}", prompt.text());
}
```
2. Template-Based Prompting with Variables:

```rust
use banana_prompts::Prompt;
use std::collections::HashMap;

fn main() {
    let template = "Write a {tone} email to {recipient} about {subject}.";
    let mut variables = HashMap::new();
    variables.insert("tone".to_string(), "formal".to_string());
    variables.insert("recipient".to_string(), "the hiring manager".to_string());
    variables.insert("subject".to_string(), "my application".to_string());

    let prompt = Prompt::from_template(template, variables);

    // In a real application, you would send this prompt to a language model API.
    println!("Prompt: {}", prompt.text());
}
```
3. Building a Prompt with Instructions and Context:

```rust
use banana_prompts::Prompt;

fn main() {
    let mut prompt = Prompt::new("You are a helpful assistant.");
    prompt.add_instruction("Answer the following question:");
    prompt.add_context("Question: What is the capital of France?");

    // In a real application, you would send this prompt to a language model API.
    println!("Prompt: {}", prompt.text());
}
```
4. Prompt with Safety Checks (Example - Requires Implementation):

```rust
// Note: This example assumes you have implemented custom safety checks.
// This is a placeholder and requires your own implementation of is_safe.
use banana_prompts::Prompt;

fn is_safe(text: &str) -> bool {
    // Implement your logic to check for harmful or inappropriate content.
    // This is a simplified example and should be replaced with a robust solution.
    !text.contains("harmful")
}

fn main() {
    let prompt_string = "Generate a creative story.";
    let prompt = Prompt::new(prompt_string);

    if is_safe(prompt.text()) {
        // Send the prompt to the language model API.
        println!("Prompt is safe. Sending to API.");
        println!("Prompt: {}", prompt.text());
    } else {
        println!("Prompt is not safe and will not be sent.");
    }
}
```
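Since `is_safe` above is only a placeholder, a slightly sturdier starting point is a configurable blocklist with case-insensitive matching. The sketch below uses only the standard library and does not assume any banana-prompts API; the blocklist terms are purely illustrative:

```rust
// A minimal, illustrative safety filter: case-insensitive substring matching
// against a caller-supplied blocklist. Real content moderation requires far
// more than keyword matching; treat this as a starting sketch only.
fn is_safe_with_blocklist(text: &str, blocklist: &[&str]) -> bool {
    let lowered = text.to_lowercase();
    // The text is safe only if no blocklisted term appears in it.
    blocklist.iter().all(|term| !lowered.contains(&term.to_lowercase()))
}

fn main() {
    let blocklist = ["harmful", "dangerous"]; // illustrative terms only
    assert!(is_safe_with_blocklist("Generate a creative story.", &blocklist));
    assert!(!is_safe_with_blocklist("Write something Harmful.", &blocklist));
    println!("blocklist check behaves as expected");
}
```

Because the check is case-insensitive, `"Harmful"` is caught even though the blocklist entry is lowercase; a production filter would likely also need word-boundary handling and an allowlist for false positives.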
5. Chaining Prompts (Example - Requires Implementation of Prompt Chaining Functionality):

```rust
// Note: This example assumes you have implemented prompt chaining functionality.
// This is a placeholder and requires your own implementation of chain_prompts.
use banana_prompts::Prompt;

// Placeholder for a function to chain prompts together.
// In a real implementation, this would combine the prompts in a meaningful way.
fn chain_prompts(prompt1: &Prompt, prompt2: &Prompt) -> Prompt {
    let combined_text = format!("{}\n{}", prompt1.text(), prompt2.text());
    Prompt::new(combined_text.as_str())
}

fn main() {
    let prompt1 = Prompt::new("Summarize the following article:");
    let prompt2 = Prompt::new("Article: [Insert Article Text Here]");

    let combined_prompt = chain_prompts(&prompt1, &prompt2);

    // In a real application, you would send this combined prompt to a language model API.
    println!("Combined Prompt: {}", combined_prompt.text());
}
```
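The pairwise `chain_prompts` placeholder generalizes naturally to any number of fragments. The hedged, std-only sketch below joins an ordered list of prompt strings, without assuming anything about the crate's `Prompt` type:

```rust
// Join an ordered list of prompt fragments into a single prompt string,
// separated by blank lines so each fragment stays visually distinct.
fn chain_prompt_texts(parts: &[&str]) -> String {
    parts.join("\n\n")
}

fn main() {
    let combined = chain_prompt_texts(&[
        "Summarize the following article:",
        "Article: [Insert Article Text Here]",
    ]);
    println!("{}", combined);
}
```

A blank-line separator is an arbitrary choice; depending on the target model, you might instead separate fragments with role markers or section headers.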
License: MIT
This crate is part of the banana-prompts ecosystem. For advanced features and enterprise-grade tools, visit: https://bananaproai.com/banana-prompts/