Crates.io | promptforge |
lib.rs | promptforge |
version | 0.1.11 |
source | src |
created_at | 2024-09-08 22:59:34.742978 |
updated_at | 2024-10-12 17:17:39.375982 |
description | A Rust crate for building and formatting prompts for AI agents. |
homepage | |
repository | https://github.com/kinghuynh/promptforge.git |
max_upload_size | |
id | 1368604 |
size | 205,155 |
PromptForge is a Rust library designed for building, formatting, and managing prompts for AI agents. With support for both F-string-like and Mustache-style templating, PromptForge allows developers to create dynamic and customizable prompts for use with Large Language Models (LLMs) and various AI-driven applications.
Template Flexibility: PromptForge provides two powerful templating engines: FmtString, which is inspired by Python's F-strings, and Mustache, a widely used logic-less templating system. These tools allow you to define templates that are flexible, expressive, and reusable across different AI tasks.
Dynamic Prompt Construction: You can define placeholders in your templates, and dynamically insert variables at runtime to generate context-specific prompts for your AI models. This makes PromptForge a great tool for use cases like chatbot conversations, task automation, and AI content generation.
Compatibility with LLMs: PromptForge is designed to help manage prompts for Large Language Models (LLMs) such as OpenAI’s GPT models, as well as other AI platforms. Whether you need simple text completion or complex interactive behavior, PromptForge lets you generate prompts that fit your use case.
Error Handling: The library provides robust error detection for malformed templates, including identifying mismatched or mixed formatting styles, invalid placeholders, and other common issues in prompt construction.
Extensibility: Developers can easily extend PromptForge to support custom templating engines or additional placeholder validation strategies (see the sketch below).
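As a rough, standalone illustration of what such an extension point could look like, here is a minimal sketch. The TemplateEngine trait, the ShoutyEngine type, and their methods are hypothetical names invented for this example; they are not part of PromptForge's actual API.

use std::collections::HashMap;

// Hypothetical extension point: any engine that can render a template
// against a set of named variables. Illustrative only; this trait does
// not exist in the promptforge crate.
trait TemplateEngine {
    fn render(&self, template: &str, vars: &HashMap<String, String>) -> Result<String, String>;
}

// A toy engine that fills `{name}`-style placeholders and upper-cases
// every substituted value, just to show where custom behavior would live.
struct ShoutyEngine;

impl TemplateEngine for ShoutyEngine {
    fn render(&self, template: &str, vars: &HashMap<String, String>) -> Result<String, String> {
        let mut out = template.to_string();
        for (key, value) in vars {
            let placeholder = format!("{{{}}}", key);
            if !out.contains(&placeholder) {
                return Err(format!("unused variable: {}", key));
            }
            out = out.replace(&placeholder, &value.to_uppercase());
        }
        Ok(out)
    }
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("name".to_string(), "Alice".to_string());

    let engine = ShoutyEngine;
    match engine.render("Hello, {name}!", &vars) {
        Ok(text) => println!("{}", text), // Prints: Hello, ALICE!
        Err(e) => eprintln!("render error: {}", e),
    }
}

Keeping rendering behind a small trait like this would let alternative engines be swapped in without changing the calling code.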
PromptForge draws inspiration from the excellent work done in the LangChain prompts library. LangChain’s approach to managing prompts and integrating with LLMs served as a valuable reference in the design and development of PromptForge, especially in terms of structuring reusable, dynamic prompts for AI applications.
PromptForge aims to simplify the process of working with AI-driven systems, especially generating and managing prompts.
PromptForge is available on crates.io, making installation as simple as adding the following to your Cargo.toml:
[dependencies]
promptforge = "0.1"
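F-string-style templates use single-brace placeholders: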
use promptforge::{PromptTemplate, TemplateError, prompt_vars};

fn main() -> Result<(), TemplateError> {
    let tmpl = PromptTemplate::new("Hello, {name}! Your order number is {order_id}.")?;
    let variables = prompt_vars!(name = "Alice", order_id = "12345");
    let result = tmpl.format(variables)?;
    println!("{}", result); // Outputs: Hello, Alice! Your order number is 12345.
    Ok(())
}
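Mustache-style templates use double-brace placeholders: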
use promptforge::{PromptTemplate, TemplateError, prompt_vars};

fn main() -> Result<(), TemplateError> {
    let tmpl = PromptTemplate::new("Hello, {{name}}! Your favorite color is {{color}}.")?;
    let variables = prompt_vars!(name = "Bob", color = "blue");
    let result = tmpl.format(variables)?;
    println!("{}", result); // Outputs: Hello, Bob! Your favorite color is blue.
    Ok(())
}
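When a required variable is missing, formatting returns an error: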
use promptforge::{PromptTemplate, TemplateError, prompt_vars};

fn main() -> Result<(), TemplateError> {
    let tmpl = PromptTemplate::new("Hi, {name}! Please confirm your email: {email}.")?;
    let variables = prompt_vars!(name = "Charlie");
    let result = tmpl.format(variables);
    assert!(result.is_err());
    println!("Error: {:?}", result.unwrap_err()); // Outputs: Error: MissingVariable("email")
    Ok(())
}
Contributions are welcome! If you're interested in contributing to PromptForge, please take a moment to review the following guidelines:
If you encounter any issues, feel free to open an issue on the repository with details about the bug or feature request.
PromptForge is licensed under the MIT License. See the LICENSE file for more details.
For questions or discussions, feel free to reach out or submit an issue on the GitHub repository.