llm-chain-openai

Crates.io: llm-chain-openai
lib.rs: llm-chain-openai
version: 0.13.0
source: src
created_at: 2023-03-25 14:34:16.194662
updated_at: 2023-11-15 20:53:05.935949
description: A library implementing `llm-chains` for OpenAI's models. Chains can be used to apply the model series to complete complex tasks, such as text summarization.
repository: https://github.com/sobelio/llm-chain/
id: 820310
size: 147,766
owner: William Högman Rudenmalm (williamhogman)

README

llm-chain 🚀

llm-chain is a collection of Rust crates for building advanced LLM applications such as chatbots, agents, and more. As a comprehensive LLM-Ops platform, it offers strong support for both cloud and locally hosted LLMs, along with robust prompt templates and the ability to chain prompts together in multi-step chains, enabling complex tasks that LLMs can't handle in a single step. Vector-store integrations make it easy to give your model long-term memory and subject-matter knowledge, empowering you to build sophisticated applications.


Examples 💡

To help you get started, here is an example demonstrating how to use llm-chain. You can find more examples in the examples folder in the repository.

use llm_chain::{executor, parameters, prompt};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create an executor for the OpenAI backend.
    let exec = executor!()?;
    // A chat prompt: a system message followed by a user message.
    let res = prompt!(
        "You are a robot assistant for making personalized greetings",
        "Make a personalized greeting for Joe"
    )
    .run(&parameters!(), &exec)
    .await?;
    println!("{}", res);
    Ok(())
}

➡️ tutorial: get started with llm-chain
➡️ quick-start: create a project based on our template

Features 🌟

  • Prompt templates: Create reusable and easily customizable prompt templates for consistent and structured interactions with LLMs.
  • Chains: Build powerful chains of prompts that allow you to execute more complex tasks, step by step, leveraging the full potential of LLMs.
  • ChatGPT support: Supports ChatGPT models, with plans to add OpenAI's other models in the future.
  • LLaMa support: Provides seamless integration with LLaMa models, enabling natural language understanding and generation tasks with Facebook's research models.
  • Alpaca support: Incorporates support for Stanford's Alpaca models, expanding the range of available language models for advanced AI applications.
  • llm.rs support: Use LLMs in Rust without dependencies on C++ code through our support for llm.rs.
  • Tools: Enhance your AI agents' capabilities by giving them access to various tools, such as running Bash commands, executing Python scripts, or performing web searches, enabling more complex and powerful interactions.
  • Extensibility: Designed with extensibility in mind, making it easy to integrate additional LLMs as the ecosystem grows.
  • Community-driven: We welcome and encourage contributions from the community to help improve and expand the capabilities of llm-chain.
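The chaining idea above can be illustrated in plain Rust: each step renders a prompt template by substituting the previous step's output, then hands the rendered prompt to the model. This is a toy sketch of the concept only, not llm-chain's actual API; the `run_chain` function and the `{{input}}` placeholder name are hypothetical, and the stub closure stands in for a real LLM call.

```rust
// Toy sketch of a sequential chain (hypothetical, not llm-chain's API):
// each step's output becomes the `{{input}}` parameter of the next
// step's prompt template.
fn run_chain(steps: &[&str], seed: &str, model: impl Fn(&str) -> String) -> String {
    let mut current = seed.to_string();
    for template in steps {
        // Render the template with the previous step's output.
        let prompt = template.replace("{{input}}", &current);
        // Call the model on the rendered prompt.
        current = model(&prompt);
    }
    current
}

fn main() {
    // Stub "model" that just echoes its prompt, so the sketch runs
    // without an API key.
    let echo = |p: &str| format!("[llm output for: {}]", p);
    let out = run_chain(
        &["Summarize: {{input}}", "Translate to French: {{input}}"],
        "llm-chain is a Rust crate for building LLM applications.",
        echo,
    );
    println!("{}", out);
}
```

With a real executor in place of the stub, the same shape lets one model call refine or transform the output of the previous one, which is what makes multi-step tasks tractable.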

Getting Started 🚀

To start using llm-chain, add it as a dependency in your Cargo.toml (you need Rust 1.65.0 or newer):

[dependencies]
llm-chain = "0.13.0"
llm-chain-openai = "0.13.0"

The examples for llm-chain-openai require you to set the OPENAI_API_KEY environment variable, which you can do like this:

export OPENAI_API_KEY="sk-YOUR_OPEN_AI_KEY_HERE"
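If the variable isn't exported, the examples will fail at runtime, so it can help to check for it up front. A minimal standard-library sketch (the `key_status` helper is hypothetical, not part of llm-chain):

```rust
use std::env;

// Hypothetical helper: report the key's length if it is set and
// non-empty, or an error message otherwise.
fn key_status(val: Option<String>) -> Result<usize, &'static str> {
    match val {
        Some(k) if !k.is_empty() => Ok(k.len()),
        _ => Err("OPENAI_API_KEY is not set"),
    }
}

fn main() {
    match key_status(env::var("OPENAI_API_KEY").ok()) {
        Ok(len) => println!("OPENAI_API_KEY is set ({} chars)", len),
        Err(msg) => {
            eprintln!("{}; export it before running the examples", msg);
            std::process::exit(1);
        }
    }
}
```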

Then, refer to the documentation and examples to learn how to create prompt templates, chains, and more.

Contributing 🤝

We warmly welcome contributions from everyone! If you're interested in helping improve llm-chain, please check out our CONTRIBUTING.md file for guidelines and best practices.

License 📄

llm-chain is licensed under the MIT License.

Connect with Us 🌐

If you have any questions, suggestions, or feedback, feel free to open an issue or join our community Discord. We're always excited to hear from our users and learn about your experiences with llm-chain.

We hope you enjoy using llm-chain to unlock the full potential of Large Language Models in your projects. Happy coding! 🎉
