llm_interface

Crate: llm_interface (crates.io)
Version: 0.0.2
Description: The Backend for the llm_client Crate
Repository: https://github.com/shelbyJenkins/llm_client
Author: Shelby Jenkins (ShelbyJenkins)
README

llm_interface: The Backend for the llm_client Crate

This crate contains the build.rs, data types, and behaviors for working with LLMs. Supported backends:

  • Llama.cpp (through llama-server)
  • Various LLM APIs including support for generic OpenAI format LLMs

You can use this crate to run local LLMs and make requests to LLMs. It's set up to be easy to integrate into other projects.

See the various builders implemented in lib.rs for examples of using this crate.
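As a rough illustration of the builder style the README points to, here is a minimal, self-contained sketch of configuring a llama-server-style backend. All names here (`LlamaCppBuilder`, `model_path`, `ctx_size`) are hypothetical assumptions for demonstration, not the crate's actual API; consult lib.rs for the real builders.

```rust
// Illustrative sketch only: these types and method names are assumed
// for demonstration and are NOT the real llm_interface API.

#[derive(Debug)]
struct LlamaCppConfig {
    model_path: String,
    ctx_size: u32,
}

struct LlamaCppBuilder {
    model_path: Option<String>,
    ctx_size: u32,
}

impl LlamaCppBuilder {
    fn new() -> Self {
        // A default context size; real defaults would come from the crate.
        Self { model_path: None, ctx_size: 2048 }
    }

    // Builder methods take `self` by value and return it, allowing chaining.
    fn model_path(mut self, path: &str) -> Self {
        self.model_path = Some(path.to_string());
        self
    }

    fn ctx_size(mut self, n: u32) -> Self {
        self.ctx_size = n;
        self
    }

    // Validation happens at build time: a model path is required.
    fn build(self) -> Result<LlamaCppConfig, String> {
        Ok(LlamaCppConfig {
            model_path: self.model_path.ok_or("model_path is required")?,
            ctx_size: self.ctx_size,
        })
    }
}

fn main() {
    let cfg = LlamaCppBuilder::new()
        .model_path("models/example.gguf")
        .ctx_size(4096)
        .build()
        .expect("valid config");
    println!("loading {} with ctx {}", cfg.model_path, cfg.ctx_size);
}
```

The chained-setter-then-`build()` shape shown here is the common Rust builder idiom; the actual crate decides which fields are required and what the defaults are.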

For a higher-level API built on top of this crate, check out the llm_client crate and its lib.rs file.
