ofc
The Ollama Function Caller, otherwise known as `ofc`, is a command-line tool for prompting Ollama models locally on your system.
There are other programs out there that do similar things, but they either don't support streaming or don't give me access to important settings, like context length or temperature.
To use `ofc`, you need to have Ollama installed on your system. You can install it from the Ollama website.
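Ollama serves models locally, so whichever model you plan to prompt has to be downloaded to your machine before `ofc` can use it. A minimal setup sketch, assuming a standard Ollama install and the default phi4 model:

```sh
# Check that the Ollama CLI is on your PATH.
ollama --version

# Download the default model so ofc can prompt it.
# Substitute any other model tag you prefer.
ollama pull phi4
```

`ollama pull` is part of the standard Ollama CLI; if you pull a different model instead, pass its name to `ofc` with `--model`.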
`ofc` is installable from either crates.io or this repository:

```sh
cargo install ofc --locked

# Or, from the repository:
cargo install --git https://github.com/elijah-potter/ofc --locked
```
It's pretty simple: just call `ofc` with the prompt of your choice.

```sh
ofc "What is the meaning of life?"
```
You can change the model from the default (phi4) and control the context size and temperature:

```sh
ofc --context 8192 --temperature 0.3 --model tinyllama "What is the best pizza?"
```
`ofc` was inspired by ooo.