| Field | Value |
|---|---|
| Crates.io | ollama-inquire |
| lib.rs | ollama-inquire |
| version | 1.0.6 |
| source | src |
| created_at | 2024-03-24 12:25:40.574984 |
| updated_at | 2024-03-24 13:25:52.790186 |
| description | Query any LLM found on Ollama from the terminal! |
| homepage | https://crates.io/crates/ollama-inquire |
| repository | https://github.com/obaraelijah/ollama-inquire |
| max_upload_size | |
| id | 1184342 |
| size | 14,648 |
Inquire-Ollama is a command-line tool that lets you interact with Ollama LLM models directly from the terminal. It provides a simple, intuitive way to ask questions and receive responses from Ollama models.
To install Inquire-Ollama, you need Rust and Cargo installed on your system. If you haven't already installed Rust, you can do so by following the instructions here.
Once Rust is installed, you can install Inquire-Ollama using Cargo:
cargo install ollama-inquire
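Because the tool queries models served by Ollama, a local Ollama installation with at least one model pulled is also needed before the first query. A minimal setup sketch, assuming the standard Ollama CLI is installed:

```shell
# Start the Ollama server if it isn't already running as a background service
ollama serve &

# Pull the default model that inquire uses
ollama pull mistral
```

If Ollama was installed via its official installer, the server usually runs as a service already, so only the `ollama pull` step is needed.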
After installation, you can start using Inquire-Ollama by running:
inquire [OPTIONS] [PROMPT]
--model=[MODEL] : Specify the model to use (default is 'mistral').
--version : Display the version of the installed Inquire-Ollama tool.
[PROMPT] : The question or prompt to send to Ollama. Quotation marks are not required.
Asking a question using the default model:
inquire "What is the capital of Kenya?"
or
inquire What is the capital of France?
Specifying a different model:
inquire --model=gale "Explain the theory of relativity"
Find all available models from Ollama here.
Checking the version:
inquire --version
Seeing the help info:
inquire --help
Contributions to Inquire-Ollama are welcome! If you have suggestions for improvements or encounter any issues, please feel free to open an issue or submit a pull request on our GitHub repository.
Inquire-Ollama is licensed under the MIT License.