Crates.io | hiramu-cli |
lib.rs | hiramu-cli |
version | 0.1.23 |
source | src |
created_at | 2024-04-14 03:58:37.935998 |
updated_at | 2024-04-23 06:54:11.350772 |
description | A command-line interface tool for interacting with large language models (LLMs) on AWS Bedrock and generating text based on prompts. |
id | 1207994 |
size | 212,467 |
Hiramu CLI is a powerful command-line interface for interacting with language models. It provides a seamless way to ask questions and generate text using various models from different providers, including Anthropic's Claude (Haiku, Sonnet, Opus), Mistral (7B, 8x7B, Large), and Ollama.
Key features include interactive input through the {input} placeholder in prompts, piping data in from commands such as git diff, and combining output with other command-line tools (e.g., DuckDB, PostgreSQL).

To install Hiramu CLI, ensure you have Rust installed on your system. If you don't have Rust installed, you can follow the official installation guide: https://www.rust-lang.org/tools/install
Once Rust is installed, run the following command to install Hiramu CLI:
cargo install hiramu-cli
To ask a question to a language model, use the generate command followed by the question. You can specify additional options to customize the behavior of the CLI.
hiramu-cli generate "What is the capital of France?" -r us-west-2 -p bedrock -m 100 -t 0.7 -M haiku -P bedrock
-r, --region <REGION>: The region to use (default: "us-west-2").
-p, --profile <PROFILE>: The profile to use (default: "bedrock").
-m, --maxtoken <MAXTOKEN>: The maximum number of tokens to generate (default: 100).
-t, --temperature <TEMPERATURE>: The temperature for generation (default: 0.7).
-M, --model <MODEL>: The model alias to use (default: "haiku").
-P, --provider <PROVIDER>: The provider alias to use for generation (default: "bedrock").
-E, --endpoint <ENDPOINT>: The provider endpoint to use for generation (default: "http://localhost:11434").

Hiramu CLI supports interactive input using the {input} placeholder in prompts. When the placeholder is present, the CLI will prompt you to enter the input, which will be inserted into the prompt before sending it to the language model.
hiramu-cli generate "Translate the following text from English to French: {input}" -M sonnet
This feature allows you to provide dynamic input to the language model during runtime.
Hiramu CLI provides convenient aliases for different language models. The following model aliases are available:

haiku: Anthropic Claude 3, Haiku 1x
sonnet: Anthropic Claude 3, Sonnet 1x
opus: Anthropic Claude 3, Opus 1x
mistral7b: Mistral 7B Instruct 0x
mistral8x7b: Mistral 8x7B Instruct 0x
mistral-large: Mistral Large

You can use these aliases with the -M or --model option to specify the desired model for generation.
Hiramu CLI supports different providers for language model generation. The following provider aliases are available:

bedrock: AWS Bedrock (default)
ollama: Ollama provider

You can use these aliases with the -P or --provider option to specify the desired provider for generation. When using the ollama provider, you also need to specify the endpoint using the -E or --endpoint option.
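For instance, a local Ollama server could be queried along these lines (a sketch: the mistral model name is an assumption, substitute whichever model you have pulled locally):

```shell
# Ask a question through a local Ollama server.
# -P selects the provider, -E points at the Ollama endpoint,
# -M names the model (here "mistral", assumed to be pulled already).
hiramu-cli generate "Why is the sky blue?" -P ollama -E http://localhost:11434 -M mistral
```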
Here are a few examples demonstrating the usage of Hiramu CLI:
Ask a question using the default options:
hiramu-cli generate "What is the capital of France?"
Specify model and temperature:
hiramu-cli generate "What is the meaning of life?" -M sonnet -t 0.5
Translate interactively:
hiramu-cli generate "Translate from English to Spanish: {input}" -M mistral8x7b
Generate release notes by combining with git diff:
git diff HEAD~1..HEAD | hiramu-cli generate "Summarize the changes:" -M opus
This pipes the output of git diff into Hiramu CLI to generate a summary of the code changes.
Generate SQL queries from natural language using DuckDB:
hiramu-cli generate "SQL query to find users who signed up in the last 30 days: {input}" -M mistral7b | duckdb
The generated SQL query is piped directly into DuckDB for execution.
Optimize SQL queries using PostgreSQL:
query="SELECT * FROM orders JOIN customers ON orders.customer_id = customers.id"
optimized_query=$(echo "$query" | hiramu-cli generate "Optimize this SQL query:" -M mistral-large)
psql -d mydb -c "$optimized_query"
The existing SQL query is passed to Hiramu CLI to generate an optimized version, which is then executed using PostgreSQL.
Generate creative text with interactive input and a larger token budget:
hiramu-cli generate "Once upon a time, in a far-off land, there lived a brave knight named {input}. The knight embarked on a quest to..." -m 200 -M mistral-large
Feel free to explore different prompts, models, providers, and options to generate various types of content using Hiramu CLI.
Contributions to Hiramu CLI are welcome! If you encounter any issues or have suggestions for improvements, please open an issue on the GitHub repository.
Before submitting a pull request, ensure that the tests pass and the code is formatted with cargo fmt. You can run the tests using the following command:
cargo test
Note: The former prompt command has been replaced with the generate command.
Hiramu CLI is open-source software licensed under the Apache 2.0 License. See the LICENSE file for more details.