| Crates.io | azure-openai-cli |
| lib.rs | azure-openai-cli |
| version | 0.1.0 |
| created_at | 2025-06-01 13:50:43.721421+00 |
| updated_at | 2025-06-01 13:50:43.721421+00 |
| description | A modular, extensible CLI tool for interacting with Azure OpenAI and other LLM providers. Supports streaming output, provider plugins, and easy configuration. |
| homepage | https://github.com/dominic-codespoti/azure-openai-cli |
| repository | https://github.com/dominic-codespoti/azure-openai-cli |
| max_upload_size | |
| id | 1697204 |
| size | 58,081 |
A modular, extensible command-line interface (CLI) tool written in Rust for interacting with Azure OpenAI and other LLM providers. Supports streaming output, provider plugins, and easy configuration.
azure-openai-cli
├── src
│ ├── main.rs # Entry point of the application
│ ├── config.rs # Configuration management
│ ├── provider.rs # Provider trait and registry
│ └── provider/
│ └── azure.rs # Azure OpenAI provider implementation
├── Cargo.toml # Rust project configuration
└── README.md # Project documentation
Install from crates.io:
cargo install azure-openai-cli
ln -s ~/.cargo/bin/azure-openai-cli ~/.cargo/bin/chat  # optional: shorter `chat` alias used below
Or build and install from a local checkout:
cargo install --path .
ln -s ~/.cargo/bin/azure-openai-cli ~/.cargo/bin/chat
Make sure ~/.cargo/bin is in your PATH (add export PATH="$HOME/.cargo/bin:$PATH" to your ~/.zshrc or ~/.bashrc if needed).
You can configure the CLI using the config subcommand or by editing the config file directly.
Show current config:
azure-openai-cli config show
# or
chat config show
Set a config value:
azure-openai-cli config set <key> <value>
# or
chat config set azure_api_key sk-... # example
Supported keys: provider, azure_endpoint, azure_api_key, azure_deployment
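For example, a complete Azure setup might look like the following (the endpoint, key, and deployment values are placeholders, not real credentials):
chat config set provider azure
chat config set azure_endpoint https://my-resource.openai.azure.com
chat config set azure_api_key sk-...
chat config set azure_deployment my-gpt-deployment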
Send a prompt to the LLM (default provider: Azure):
chat "Hello, how are you?"
With options:
chat --max-tokens 256 --temperature 0.7 "Tell me a joke about Rust."
$ chat "What is the capital of France?"
Paris.
You can also set credentials via environment variables:
export AZURE_OPENAI_API_KEY=sk-...
export AZURE_OPENAI_ENDPOINT=https://.../openai/deployments/...
To add a new provider, implement the LLMProvider trait in a new file under src/provider/, and register it in get_provider in provider.rs.
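As a rough illustration, the sketch below assumes a minimal trait shape; the actual LLMProvider trait in src/provider.rs may declare different method names and signatures, and EchoProvider and the return type of get_provider here are hypothetical.
// Hypothetical sketch — the real LLMProvider trait lives in src/provider.rs.
use std::error::Error;

pub trait LLMProvider {
    // Assumed method: send a prompt, return the model's reply.
    fn complete(&self, prompt: &str) -> Result<String, Box<dyn Error>>;
}

// A new provider goes in its own file under src/provider/.
pub struct EchoProvider;

impl LLMProvider for EchoProvider {
    fn complete(&self, prompt: &str) -> Result<String, Box<dyn Error>> {
        // A real provider would call its backend's HTTP API here.
        Ok(format!("echo: {prompt}"))
    }
}

// Then add a match arm in get_provider in provider.rs (shape assumed):
pub fn get_provider(name: &str) -> Option<Box<dyn LLMProvider>> {
    match name {
        "echo" => Some(Box::new(EchoProvider)),
        _ => None, // "azure" and any other providers are matched here
    }
}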
To publish to crates.io, update Cargo.toml with your repository, homepage, and author info, then run:
cargo login
cargo publish
This project is dual-licensed under the MIT or Apache-2.0 license. See the LICENSE file for more details.