| Crates.io | sonr |
| lib.rs | sonr |
| version | 0.1.7 |
| created_at | 2025-12-19 22:17:48.986164+00 |
| updated_at | 2025-12-21 23:34:33.869059+00 |
| description | High-performance semantic search tool for local codebases |
| homepage | https://github.com/skyne98/sonr |
| repository | https://github.com/skyne98/sonr |
| max_upload_size | |
| id | 1995654 |
| size | 54,727 |
sonr is a high-performance semantic search tool for local codebases. It consists of a background daemon that manages LLM inference and a CLI for performing fast, context-aware searches.
The daemon uses llama.cpp (via llama-server) for fast embedding and reranking: it manages two llama-server instances (one for embeddings, one for reranking) and provides a REST API. llama-server must be installed and available in your PATH.

Install both binaries with:

cargo install sonr sonr-daemon

Then start the daemon:
sonr-daemon
The daemon will download the default models (Qwen3-0.6B based) and start listening on an available port. It writes this port to a discovery file so the CLI can find it automatically.
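The discovery handshake described above can be sketched in shell. The file path and port number below are illustrative examples, not the daemon's actual defaults:

```shell
# Sketch of the port-discovery flow (the path /tmp/sonr.port and the
# port 43817 are example values, not documented defaults).
PORT_FILE=/tmp/sonr.port

# The daemon writes its OS-assigned port to the discovery file...
echo "43817" > "$PORT_FILE"

# ...and the CLI later reads it back to build the daemon's base URL.
PORT=$(cat "$PORT_FILE")
echo "http://127.0.0.1:${PORT}"
```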
sonr "how does the chunking logic work?" ./src
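The flags documented below can refine a search for scripting; for example, requesting more results as JSON (the query and path here are placeholders, and a running sonr-daemon is assumed):

```shell
# Return ten results as JSON for further processing.
# (Query and path are placeholders; requires a running sonr-daemon.)
sonr "where is the embedding cache written?" ./src --limit 10 --json
```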
sonr <query> [paths...]: Search for <query> in the specified paths.

--limit <N>: Number of results to return (default: 5).
--json: Output results in JSON format.
--url <URL>: Connect to a custom daemon URL (overrides automatic discovery).
--port-file <PATH>: Path to the daemon's port discovery file.

The daemon exposes an MCP (Model Context Protocol) server at /mcp. You can use the CLI as a stdio-to-HTTP bridge:
sonr mcp stdio
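The command above can be registered with an MCP client through a config entry along these lines; the key names follow the common mcpServers convention and may differ per client:

```json
{
  "mcpServers": {
    "sonr": {
      "command": "sonr",
      "args": ["mcp", "stdio"]
    }
  }
}
```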
This allows MCP clients to use the semantic_search tool with parameters:
query: Natural language search query
root_directory: Path to search in
limit: Maximum number of results (optional)

The daemon supports several flags:
--port: API port (default: 0 for OS-assigned).
--port-file: Path to write the assigned port for CLI discovery.
--embedding-hf-repo: HF repository for the embedding model.
--reranker-hf-repo: HF repository for the reranker model.
--gpu-layers: Number of layers to offload to GPU (default: 99).
--cache-file: Path to persist the embedding cache.

Run sonr-daemon --help for full details.
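Putting a few of these flags together, a daemon launch might look like the following; the port number and file paths are illustrative choices, not defaults:

```shell
# Illustrative launch: fixed port, CPU-only inference, persistent cache.
# (The port and paths are example values, not defaults.)
sonr-daemon \
  --port 8080 \
  --port-file /tmp/sonr.port \
  --gpu-layers 0 \
  --cache-file ~/.cache/sonr/embeddings.bin
```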