| Crates.io | dociium |
| lib.rs | dociium |
| version | 0.1.0 |
| created_at | 2025-08-02 23:26:23.69793+00 |
| updated_at | 2025-08-02 23:26:23.69793+00 |
| description | Multi-Language Documentation & Code MCP Server - Fast documentation access for Rust, Python, and Node.js |
| homepage | |
| repository | https://github.com/labiium/dociium |
| max_upload_size | |
| id | 1779212 |
| size | 317,499 |
Fast documentation access for Rust, Python, and Node.js packages via MCP (Model Context Protocol)
Get instant access to documentation and source code from your AI assistant. Works with Claude Desktop, Continue, and other MCP-compatible tools.
```sh
git clone https://github.com/labiium/dociium.git
cd dociium
cargo install --path .
```
Add to your MCP client configuration:
```json
{
  "servers": {
    "dociium": {
      "command": "dociium"
    }
  }
}
```
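For Claude Desktop specifically, the equivalent entry goes under the `mcpServers` key in `claude_desktop_config.json`. A minimal sketch, assuming the `dociium` binary is on your PATH (otherwise use the absolute path to it):

```json
{
  "mcpServers": {
    "dociium": {
      "command": "dociium"
    }
  }
}
```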
Ask your AI assistant:
| Tool | Use Case | Example |
|---|---|---|
| `search_crates` | Find Rust packages | Search for "async http client" |
| `get_item_doc` | Get documentation | Documentation for `tokio::sync::Mutex` |
| `crate_info` | Package details | Info about the `serde` crate |
| `list_trait_impls` | See trait implementations | What implements `Iterator`? |
| `source_snippet` | View source code | Source for `Vec::push` |
| `get_implementation` | Local Python/Node packages | Get `requests.get` from your venv |
Ask: "What is tokio::sync::Mutex and how do I use it?"
Your AI gets the full documentation, examples, and usage patterns.
Ask: "Show me the implementation of requests.get"
Dociium finds the function in your local environment and provides the source.
Ask: "Find me async HTTP client libraries for Rust"
Get a curated list with descriptions and popularity metrics.
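Under the hood, each of these requests becomes an MCP `tools/call` invocation that your client sends to the server over JSON-RPC. A sketch of such a request, where the `crate` and `path` argument names are illustrative guesses rather than dociium's confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_item_doc",
    "arguments": {
      "crate": "tokio",
      "path": "tokio::sync::Mutex"
    }
  }
}
```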
Set a custom cache directory:

```sh
export RDOCS_CACHE_DIR=/path/to/cache
```

Default cache: `~/.cache/rdocs-mcp` (Linux/macOS) or `%APPDATA%\rdocs-mcp` (Windows)
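If your MCP client launches the server itself, the variable can also be set in the server entry. A sketch, assuming your client's config format supports an `env` map (Claude Desktop and similar clients do):

```json
{
  "servers": {
    "dociium": {
      "command": "dociium",
      "env": {
        "RDOCS_CACHE_DIR": "/path/to/cache"
      }
    }
  }
}
```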
**Documentation not found?** Use the full item path, e.g. `tokio::sync::Mutex`.

**Cache issues?**

```sh
rm -rf ~/.cache/rdocs-mcp  # Clear cache and retry
```
**Need help?** Open an issue at https://github.com/labiium/dociium/issues.
```sh
# Build from source
cargo build --release

# Run tests
cargo test

# Development mode with debug logging
RUST_LOG=debug cargo run
```
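Because the server speaks JSON-RPC over stdio, you can smoke-test a build without an MCP client by piping an `initialize` request to it. A sketch; the protocol version string is an assumption and may need to match what dociium actually supports:

```sh
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | cargo run --release
```

A healthy server responds with an `InitializeResult` listing its capabilities before exiting when stdin closes.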
License: MIT OR Apache-2.0
Get documentation without leaving your AI conversation.