| Crates.io | cargo-docset |
| lib.rs | cargo-docset |
| version | 0.3.1 |
| source | src |
| created_at | 2019-08-14 01:12:37.924859 |
| updated_at | 2022-09-26 04:01:08.00911 |
| description | Generates a Zeal/Dash docset for your Rust package. |
| homepage | |
| repository | https://github.com/Robzz/cargo-docset |
| max_upload_size | |
| id | 156639 |
| size | 66,577 |
cargo-docset - Generate a Zeal/Dash docset for your Rust crate or workspace

cargo-docset is a tool allowing you to generate a Dash/Zeal compatible docset for your Rust packages and their dependencies.
cargo-docset depends on the SQLite3 library. You can either install the SQLite3 library on your system (see rusqlite's documentation for help), or build the version that is bundled in the libsqlite3-sys crate by turning on the bundled-sqlite feature flag when building cargo-docset.
You can install cargo-docset with the usual cargo command: cargo install cargo-docset.
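As a concrete sketch of both options (the bundled-sqlite feature name comes from the paragraph above; enabling it through cargo install --features is shown here as one way to do it, not an official recommendation):

```sh
# Standard installation, linking against the system SQLite3 library
cargo install cargo-docset

# Alternatively, build the SQLite3 copy bundled with libsqlite3-sys
cargo install cargo-docset --features bundled-sqlite
```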
Just run cargo docset in your crate's directory to generate the docset. It will be placed in the target/docset directory. cargo-docset generally supports the same options as cargo doc, with a few additional ones. For more information, run cargo docset --help or look below in this README.

To install your shiny new docset, copy it to your Zeal/Dash docset directory (available in the preferences, on Zeal at least) and restart Zeal/Dash.
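A typical end-to-end run might look like the sketch below. The destination path is an assumption: ~/.local/share/Zeal/Zeal/docsets is a common default on Linux, but your Zeal/Dash preferences are the authoritative source, and the *.docset glob assumes the generated docset follows the usual name.docset directory layout.

```sh
# Generate the docset from the crate's root directory
cargo docset

# Copy it into Zeal's docset directory (path varies by platform and settings)
cp -r target/docset/*.docset ~/.local/share/Zeal/Zeal/docsets/
```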
Some more advanced examples:

- Document only a specific set of packages (here, two dependencies of your crate), without documenting their own dependencies:
  cargo docset --no-deps --package dependency1 --package dependency2 ...
- Generate a docset for the nightly standard library from a checkout of the official Rust repository (cloned with submodules, e.g. git clone --recurse-submodules ...):
  cargo +nightly docset --package std --package core --no-deps --docset-name "Rust nightly $(git rev-parse --short HEAD)" --docset-index std --platform-family rust-nightly
Output of cargo docset --help (a combined example using several of these options follows the listing):
cargo-docset-docset
Generate a docset. This is currently the only available command, and should remain the default one
in the future if new ones are added
USAGE:
cargo-docset docset [OPTIONS]
OPTIONS:
--all-features
Activate all available features
--bin <BIN>
Document only the specified binary
--bins
Document all binaries
--docset-index <PACKAGE>
Specify or override the package whose index will be used as the docset index page
--docset-name <DOCSET_NAME>
Specify or override the name of the docset, this is the display name used by your docset
browser
--document-private-items
Generate documentation for private items
--exclude <SPEC>
Exclude packages from being processed
-F, --features <FEATURES>
Space-separated list of features to activate
-h, --help
Print help information
--lib
Document only this package's library
--manifest-path <PATH>
Path to Cargo.toml
--no-clean
Do not clean the doc directory before generating the rustdoc
--no-default-features
Do not activate the `default` feature
--no-deps
Do not document dependencies
-p, --package <SPEC>
Package to process (see `cargo help pkgid`)
--platform-family <PLATFORM_FAMILY>
Specify or override the docset platform family; this is used as the keyword you can
specify in your docset browser search bar to search this specific docset
--target <TARGET>
Build documentation for the specified target triple
--target-dir <TARGET_DIR>
Override the workspace target directory
--workspace
Process all packages in the workspace
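As a sketch of how several of these options combine in practice (the package and feature names below are placeholders, not real crates or features):

```sh
# Document every workspace member except one, including private items,
# with two specific features enabled
cargo docset --workspace --exclude internal-helpers --document-private-items --features "feat-a feat-b"
```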
Currently, cargo docset runs cargo to generate the documentation, and then recursively walks the generated directory. The contents of every file are inferred from the file path, and cargo-docset then fills a SQLite database with the gathered information. The details of docset generation are available here.

cargo-docset does not (yet, at least) try to parse the generated documentation in any way, and is therefore limited in the granularity of the index it can provide. In particular, the generated docset does not make use of the table of contents feature.
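To see what ends up in that database, you can inspect a generated docset's search index with the sqlite3 command-line tool. The file location and table layout shown below follow the standard Dash docset conventions (Contents/Resources/docSet.dsidx containing a searchIndex table); they are assumptions about cargo-docset's output rather than something stated in its documentation:

```sh
# Peek at the first few entries of a generated docset's search index
# (path and schema assume the standard Dash docset layout)
sqlite3 "target/docset/<your-crate>.docset/Contents/Resources/docSet.dsidx" \
  "SELECT name, type, path FROM searchIndex LIMIT 10;"
```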
Also, because cargo-docset walks through the whole doc directory, it must clear it before attempting to generate the docset, in case it contains previously generated documentation that should not be picked up in the docset. You should probably not be storing anything of value in that directory anyway, but keep it in mind.
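If you want to keep your regular cargo doc output untouched, one option is to point cargo-docset at a separate target directory via the --target-dir option listed above. This sketch assumes docset generation then happens entirely under that directory, leaving the default target/doc alone; the directory name is arbitrary:

```sh
# Generate the docset under a dedicated target directory so the
# default target/doc contents are not cleared
cargo docset --target-dir target/docset-build
```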
See here.