WaybackRust
===

WaybackRust is a tool written in Rust to query the [Wayback Machine](https://archive.org/web/). It provides the following features:

* Get all URLs for a given domain along with their current HTTP status codes (`urls` command).
* Get all links from the robots.txt file of every snapshot in the Wayback Machine (`robots` command).
* Get the source of all archives of a specific page (`unify` command).

## Install

##### Download the statically linked binary from GitHub releases:

* Download the static binary: `$ wget https://github.com/Neolex-Security/WaybackRust/releases/download/v0.2.11/waybackrust`
* Make it executable: `$ chmod +x waybackrust`
* Move it into your PATH (as root): `# mv waybackrust /usr/local/bin`
* Run waybackrust: `$ waybackrust`

##### From cargo (crates.io):

`cargo install waybackrust`

##### From GitHub:

* Clone this repository: `git clone https://github.com/Neolex-Security/WaybackRust`
* Build it: `cargo build --release`
* The executable is in `./target/release/waybackrust`.

## Usage

```
Neolex Wayback machine tool for bug bounty

USAGE:
    waybackrust [SUBCOMMAND]

FLAGS:
    -h, --help       Prints help information
    -V, --version    Prints version information

SUBCOMMANDS:
    help      Prints this message or the help of the given subcommand(s)
    robots    Get all disallowed entries from robots.txt
    unify     Get the content of all archives for a given url
    urls      Get all urls for a domain
```

Example invocations for each subcommand are given in the Examples section at the end of this README.

###### Urls command:

```
waybackrust-urls
Get all urls for a domain

USAGE:
    waybackrust urls [FLAGS] [OPTIONS]

FLAGS:
    -h, --help       Prints help information
    -n, --nocheck    Don't check the HTTP status
    -p, --nocolor    Don't colorize HTTP status
        --silent     Disable informations prints
    -s, --subs       Get subdomains too
    -V, --version    Prints version information

OPTIONS:
    -b, --blacklist    The extensions you want to blacklist (ie: -b png,jpg,txt)
    -d, --delay        Make a delay between each request
    -o, --output       Name of the file to write the list of urls (default: print on stdout)
    -t, --threads      Number of concurrent requests (default: 24)
    -w, --whitelist    The extensions you want to whitelist (ie: -w png,jpg,txt)

ARGS:
    domain name or file with domains
```

###### Robots command:

```
waybackrust-robots
Get all disallowed entries from robots.txt

USAGE:
    waybackrust robots [FLAGS] [OPTIONS]

FLAGS:
    -h, --help       Prints help information
        --silent     Disable informations prints
    -V, --version    Prints version information

OPTIONS:
    -o, --output     Name of the file to write the list of uniq paths (default: print on stdout)
    -t, --threads    The number of threads you want. (default: 10)

ARGS:
    domain name or file with domains
```

###### Unify command:

```
waybackrust-unify
Get the content of all archives for a given url

USAGE:
    waybackrust unify [FLAGS] [OPTIONS]

FLAGS:
    -h, --help       Prints help information
        --silent     Disable informations prints
    -V, --version    Prints version information

OPTIONS:
    -o, --output     Name of the file to write contents of archives (default: print on stdout)
    -t, --threads    The number of threads you want. (default: 10)

ARGS:
    url or file with urls
```

## Ideas of new features

If you have ideas for improvements or new features, please open an issue or contact me.
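## Examples

The invocations below are illustrative sketches built only from the flags documented above; `example.com` and the output file names are placeholders, not values prescribed by the tool.

Fetch all archived URLs for a domain and its subdomains, skip common image extensions, and write the list to a file:

```
# example.com and urls.txt are placeholder values
$ waybackrust urls -s -b png,jpg,gif -o urls.txt example.com
```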
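Collect the disallowed robots.txt entries for a domain with 20 threads and write the unique paths to a file:

```
# the thread count and file name are arbitrary example choices
$ waybackrust robots -t 20 -o disallowed.txt example.com
```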
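Dump the content of every archived snapshot of a single page to a file (per the help text, the argument may also be a file containing URLs):

```
# the URL is a placeholder for the page you want to unify
$ waybackrust unify -o archives.txt https://example.com/login
```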