| Crates.io | dirnutek |
| lib.rs | dirnutek |
| version | 0.1.0 |
| created_at | 2026-01-09 01:40:27.177517+00 |
| updated_at | 2026-01-09 01:40:27.177517+00 |
| description | A command-line tool for discovering web directories and files. |
| homepage | https://github.com/NutekSecurity/dircrab |
| repository | https://github.com/NutekSecurity/dircrab |
| max_upload_size | |
| id | 2031405 |
| size | 13,946,231 |
# DirNutek - A High-Speed, Professional Web Fuzzer

DirNutek is a blazing-fast, concurrent directory and file scanner for web servers, designed to be a professional-grade tool for web reconnaissance. It started as a fun "chillout" project to master asynchronous Rust and has evolved into a powerful fuzzer capable of discovering hidden content and potential vulnerabilities.

## Roadmap

This roadmap outlines the development plan for DirNutek, transforming it from a powerful scanner into a full-featured, professional web fuzzing tool.

### Phase 1

This phase established the fundamental building blocks of the fuzzer:

- Command-line argument parsing with `clap`.
- Asynchronous requests with `tokio` and `reqwest`.
- A `Semaphore` to manage the number of concurrent requests.

### Phase 2

This phase focuses on expanding the fuzzer's capabilities to handle more complex and realistic fuzzing scenarios.
- ✅ **Multiple Fuzzing Modes:**
  - A `FUZZ` keyword that can be placed anywhere in the URL.
  - Parameter fuzzing: `dirnutek -u "http://example.com/page?id=FUZZ" -w sqli.txt`
  - Subdomain fuzzing: `dirnutek -u "http://FUZZ.example.com" -w subdomains.txt`
  - The `-u` flag will be enhanced to detect the `FUZZ` keyword and adapt the fuzzing strategy accordingly.
- ✅ **Custom Headers & Authentication:**
  - A `-H, --header` flag that can be specified multiple times, e.g., `-H "Authorization: Bearer <TOKEN>"`, `-H "User-Agent: MyCustomScanner"`, `-H "X-Forwarded-For: FUZZ"`.
  - CLI: `-H, --header <HEADER>`. The application will need to parse these headers and add them to every request. If `FUZZ` is present in a header value, the fuzzer will iterate through the wordlist for that header.
- ✅ **POST Data Fuzzing:**
  - Support for `POST` requests.
  - A `-d, --data` flag to specify the request body.
  - If the `FUZZ` keyword is present in the data, the fuzzer will replace it with words from the wordlist.
  - Example with `POST` requests: `dirnutek -u http://example.com/login -d '{"username":"admin","password":"FUZZ"}' -w passwords.txt -X POST`

### Phase 3

This phase focuses on features that make DirNutek a professional-grade tool that can be integrated into larger security workflows.
- ✅ **Advanced Reporting:**
  - An `-o, --output <FILE>` flag to save results to a file.
  - A `--format` flag to specify the output format.
  - Uses the `serde` crate to serialize results into different formats (a sketch follows this phase's list):
    - `json`: for easy parsing by scripts and other tools.
    - `csv`: for easy import into spreadsheets.
    - `txt`: plain text (default).
- ✅ **Session Management:**
  - A `--resume <STATE_FILE>` flag to load the state and continue the scan.
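As a rough illustration of the Advanced Reporting item above, here is a minimal sketch of serializing a result record with `serde`. The `ScanResult` type, its fields, and the `serde`/`serde_json` dependencies are assumptions for this example, not dirnutek's actual types.

```rust
use serde::Serialize;

// Hypothetical result record; the real output type may differ.
#[derive(Serialize)]
struct ScanResult {
    url: String,
    status: u16,
    words: usize,
    chars: usize,
    lines: usize,
}

fn main() -> anyhow::Result<()> {
    let result = ScanResult {
        url: "http://example.com/admin".into(),
        status: 403,
        words: 10,
        chars: 50,
        lines: 2,
    };

    // `--format json` could map to serde_json, `--format csv` to the csv crate,
    // and `--format txt` to a plain Display implementation.
    println!("{}", serde_json::to_string_pretty(&result)?);
    Ok(())
}
```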
### Phase 4

This phase includes features that will make DirNutek a truly top-tier tool.

- [FUTURE] **Interactive TUI Dashboard:**
  - Use `ratatui` to create a terminal user interface.
- [FUTURE] **Plugin/Scripting Engine:**
The following sections detail the original project idea and implementation notes, which serve as the foundation for the current and future development of DirNutek.
- Use the `clap` crate to take command-line arguments:
  - `-u` or `--url`: The base URL to scan (e.g., `http://testsite.com`).
  - `-w` or `--wordlist`: The path to the text file (e.g., `~/wordlists/common.txt`).
- For each word in the wordlist, build a candidate URL (e.g., `http://testsite.com/admin`).
- Spawn a `tokio` task to send an HTTP GET or HEAD request to that URL.
- Use `tokio::spawn` to launch hundreds (or thousands) of these requests concurrently.
- Report interesting results, for example:
  - `[200 OK] /index.html`
  - `[301 Moved] /old-page -> /new-page`
  - `[403 Forbidden] /admin` (Finding a "Forbidden" directory is a win!)
  - Hide the noise of `404 Not Found`.
- If a directory is found (e.g., `/api/`), automatically start a new scan inside that directory (e.g., `/api/users`, `/api/v1`, etc.).
- Status code filters:
  - `--exclude-404` (Default)
  - `--include-500` (To find server errors)
  - `--only-200` (To only see valid pages)
- Use `tokio::sync::Semaphore` to limit concurrent requests to a specific number (e.g., 50) so you don't crash your machine or the target.
- Use `ratatui` to make a cool terminal dashboard showing live progress (e.g., requests per second, new findings).
## Development Plan for DirNutek

This plan outlines a structured approach to building DirNutek, starting with the Minimum Viable Product (MVP) and then progressively adding "Show-Off" bonus features.
**✅ Project Initialization:**

- `cargo new dirnutek --bin`
- Add initial dependencies to `Cargo.toml`:
```toml
[dependencies]
clap = { version = "4.0", features = ["derive"] } # For CLI argument parsing
tokio = { version = "1", features = ["full"] } # For asynchronous runtime
reqwest = { version = "0.11", features = ["json", "rustls-tls"] } # For HTTP requests
anyhow = "1.0" # For simplified error handling
# Consider `url` crate for robust URL manipulation if needed
```
Start with minimal feature sets for `tokio` and `reqwest` and add more as needed to keep compile times down. `rustls-tls` is generally preferred for security over `native-tls`.

**✅ CLI Argument Parsing (`clap`):**
- Define a struct to hold command-line arguments (URL, wordlist path).
- Use `#[derive(Parser)]` and `#[clap(author, version, about)]` for metadata. A sketch follows.
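A minimal sketch of that argument struct, assuming clap 4 with the `derive` feature; the exact flags and defaults here are illustrative, not dirnutek's final CLI.

```rust
use clap::Parser;

/// A command-line tool for discovering web directories and files.
#[derive(Parser, Debug)]
#[clap(author, version, about)]
struct Args {
    /// The base URL to scan (e.g., http://testsite.com)
    #[clap(short = 'u', long = "url")]
    url: String,

    /// Path to the wordlist file (e.g., ~/wordlists/common.txt)
    #[clap(short = 'w', long = "wordlist")]
    wordlist: String,

    /// Maximum number of concurrent requests
    #[clap(long, default_value_t = 50)]
    concurrency: usize,
}

fn main() {
    let args = Args::parse();
    println!("{args:?}");
}
```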
**Wordlist Loading:**

- Write an async function to read the wordlist file.
- Return the words as a `Vec<String>`, as in the sketch below.
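A sketch of an async wordlist loader built on `tokio`'s buffered reader (the same primitives the crate uses in `main.rs`); the function name and error type are illustrative.

```rust
use anyhow::Result;
use tokio::fs::File;
use tokio::io::{AsyncBufReadExt, BufReader};

// Read the wordlist into memory, one candidate path per line.
async fn load_wordlist(path: &str) -> Result<Vec<String>> {
    let file = File::open(path).await?;
    let mut lines = BufReader::new(file).lines();
    let mut words = Vec::new();
    while let Some(line) = lines.next_line().await? {
        let word = line.trim().to_string();
        if !word.is_empty() {
            words.push(word);
        }
    }
    Ok(words)
}

#[tokio::main]
async fn main() -> Result<()> {
    let words = load_wordlist("wordlist.txt").await?;
    println!("loaded {} words", words.len());
    Ok(())
}
```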
**Asynchronous HTTP Requests (`tokio`, `reqwest`):**

- Write an async function, e.g., `scan_url(client: &reqwest::Client, base_url: &str, word: &str) -> Result<(), anyhow::Error>`.
- In `scan_url`, construct the full URL (e.g., `http://example.com/admin`).
- Use the `reqwest::Client` to send GET or HEAD requests. HEAD requests are often faster as they don't download the body, but might not always reflect the true status for all servers. Start with GET for simplicity, then optimize to HEAD if appropriate.
- Launch each `scan_url` call as a `tokio::spawn` task and collect the `JoinHandle`s.
- The `reqwest::Client` should be created once and reused for all requests to benefit from connection pooling. Implement a timeout for requests to prevent hanging. A sketch follows.
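A minimal sketch of this step (GET variant, one shared client, spawned tasks); it is illustrative rather than dirnutek's exact implementation.

```rust
use anyhow::Result;
use std::time::Duration;

// Request one candidate path and report its status code.
async fn scan_url(client: &reqwest::Client, base_url: &str, word: &str) -> Result<()> {
    let target = format!("{}/{}", base_url.trim_end_matches('/'), word);
    let res = client.get(target.as_str()).send().await?;
    println!("[{}] {}", res.status(), target);
    Ok(())
}

#[tokio::main]
async fn main() -> Result<()> {
    // One client, reused for every request (connection pooling + timeout).
    let client = reqwest::Client::builder()
        .timeout(Duration::from_secs(10))
        .build()?;

    let mut handles = Vec::new();
    for word in ["admin", "login", "api"] {
        let client = client.clone();
        handles.push(tokio::spawn(async move {
            let _ = scan_url(&client, "http://example.com", word).await;
        }));
    }
    for h in handles {
        h.await?;
    }
    Ok(())
}
```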
**Response Processing & Output:**

- In `scan_url`, after receiving a response, check `response.status()`.
- Print results in the form `[STATUS_CODE] /path -> redirect_target` (the redirect target only for 301 responses).
- Use a `tokio::sync::mpsc` channel to send results from `scan_url` tasks to a main task for printing, ensuring ordered output and avoiding interleaved prints (see the sketch below).
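A sketch of the channel-based output pattern: worker tasks send formatted lines to a single printer task, so stdout is written from exactly one place. The details are illustrative.

```rust
use tokio::sync::mpsc;

#[tokio::main]
async fn main() {
    let (tx, mut rx) = mpsc::channel::<String>(100);

    // Dedicated printer task: the only place that writes to stdout,
    // so output from concurrent scans never interleaves.
    let printer_handle = tokio::spawn(async move {
        while let Some(line) = rx.recv().await {
            println!("{line}");
        }
    });

    // Worker tasks send formatted results instead of printing directly.
    let mut workers = Vec::new();
    for path in ["/admin", "/login", "/api"] {
        let tx = tx.clone();
        workers.push(tokio::spawn(async move {
            // In the real scanner this would be the HTTP status line.
            let _ = tx.send(format!("[200 OK] {path}")).await;
        }));
    }
    for w in workers {
        let _ = w.await;
    }

    // Dropping the last sender closes the channel and ends the printer.
    drop(tx);
    let _ = printer_handle.await;
}
```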
**✅ Concurrency Limiting (`tokio::sync::Semaphore`):**

- Create a `tokio::sync::Semaphore` with a configurable maximum number of permits.
- Before spawning each `scan_url` task, acquire a permit from the semaphore.
- The permit is released when the `scan_url` task completes (e.g., using `let _permit = semaphore.acquire().await;`).
- Add a `--concurrency` flag to set the semaphore limit. A sketch follows.
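A sketch of semaphore-based concurrency limiting; the permit is held for the lifetime of each task and released when it is dropped. The hard-coded limit stands in for a `--concurrency` flag.

```rust
use std::sync::Arc;
use tokio::sync::Semaphore;

#[tokio::main]
async fn main() {
    // A --concurrency flag would set this value.
    let concurrency = 50;
    let semaphore = Arc::new(Semaphore::new(concurrency));

    let mut handles = Vec::new();
    for i in 0..200 {
        let semaphore = Arc::clone(&semaphore);
        handles.push(tokio::spawn(async move {
            // Wait here until one of the `concurrency` permits is free.
            let _permit = semaphore.acquire().await.expect("semaphore closed");
            // ... send the HTTP request for candidate `i` here ...
            println!("scanning candidate {i}");
            // `_permit` is dropped at the end of the task, releasing the slot.
        }));
    }
    for h in handles {
        let _ = h.await;
    }
}
```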
**✅ Status Code Filtering:**

- Add `--exclude-status <CODES>` and `--include-status <CODES>` flags.
- Parse the codes into a `HashSet<u16>` for efficient lookup.
- Decide how conflicts are resolved when both `--exclude` and `--include` are used (e.g., `--include` overrides `--exclude`), as in the sketch below.
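A small sketch of the lookup using `HashSet<u16>`, assuming the convention that a non-empty include list takes precedence over the exclude list.

```rust
use std::collections::HashSet;

// Decide whether a response with `status` should be reported.
// "include overrides exclude" is one possible policy; dirnutek may choose another.
fn should_report(status: u16, include: &HashSet<u16>, exclude: &HashSet<u16>) -> bool {
    if !include.is_empty() {
        return include.contains(&status);
    }
    !exclude.contains(&status)
}

fn main() {
    let exclude: HashSet<u16> = [404].into_iter().collect();
    let include: HashSet<u16> = HashSet::new();

    assert!(should_report(200, &include, &exclude));
    assert!(!should_report(404, &include, &exclude));
    println!("filtering rules behave as expected");
}
```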
**✅ Recursive Scanning:**

- When a directory is discovered (e.g., `/admin/` or a 301 redirect to a directory), add it to a queue of URLs to be scanned.
- Add `--recursive` or `--depth <N>` flags to control recursion.
**TUI Dashboard (`ratatui`):**

- Use `ratatui` to draw a terminal UI.
- Use `tokio::sync::mpsc` channels to send updates (RPS, new findings, progress) from the scanning tasks to the TUI rendering loop.
- `ratatui` requires careful state management and event handling. Start with a very basic display and incrementally add complexity. Ensure the TUI doesn't block the scanning process.
**Robust Error Handling:**

- Use `anyhow::Result` for functions that can fail.
- Gracefully handle `reqwest` errors (e.g., network issues, DNS resolution failures).
**Configuration:**

- Support a configuration file (e.g., `dirnutek.toml`) for default settings.
**Performance Optimization:**

- Tune `reqwest` client settings (e.g., `tcp_nodelay`, `connect_timeout`).
**Documentation:**

- Write a `README.md` with usage instructions and examples.
- Organize the code into modules (e.g., `cli.rs`, `scanner.rs`, `tui.rs`).

Beyond `GET`/`HEAD`, consider adding support for other HTTP methods (e.g., `POST`) as a future enhancement. For HTTPS, `reqwest` handles TLS by default, but be aware of potential issues with self-signed certificates or older TLS versions.

This plan provides a roadmap for building DirNutek. Remember to iterate, test frequently, and enjoy the process of mastering asynchronous Rust!
## `tokio`

`tokio` is an asynchronous runtime for Rust, enabling many concurrent operations without dedicating an OS thread to each task. In this crate, it's primarily used to manage and execute non-blocking I/O operations and concurrent tasks efficiently.
What `tokio` is used for:

- Running the async runtime and spawning concurrent scanning tasks.
- Synchronization primitives (`mpsc` channels, `Mutex`, `Semaphore`).
- Timers (`sleep`), `select!`, and `JoinSet` for coordinating tasks.
- Asynchronous file and network I/O.

How and when `tokio` is used:
**`#[tokio::main]`** (in `src/main.rs`):
- Transforms the `main` function into an asynchronous entry point, setting up the `tokio` runtime to execute the async code.

**`tokio::spawn`** (in `src/main.rs` and `src/lib.rs`):
- Spawns a new asynchronous task onto the `tokio` runtime.
- In `src/main.rs`, it's used to spawn a dedicated "printer" task (`printer_handle`) that receives and prints messages from the scanning process.
- In `src/lib.rs` (within the `start_scan` function), it's used to spawn multiple `perform_scan` tasks, allowing the application to send many HTTP requests concurrently.

**`tokio::sync::mpsc::Sender`** (in `src/lib.rs` and `src/main.rs`):
- The `Sender` (`tx`) is passed around to `perform_scan` tasks, allowing them to send formatted scan results (e.g., `[200 OK] http://example.com [10W, 50C, 2L]`) back to the central printer task in `main.rs`.

**`tokio::sync::{Mutex, Semaphore}`** (in `src/lib.rs` and `src/main.rs`):
- `Mutex`: An asynchronous mutual exclusion primitive, similar to a standard mutex but designed for async contexts.
- `Semaphore`: A counting semaphore used to limit the number of concurrent operations.
- The `Mutex` is used to protect shared data structures like `visited_urls` (to prevent rescanning) and `scan_queue` (to manage URLs to be scanned) from concurrent access by multiple tasks.
- The `Semaphore` is used in `start_scan` to limit the number of active `perform_scan` tasks (and thus concurrent HTTP requests), preventing the application from overwhelming the target server or its own resources. A sketch of the shared-state pattern follows.
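A sketch of that shared-state pattern: a `Mutex`-guarded visited set and scan queue shared via `Arc`. The variable names mirror the description above, but the code itself is illustrative.

```rust
use std::collections::{HashSet, VecDeque};
use std::sync::Arc;
use tokio::sync::Mutex;

#[tokio::main]
async fn main() {
    // Shared between all scanning tasks.
    let visited_urls = Arc::new(Mutex::new(HashSet::<String>::new()));
    let scan_queue = Arc::new(Mutex::new(VecDeque::<String>::from([
        "http://example.com/".to_string(),
    ])));

    // A task pulls one URL from the queue and records it as visited.
    let url = scan_queue.lock().await.pop_front();
    if let Some(url) = url {
        let mut visited = visited_urls.lock().await;
        if visited.insert(url.clone()) {
            // First time we see this URL: it would be scanned here, and any
            // discovered directories pushed back onto scan_queue.
            println!("scanning {url}");
        }
    }
}
```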
**`tokio::time::sleep`** (in `src/lib.rs` and tests):

- In `perform_scan`, it's used to implement a `scan_delay` between requests if configured, to avoid hammering the target server.
- In `start_scan`, it's used within the `tokio::select!` block as a fallback to periodically check the queue if no tasks have completed.

**`tokio::select!`** (in `src/lib.rs`):
- In `start_scan`, it's used to efficiently wait for either a spawned task to complete (potentially adding new URLs to the queue) or for a short timeout, preventing the main loop from busy-waiting when the queue is empty but tasks are still running. A sketch of this pattern follows.
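A sketch of that pattern: wait for either the next task in a `JoinSet` to finish or a short timeout before re-checking the queue. It is illustrative, not the crate's exact loop.

```rust
use std::time::Duration;
use tokio::task::JoinSet;

#[tokio::main]
async fn main() {
    let mut tasks = JoinSet::new();
    for i in 0..3 {
        tasks.spawn(async move {
            tokio::time::sleep(Duration::from_millis(100 * i)).await;
            format!("task {i} done")
        });
    }

    loop {
        tokio::select! {
            // A scanning task finished; its result might add new URLs to the queue.
            maybe_done = tasks.join_next() => {
                match maybe_done {
                    Some(Ok(msg)) => println!("{msg}"),
                    Some(Err(e)) => eprintln!("task failed: {e}"),
                    None => break, // no tasks left
                }
            }
            // Fallback: wake up periodically to re-check the queue
            // even if no task has completed yet.
            _ = tokio::time::sleep(Duration::from_millis(50)) => {
                // re-check the scan queue here
            }
        }
    }
}
```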
**`tokio::fs::File`, `tokio::io::{AsyncBufReadExt, BufReader}`** (in `src/main.rs`):

- In `main.rs`, these are used to asynchronously read the wordlist file, allowing the application to remain responsive while loading potentially large files.

**`tokio::net::TcpListener`** (in tests):
- Used in tests (e.g., `test_perform_scan_timeout`) to set up a mock server that can simulate network behavior like timeouts, allowing for robust testing of the `perform_scan` function.

**`tokio::task::JoinSet`** (in `src/lib.rs`):
- In `start_scan`, it's used to keep track of all the `perform_scan` tasks that have been spawned. This allows the `start_scan` loop to know if there are still active tasks and to await their completion before finishing.

## `reqwest`

`reqwest` is a powerful and ergonomic HTTP client for Rust, built on top of `tokio`. In this crate, it's exclusively used for making HTTP requests to target URLs.
What `reqwest` is used for:

- Sending HTTP requests (GET, POST, HEAD, OPTIONS, and others) to target URLs.
- Reading response status codes, headers, and bodies for reporting and filtering.

How and when `reqwest` is used:
**`reqwest::Client`** (in `src/main.rs` and `src/lib.rs`):
- A `reqwest::Client` is created using `Client::builder().build().unwrap()`. This client is designed to be reused across multiple requests for efficiency, as it manages connection pooling and other resources.
- In `src/main.rs`, a `Client` is initialized once at the start of the program. This client is then passed to the `start_scan` function.
- In `src/lib.rs`, within the `perform_scan` function, the `Client` is used to execute the actual HTTP requests.
- In tests, a `Client` is created for each test case to make HTTP requests to the mock server.

**Making HTTP Requests** (in `src/lib.rs`):
- The `Client` instance provides methods corresponding to HTTP verbs (e.g., `client.get()`, `client.post()`, `client.head()`, etc.). These methods return a `RequestBuilder`, which is then used to send the request with `.send().await?`. For OPTIONS requests, `client.request(reqwest::Method::OPTIONS, target_url.as_str())` is used to explicitly specify the method.
- In the `perform_scan` function, for each `target_url` and `http_method`, the appropriate `reqwest` method is called to send the HTTP request.

**Configuring Redirect Behavior** (in `src/main.rs` and `src/lib.rs` tests):
- The `Client::builder()` is configured with `.redirect(reqwest::redirect::Policy::none())`.
- This is a deliberate choice for `dirnutek`. By default, `reqwest` would automatically follow HTTP redirects (like 301, 302). However, `dirnutek` needs to explicitly observe these redirect status codes to report them and potentially use the redirect target for further scanning. Disabling automatic redirects ensures that the `perform_scan` function receives the initial response status code. A sketch of this configuration follows.
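A sketch of a client configured this way; the timeout and URL are illustrative.

```rust
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Do not follow redirects automatically: report 301/302 as-is
    // and read the Location header ourselves if needed.
    let client = reqwest::Client::builder()
        .redirect(reqwest::redirect::Policy::none())
        .timeout(Duration::from_secs(10))
        .build()?;

    let res = client.get("http://example.com/old-page").send().await?;
    if res.status().is_redirection() {
        if let Some(location) = res.headers().get(reqwest::header::LOCATION) {
            println!("[{}] -> {:?}", res.status(), location);
        }
    }
    Ok(())
}
```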
**Handling Responses** (in `src/lib.rs`):

- The `res` object (of type `reqwest::Response`) is used to extract information such as:
  - `res.status()`: to get the HTTP status code.
  - `res.headers()`: to access response headers (e.g., to get the `Location` header for redirects).
  - `res.text().await?`: to asynchronously read the response body as text.
- In `perform_scan`, after an HTTP request is sent, the status code is checked for filtering and reporting. The response body is read to count words, characters, and lines, which are also used for filtering and output (see the sketch below).
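A sketch of deriving the word/character/line counts behind output like `[200 OK] http://example.com [10W, 50C, 2L]`; the exact counting rules (whitespace-separated words, total characters, newline-separated lines) are an assumption for this example.

```rust
// Compute the (words, chars, lines) triple for a response body.
fn body_stats(body: &str) -> (usize, usize, usize) {
    let words = body.split_whitespace().count();
    let chars = body.chars().count();
    let lines = body.lines().count();
    (words, chars, lines)
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = reqwest::Client::new();
    let res = client.get("http://example.com/").send().await?;

    let status = res.status();
    let body = res.text().await?; // consumes the response
    let (w, c, l) = body_stats(&body);

    println!("[{status}] http://example.com/ [{w}W, {c}C, {l}L]");
    Ok(())
}
```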
**Error Handling:**

- `reqwest` operations return `Result` types, allowing for robust error handling. The `?` operator is used to propagate errors.
- The source comment `// 404 is a valid HTTP response, not an error in reqwest` highlights that `reqwest` treats any valid HTTP response (even one with a 4xx or 5xx status code) as a successful transport of the response. The application logic then interprets these status codes to determine whether the scan result is "interesting" or not.