| Field | Value |
|---|---|
| Crates.io | multifiledownloader |
| lib.rs | multifiledownloader |
| version | 0.2.0 |
| created_at | 2025-05-24 00:40:59.648747+00 |
| updated_at | 2025-05-25 02:36:53.143378+00 |
| description | A Concurrent and Configurable Multi-File downloader cli tool. |
| homepage | https://github.com/tralahm/multifiledownloader-rs |
| repository | https://github.com/tralahm/multifiledownloader-rs |
| max_upload_size | |
| id | 1686885 |
| size | 100,616 |
A high-performance, concurrent multi-file downloader written in Rust with progress tracking and error handling.
- 🚀 Concurrent downloads with configurable worker count
- 📊 Progress bars for individual files and overall progress
- 🔄 Resume support for partially downloaded files
- 🗑️ Clean destination directory before downloading
- 📂 Customizable destination directory (supports tilde expansion)
- 🔄 Automatic shell completion support
- 📊 Human-readable download statistics
- 🛠️ Robust error handling and logging
- 📝 dotenv support for configuration
Download multiple files concurrently:

```sh
multifiledownloader -w 8 --dest ~/Downloads --urls "https://example.com/file1.txt,https://example.com/file2.txt"
```

Specify a custom destination directory:

```sh
multifiledownloader -d ~/Downloads/custom-dir -u "url1,url2,url3"
```

Clean the destination directory before downloading:

```sh
multifiledownloader --clean -u "url1,url2"
```

Set a custom number of workers:

```sh
multifiledownloader -w 4 -u "url1,url2"
```
You can read URLs from a file with one URL per line:

```sh
# Create a file with URLs
cat > urls.txt << EOF
https://example.com/file1.txt
https://example.com/file2.txt
EOF

# Download using the file
multifiledownloader -w 8 --dest ~/Downloads --urls "$(tr '\n' ',' < urls.txt | sed 's/,$//')"
```
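The comma-joining step above can be checked on its own, without invoking the downloader; `urls.txt` here is a throwaway example file with two placeholder URLs:

```sh
# Build an example URL list (two placeholder URLs)
printf 'https://example.com/file1.txt\nhttps://example.com/file2.txt\n' > urls.txt

# Convert newlines to commas, then strip the trailing comma
joined="$(tr '\n' ',' < urls.txt | sed 's/,$//')"
echo "$joined"
# → https://example.com/file1.txt,https://example.com/file2.txt
```

Reading the file with `tr '\n' ',' < urls.txt` avoids the extra `cat` process and leaves exactly one comma between URLs.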
Generate shell completion scripts for your shell:

```sh
# Bash
multifiledownloader --completion bash | sudo tee /usr/local/etc/bash_completion.d/multifiledownloader

# Zsh (or any directory in your $fpath)
multifiledownloader --completion zsh | sudo tee /usr/local/share/zsh/site-functions/_multifiledownloader

# Fish
multifiledownloader --completion fish | sudo tee /usr/local/share/fish/vendor_completions.d/multifiledownloader.fish

# PowerShell (write next to your profile, then dot-source it from $PROFILE)
multifiledownloader --completion powershell | Out-File -FilePath "$(Split-Path $PROFILE)\multifiledownloader.ps1"

# Elvish
multifiledownloader --completion elvish | tee $HOME/.elvish/completions/multifiledownloader.elv
```
| Option | Description | Default |
|---|---|---|
| -w, --workers | Number of concurrent download workers | CPU cores count |
| -d, --dest | Destination directory for downloaded files | current directory |
| -u, --urls | Comma-separated list of URLs to download | required |
| -c, --clean | Clean destination directory before downloading | false |
| --completion | Generate shell completion script | - |
| -h, --help | Show help message | - |
| -V, --version | Show version information | - |
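The options above can be combined in a single invocation; a sketch using only the documented flags (the URLs are placeholders):

```sh
# 4 workers, clean ~/Downloads first, then fetch both files
multifiledownloader -w 4 -d ~/Downloads -c -u "https://example.com/a.bin,https://example.com/b.bin"
```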
Install from crates.io:

```sh
cargo install multifiledownloader
```

Or build from source:

```sh
git clone https://github.com/tralahm/multifiledownloader-rs.git
cd multifiledownloader-rs
cargo build --release
cp target/release/multifiledownloader /usr/local/bin/
```
- Permission Errors: use the `--clean` flag to remove existing files before downloading
- Network Issues
- Progress Bar Issues
To enable debug logging:

```sh
RUST_LOG=debug multifiledownloader -u "url1,url2"
```
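Assuming the crate uses the common `env_logger`-style `RUST_LOG` filter syntax (an assumption, not confirmed by this README), logging can also be scoped to the tool's own module at a finer level:

```sh
# Hypothetical per-module filter; fall back to plain RUST_LOG=debug if unsupported
RUST_LOG=multifiledownloader=trace multifiledownloader -u "url1,url2"
```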
This project is licensed under the MIT License - see the LICENSE file for details.