multifiledownloader

Crate: multifiledownloader
Version: 0.2.0
Created: 2025-05-24 00:40:59 UTC
Updated: 2025-05-25 02:36:53 UTC
Description: A concurrent and configurable multi-file downloader CLI tool.
Homepage: https://github.com/tralahm/multifiledownloader-rs
Repository: https://github.com/tralahm/multifiledownloader-rs
Crate ID: 1686885
Size: 100,616 bytes
Author: Tralah M Brian (TralahM)

Multi File Downloader


A high-performance, concurrent multi-file downloader written in Rust with progress tracking and error handling.

Features

  • 🚀 Concurrent downloads with configurable worker count

  • 📊 Progress bars for individual files and overall progress

  • 🔄 Resume support for partially downloaded files

  • 🗑️ Clean destination directory before downloading

  • 📂 Customizable destination directory (supports tilde expansion)

    • The destination directory is created automatically if it does not exist
  • 🔄 Automatic shell completion support

  • 📊 Human-readable download statistics

  • 🛠️ Robust error handling and logging

  • 📝 dotenv support for configuration
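
Resume support for partially downloaded files is typically built on HTTP Range requests: the client checks how many bytes are already on disk and asks the server only for the remainder. A minimal sketch of that idea (the helper name and logic are illustrative, not the crate's actual code):

```rust
use std::fs;
use std::path::Path;

/// Hypothetical helper: given the destination path of a (possibly partial)
/// download, compute the HTTP `Range` header value needed to resume it.
/// Returns `None` when the transfer should start from scratch.
fn resume_range(dest: &Path) -> Option<String> {
    let len = fs::metadata(dest).ok()?.len();
    if len == 0 {
        return None; // empty file: just restart from byte 0
    }
    // "bytes=N-" asks the server for everything from offset N onward.
    Some(format!("bytes={len}-"))
}

fn main() {
    let tmp = std::env::temp_dir().join("multifiledownloader_partial.bin");
    fs::write(&tmp, vec![0u8; 1024]).unwrap();
    assert_eq!(resume_range(&tmp), Some("bytes=1024-".to_string()));
    fs::remove_file(&tmp).unwrap();
    assert_eq!(resume_range(Path::new("/no/such/file")), None);
    println!("resume ranges computed");
}
```

A server that honors the header replies with `206 Partial Content`, and the client appends the body to the existing file; a `200` reply means the range was ignored and the file must be rewritten from the start.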

Usage

Basic Usage

Download multiple files concurrently:

multifiledownloader -w 8 --dest ~/Downloads --urls "https://example.com/file1.txt,https://example.com/file2.txt"
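
The `-w` flag bounds how many transfers run at once. A std-only sketch of that worker-pool pattern, with `fake_download` standing in for the HTTP transfer (illustrative only; the crate's internals may differ, e.g. it may use an async runtime):

```rust
use std::sync::mpsc::sync_channel;
use std::sync::{Arc, Mutex};
use std::thread;

// Stand-in for the real HTTP download.
fn fake_download(url: &str) -> String {
    format!("downloaded {url}")
}

/// Spawn `workers` threads that pull URLs from a shared queue,
/// so at most `workers` downloads are in flight at any time.
fn download_all(urls: Vec<String>, workers: usize) -> Vec<String> {
    let queue = Arc::new(Mutex::new(urls));
    let (tx, rx) = sync_channel::<String>(1024);
    let mut handles = Vec::new();
    for _ in 0..workers {
        let queue = Arc::clone(&queue);
        let tx = tx.clone();
        handles.push(thread::spawn(move || loop {
            // Grab the next URL; exit when the queue is drained.
            let url = match queue.lock().unwrap().pop() {
                Some(u) => u,
                None => break,
            };
            tx.send(fake_download(&url)).unwrap();
        }));
    }
    drop(tx); // channel closes once every worker's clone is dropped
    for h in handles {
        h.join().unwrap();
    }
    rx.iter().collect()
}

fn main() {
    let urls = (1..=8).map(|i| format!("https://example.com/file{i}")).collect();
    let results = download_all(urls, 4);
    assert_eq!(results.len(), 8);
    println!("all downloads finished");
}
```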

Advanced Usage

  • Specify custom destination directory:

    multifiledownloader -d ~/Downloads/custom-dir -u "url1,url2,url3"
    
  • Clean destination directory before downloading:

    multifiledownloader --clean -u "url1,url2"
    
  • Set custom number of workers:

    multifiledownloader -w 4 -u "url1,url2"
    

Reading URLs from a File

You can read URLs from a file where each URL is on a new line:

# Create a file with URLs
$ cat > urls.txt << EOF
https://example.com/file1.txt
https://example.com/file2.txt
EOF

# Download using the file (join its lines with commas)
$ multifiledownloader -w 8 --dest ~/Downloads --urls "$(paste -sd, urls.txt)"

Shell Completion

Generate shell completion scripts for your shell:

# Bash
multifiledownloader --completion bash | sudo tee /usr/local/etc/bash_completion.d/multifiledownloader

# Zsh (or any directory in your $fpath)
multifiledownloader --completion zsh | sudo tee /usr/local/share/zsh/site-functions/_multifiledownloader

# Fish
multifiledownloader --completion fish | sudo tee /usr/local/share/fish/vendor_completions.d/multifiledownloader.fish

# PowerShell (append to your profile)
multifiledownloader --completion powershell | Out-File -Append -FilePath $PROFILE

# Elvish
multifiledownloader --completion elvish | tee $HOME/.elvish/completions/multifiledownloader.elv

Options

Option            Description                                      Default
-w, --workers     Number of concurrent download workers            CPU core count
-d, --dest        Destination directory for downloaded files       current directory
-u, --urls        Comma-separated list of URLs to download         required
-c, --clean       Clean destination directory before downloading   false
--completion      Generate shell completion script                 -
-h, --help        Show help message                                -
-V, --version     Show version information                         -
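
The tilde expansion supported by `--dest` can be sketched as a small pure function (a hypothetical helper for illustration; in practice a crate such as shellexpand handles this, including edge cases like a bare `~`):

```rust
use std::path::PathBuf;

/// Replace a leading "~/" with the user's home directory.
/// `home` is passed in explicitly (e.g. from the $HOME variable)
/// to keep the function pure and easy to test.
fn expand_tilde(path: &str, home: &str) -> PathBuf {
    match path.strip_prefix("~/") {
        Some(rest) => PathBuf::from(home).join(rest),
        None => PathBuf::from(path), // no tilde prefix: use as-is
    }
}

fn main() {
    assert_eq!(
        expand_tilde("~/Downloads", "/home/demo"),
        PathBuf::from("/home/demo/Downloads")
    );
    assert_eq!(expand_tilde("/tmp/x", "/home/demo"), PathBuf::from("/tmp/x"));
    println!("tilde expansion works");
}
```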

Installation

Using Cargo

cargo install multifiledownloader

From Source

git clone https://github.com/tralahm/multifiledownloader-rs.git
cd multifiledownloader-rs
cargo build --release
sudo cp target/release/multifiledownloader /usr/local/bin/

Troubleshooting

Common Issues

  1. Permission Errors

    • Ensure you have write permissions to the destination directory
    • Use the --clean flag to remove existing files before downloading
  2. Network Issues

    • Check if URLs are accessible
    • Use fewer workers if experiencing connection timeouts
  3. Progress Bar Issues

    • Progress bars may not display correctly in some terminals
    • Try using a different terminal emulator if experiencing issues

Debugging

To enable debug logging:

RUST_LOG=debug multifiledownloader -u "url1,url2"

License

This project is licensed under the MIT License - see the LICENSE file for details.
