Crates.io | story-dl |
lib.rs | story-dl |
version | 0.6.0 |
source | src |
created_at | 2019-11-06 21:01:41.118393 |
updated_at | 2020-07-24 01:25:25.083169 |
description | Story web scraping |
homepage | |
repository | https://gitlab.com/Txuritan/story-dl |
max_upload_size | |
id | 178811 |
size | 537,314 |
story-dl is a program for downloading stories from a multitude of different sites.
The scraper does not yet support characters, pairings, tags, warnings, or anything beyond the required story information.
As of v0.5.0, story-dl contains a modified but untested version of Élisabeth Henry's epub-builder.
story-dl aims for a simple command line: just enough options to change what you need, without requiring 20 man pages.
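If you just want the command-line tool, it can typically be installed straight from crates.io with a standard cargo command (this assumes the story-dl binary is published with the crate, which its use below suggests):

cargo install story-dl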
Download a story as an EPUB:
story-dl -u <URL> -o epub
Download all stories listed in an import file as EPUBs:
story-dl -f import.json -o epub
import.json (each entry is either a bare URL string or an object with a "url" field):
[
"<URL>",
{
"url": "<URL>"
}
]
Add this to your Cargo.toml:
story-dl = { version = "0.3", default-features = false }
This adds story-dl as a library and disables the crates required only by the command-line version.
Choose which website to scrape; let's use FanFiction.net for this example.
// Import the scraper module for the chosen site.
use story_dl::fanfiction;

#[tokio::main]
async fn main() {
    // Parse the story URL string into the Uri type the scraper expects.
    let url = "<story url>".parse().expect("Not a valid URL");

    // Run the scraper; this returns the finished story.
    let story = fanfiction::scrape(&url).await.expect("Error scraping story");

    // Print the scraped information.
    println!("Title: {}", story.name);
    println!("Authors: {}", story.authors.join(", "));
    println!("Chapters: {}", story.chapters);
}
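In a longer-running program you may want to report failures instead of panicking. Below is a minimal sketch of the same flow using match; it relies only on the scrape call shown above, and assumes the URL-parse and scrape error types implement Display (not confirmed by the crate docs):

use story_dl::fanfiction;

#[tokio::main]
async fn main() {
    // Parse the URL first so invalid input is reported cleanly.
    // NOTE: assumes the parse error type implements Display.
    let url = match "<story url>".parse() {
        Ok(url) => url,
        Err(err) => {
            eprintln!("Not a valid URL: {}", err);
            return;
        }
    };

    // Report scraping errors instead of unwinding with expect.
    // NOTE: assumes the scrape error type implements Display.
    match fanfiction::scrape(&url).await {
        Ok(story) => println!("Scraped: {}", story.name),
        Err(err) => eprintln!("Error scraping story: {}", err),
    }
}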
story-dl internally uses a modified version of Gonzih's CrabQuery.
Thanks to Quite Wraith on Archive of Our Own and FanFiction.net; two of their stories, "Little Cog" and "Fellow Traveler", are used as test data for site scraping.