web-scraper-flows

Crates.io: web-scraper-flows
lib.rs: web-scraper-flows
version: 0.1.0
source: src
created_at: 2023-06-09 07:47:23.561067
updated_at: 2023-06-09 07:47:23.561067
description: Web scraper integration for flows.network
homepage:
repository:
max_upload_size:
id: 886039
size: 6,016
owner: DarumaDocker (DarumaDocker)

documentation: https://docs.rs/web-scraper-flows

README

This is a library for integrating Web Scraper into your flow function for flows.network.

Visit Web Scraper
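
To pull this crate into a flow function, add it to the project's Cargo.toml alongside the other crates used in the example below. This snippet is only a sketch: the web-scraper-flows version comes from the crate metadata above, while the other version numbers are placeholders that should be checked on crates.io.

[dependencies]
web-scraper-flows = "0.1.0"   # version from the crate metadata above
lambda_flows = "*"            # placeholder version
serde_json = "*"              # placeholder version
tokio = { version = "*", features = ["macros", "rt"] }  # for #[tokio::main]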

The example below shows a lambda service that responds with the text content of a web page, given a URL passed as a query parameter.

use std::collections::HashMap;

use lambda_flows::{request_received, send_response};
use serde_json::Value;
use web_scraper_flows::get_page_text;

// Entry point of the flow function: register `handler` for incoming requests.
#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    request_received(handler).await;
}

// Handle a request: read the `url` query parameter, scrape the page, and
// reply with the page text (200) or the error message (400).
async fn handler(qry: HashMap<String, Value>, _body: Vec<u8>) {
    // Panics if the `url` query parameter is missing or not a string.
    let url = qry.get("url").expect("No url provided").as_str().unwrap();

    match get_page_text(url).await {
        Ok(text) => send_response(
            200,
            vec![(
                String::from("content-type"),
                String::from("text/plain; charset=UTF-8"),
            )],
            text.as_bytes().to_vec(),
        ),
        Err(e) => send_response(
            400,
            vec![(
                String::from("content-type"),
                String::from("text/plain; charset=UTF-8"),
            )],
            e.as_bytes().to_vec(),
        ),
    }
}
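
The handler above panics when the url query parameter is missing. As a variation built only from the calls already shown (send_response and get_page_text), the sketch below returns a 400 response instead; it assumes nothing beyond the signatures visible in the example and has not been compiled against the crate.

use std::collections::HashMap;

use lambda_flows::send_response;
use serde_json::Value;
use web_scraper_flows::get_page_text;

// Drop-in replacement for `handler` above: same signature, but a missing or
// non-string `url` query parameter yields a 400 response instead of a panic.
async fn handler(qry: HashMap<String, Value>, _body: Vec<u8>) {
    let plain_text = vec![(
        String::from("content-type"),
        String::from("text/plain; charset=UTF-8"),
    )];

    // Reject the request early instead of panicking on a missing `url`.
    let Some(url) = qry.get("url").and_then(Value::as_str) else {
        return send_response(400, plain_text, b"Missing `url` query parameter".to_vec());
    };

    match get_page_text(url).await {
        Ok(text) => send_response(200, plain_text, text.as_bytes().to_vec()),
        Err(e) => send_response(400, plain_text, e.as_bytes().to_vec()),
    }
}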

The whole document is here.
