roboto

version: 0.1.1
created: 2024-03-30
updated: 2024-03-31
description: Parse and use Robots.txt files
repository: https://github.com/alexrudy/roboto
size: 30,224
author: Alex Rudy (alexrudy)

README

Roboto: Parse and use robots.txt files

Roboto provides a type-safe way to parse and use robots.txt files. It is based on the Robots Exclusion Protocol, which site owners use to tell web crawlers and other web robots which paths they should not fetch; compliance is voluntary, so the protocol can only approximately control crawler behavior.

Installation

Add this to your Cargo.toml:

[dependencies]
roboto = "0.1"

Usage

use roboto::Robots;

let robots = r#"
User-agent: *
Disallow: /private
Disallow: /tmp
"#.parse::<Robots>().unwrap();

let user_agent = "googlebot".parse().unwrap();

assert_eq!(robots.is_allowed(&user_agent, "/public"), true);
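The check behind `is_allowed` follows the core rule of the Robots Exclusion Protocol: a path is disallowed when it begins with any non-empty `Disallow` prefix from the matching user-agent group. A minimal stdlib-only sketch of that prefix check (an illustration of the protocol's semantics, not roboto's actual implementation; `is_allowed` here is a hypothetical helper):

```rust
/// Illustrative prefix check per the Robots Exclusion Protocol:
/// a path is disallowed when it starts with any non-empty
/// `Disallow` rule from the matching user-agent group.
fn is_allowed(disallow: &[&str], path: &str) -> bool {
    !disallow
        .iter()
        .copied()
        .any(|rule| !rule.is_empty() && path.starts_with(rule))
}

fn main() {
    let rules = ["/private", "/tmp"];
    assert!(is_allowed(&rules, "/public"));
    // Prefix matching also covers everything under a disallowed path.
    assert!(!is_allowed(&rules, "/private/data"));
    // An empty `Disallow:` line allows everything.
    assert!(is_allowed(&[""], "/anything"));
}
```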