| Crates.io | feedparser-rs |
| lib.rs | feedparser-rs |
| version | 0.4.3 |
| created_at | 2025-12-16 02:13:24.796317+00 |
| updated_at | 2026-01-15 01:59:23.285541+00 |
| description | High-performance RSS/Atom/JSON Feed parser |
| homepage | |
| repository | https://github.com/bug-ops/feedparser-rs |
| max_upload_size | |
| id | 1987075 |
| size | 736,121 |
High-performance RSS/Atom/JSON Feed parser written in Rust.
This is the core parsing library that powers the Python and Node.js bindings.
```bash
cargo add feedparser-rs
```
Or add to your Cargo.toml:
```toml
[dependencies]
feedparser-rs = "0.4"
```
> [!IMPORTANT]
> Requires Rust 1.88.0 or later (edition 2024).
```rust
use feedparser_rs::parse;

let xml = r#"
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My Blog</title>
    <item>
      <title>Hello World</title>
      <link>https://example.com/1</link>
    </item>
  </channel>
</rss>
"#;

let feed = parse(xml.as_bytes())?;
assert_eq!(feed.feed.title.as_deref(), Some("My Blog"));
assert_eq!(feed.entries.len(), 1);
# Ok::<(), feedparser_rs::FeedError>(())
```
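Individual entries are available on `feed.entries`. A minimal sketch of iterating over them follows; the per-entry `title` and `link` field names are an assumption here, mirroring the `Option<String>`-style feed metadata shown above rather than a documented API:

```rust
use feedparser_rs::parse;

let xml = r#"<rss version="2.0"><channel><title>My Blog</title>
<item><title>Hello World</title><link>https://example.com/1</link></item>
</channel></rss>"#;

let feed = parse(xml.as_bytes())?;

// Assumed field names: each entry is expected to expose optional
// `title` and `link` values, analogous to the feed-level metadata.
for entry in &feed.entries {
    println!("{:?} -> {:?}", entry.title, entry.link);
}
# Ok::<(), feedparser_rs::FeedError>(())
```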
Fetch feeds directly from URLs with automatic compression handling:
```rust
use feedparser_rs::parse_url;

let feed = parse_url("https://example.com/feed.xml", None, None, None)?;
println!("Title: {:?}", feed.feed.title);
println!("Entries: {}", feed.entries.len());

// Subsequent fetch with caching (uses ETag/Last-Modified)
let feed2 = parse_url(
    "https://example.com/feed.xml",
    feed.etag.as_deref(),
    feed.modified.as_deref(),
    None,
)?;

if feed2.status == Some(304) {
    println!("Not modified, use cached version");
}
# Ok::<(), feedparser_rs::FeedError>(())
```
> [!TIP]
> Use conditional GET with ETag/Last-Modified to minimize bandwidth when polling feeds.
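As a sketch of that pattern, a simple poller can carry the previous response's validators into the next `parse_url` call. This assumes `etag` and `modified` are `Option<String>` fields (consistent with the `as_deref()` calls above); the fourth argument stays `None` as in the example, and the one-hour interval is only illustrative:

```rust
use std::{thread, time::Duration};
use feedparser_rs::parse_url;

fn poll(url: &str) -> Result<(), feedparser_rs::FeedError> {
    // Validators from the last successful fetch (assumed Option<String>).
    let mut etag: Option<String> = None;
    let mut modified: Option<String> = None;

    loop {
        let feed = parse_url(url, etag.as_deref(), modified.as_deref(), None)?;

        if feed.status == Some(304) {
            // Server says nothing changed; keep using the cached copy.
            println!("not modified");
        } else {
            println!("fetched {} entries", feed.entries.len());
            // Remember the validators for the next conditional request.
            etag = feed.etag.clone();
            modified = feed.modified.clone();
        }

        thread::sleep(Duration::from_secs(3600));
    }
}
```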
To disable HTTP support and reduce dependencies:
```toml
[dependencies]
feedparser-rs = { version = "0.4", default-features = false }
```
| Feature | Description | Default |
|---------|-------------|---------|
| `http` | URL fetching with reqwest (gzip/deflate/brotli) | Yes |
The library uses a "bozo" flag (like Python's feedparser) to indicate parsing errors while still returning partial results:
```rust
use feedparser_rs::parse;

let malformed = b"<rss><channel><title>Broken</title></rss>";

let feed = parse(malformed)?;
assert!(feed.bozo);
assert!(feed.bozo_exception.is_some());

// Parsed data can still be accessed despite the error
assert_eq!(feed.feed.title.as_deref(), Some("Broken"));
# Ok::<(), feedparser_rs::FeedError>(())
```
To prevent resource exhaustion (DoS protection), the parser enforces configurable limits:
```rust
use feedparser_rs::{parse_with_limits, ParserLimits};

# let xml = r#"<rss version="2.0"><channel><title>T</title></channel></rss>"#;
let limits = ParserLimits {
    max_entries: 100,
    max_nesting_depth: 20,
    ..Default::default()
};

let feed = parse_with_limits(xml.as_bytes(), limits)?;
# Ok::<(), feedparser_rs::FeedError>(())
```
> [!NOTE]
> Default limits are generous for typical feeds. Use `ParserLimits::strict()` for untrusted input.
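For untrusted input, the same call with the strict preset looks like the sketch below. `parse_with_limits` and `ParserLimits::strict()` come from the text above; the feed bytes are only a placeholder, and `strict()` is assumed to return a ready-to-use `ParserLimits` value:

```rust
use feedparser_rs::{parse_with_limits, ParserLimits};

// Untrusted bytes, e.g. an upload or a feed from an unknown host.
let untrusted: &[u8] = br#"<rss version="2.0"><channel><title>?</title></channel></rss>"#;

// strict() trades permissiveness for tighter resource bounds.
let feed = parse_with_limits(untrusted, ParserLimits::strict())?;
println!("bozo: {}", feed.bozo);
# Ok::<(), feedparser_rs::FeedError>(())
```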
Measured on Apple M1 Pro:
| Feed Size | Time | Throughput |
|---|---|---|
| Small (2 KB) | 10.7 µs | 187 MB/s |
| Medium (20 KB) | 93.6 µs | 214 MB/s |
| Large (200 KB) | 939 µs | 213 MB/s |
Format detection: 128 ns
Run benchmarks:
```bash
cargo bench
```
Related bindings: feedparser-rs on npm and feedparser-rs on PyPI.

Minimum Supported Rust Version: 1.88.0 (edition 2024).
MSRV increases are considered breaking changes and will result in a minor version bump.
Licensed under either of:

- Apache License, Version 2.0
- MIT License

at your option.