| Crates.io | store-stream |
| lib.rs | store-stream |
| version | 0.1.0 |
| created_at | 2024-12-24 12:15:33.515766+00 |
| updated_at | 2024-12-24 12:15:33.515766+00 |
| description | Parted storage for large files to S3 |
| homepage | |
| repository | https://github.com/wavey-ai/store-stream |
| max_upload_size | |
| id | 1493839 |
| size | 66,456 |
A Rust library that provides a high-level interface to S3-compatible object storage services, including support for multipart uploads, byte-range fetching, and efficient handling of large objects.
Multi-GB files can be streamed to storage: parts are persisted at intervals of the minimum part size, so an object can be addressed via byte-range queries before its upload has completed.
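The part-buffering behaviour described above can be sketched as follows. `split_min_parts` is an illustrative helper written for this sketch, not part of the crate's API; it shows how a stream of arbitrarily sized chunks can be coalesced into parts of at least the minimum part size (S3 multipart uploads require every part except the last to be at least 5 MiB).

```rust
// Coalesce incoming chunks into parts of at least `min_part_size` bytes.
// Only the final part may be smaller than the minimum.
// Illustrative helper, not part of store-stream's API.
fn split_min_parts(
    chunks: impl IntoIterator<Item = Vec<u8>>,
    min_part_size: usize,
) -> Vec<Vec<u8>> {
    let mut parts = Vec::new();
    let mut buf: Vec<u8> = Vec::new();
    for chunk in chunks {
        buf.extend_from_slice(&chunk);
        // Flush a part as soon as the buffer reaches the minimum size.
        while buf.len() >= min_part_size {
            let rest = buf.split_off(min_part_size);
            parts.push(std::mem::replace(&mut buf, rest));
        }
    }
    // The trailing remainder becomes the (possibly short) last part.
    if !buf.is_empty() {
        parts.push(buf);
    }
    parts
}
```

For example, three 3-byte chunks with a 5-byte minimum yield a 5-byte part followed by a 4-byte final part.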
```rust
let storage = Storage::new(
    "https://your-endpoint.com".to_string(),
    "your-key-id".to_string(),
    "your-secret-key".to_string(),
    5 * 1024 * 1024, // 5 MB minimum part size
);
```
The library supports streaming uploads using Tokio channels:
```rust
use tokio::sync::mpsc;
use bytes::Bytes;

let (tx, rx) = mpsc::channel(16);

// Spawn a task to send data
tokio::spawn(async move {
    let data = Bytes::from("Hello, S3!");
    tx.send(data).await.unwrap();
});

// Upload the data
storage.upload("bucket-name", "object-key", rx).await?;
```
Fetch specific byte ranges from objects:
```rust
let bytes = storage
    .get_byte_range("bucket-name", "object-key", 0, Some(100))
    .await?;
```
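For reference, a request like the one above corresponds to an HTTP `Range` header with inclusive byte offsets. The sketch below assumes an `(offset, Some(length))` convention like the call above; `range_header` is an illustrative helper, not part of the crate's API.

```rust
// Build an HTTP Range header value (RFC 9110 inclusive byte-range syntax)
// from a start offset and an optional length.
// Illustrative helper, not part of store-stream's API; assumes len > 0.
fn range_header(offset: u64, length: Option<u64>) -> String {
    match length {
        // A bounded range covers offset..=offset + len - 1.
        Some(len) => format!("bytes={}-{}", offset, offset + len - 1),
        // An open-ended range reads from offset to the end of the object.
        None => format!("bytes={}-", offset),
    }
}
```

So fetching offset 0 with `Some(100)` requests `bytes=0-99`, i.e. the first 100 bytes of the object.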
```rust
// Create a bucket
storage.create_bucket("new-bucket").await?;

// Check whether a bucket exists
let exists = storage.bucket_exists("bucket-name").await?;

// List bucket contents
let result = storage.list_bucket("bucket-name").await?;
for object in result.objects {
    println!("Key: {}, Size: {}", object.key, object.size);
}
```
MIT