Crates.io | page-hunter |
lib.rs | page-hunter |
version | 0.3.0 |
source | src |
created_at | 2024-05-16 02:34:29.714186 |
updated_at | 2024-10-13 19:57:39.231723 |
description | The pagination powerhouse, built with Rust |
homepage | https://github.com/jmtamayo/page-hunter |
repository | https://github.com/jmtamayo/page-hunter |
max_upload_size | |
id | 1241712 |
size | 126,481 |
The Page Hunter library is a Rust-based pagination tool that provides a way to manage and navigate through pages of data. It offers a set of resources that encapsulate all the necessary pagination information, such as the current page, total pages, previous page, next page, and the items on the current page.
The library also includes validation methods to ensure the integrity of the pagination data. It is designed to be flexible and easy to integrate into any Rust project that requires pagination functionality and standard data validation.
To use page-hunter from the GitHub repository with a specific version, set the dependency in your Cargo.toml file as follows:
[dependencies]
page-hunter = { git = "https://github.com/JMTamayo/page-hunter.git", version = "0.3.0", features = ["serde"] }
You can depend on it via cargo by adding the following dependency to your Cargo.toml file:
[dependencies]
page-hunter = { version = "0.3.0", features = ["utoipa", "pg-sqlx"] }
The following optional features are available:
serde: Adds Serialize and Deserialize support for Page and Book, based on serde. This feature is useful for implementing pagination models as a request or response body in REST APIs, among other implementations.
utoipa: Adds ToSchema support for Page and Book, based on utoipa. This feature is useful for generating OpenAPI schemas for pagination models. It depends on the serde feature, so you only need to enable utoipa to get both.
pg-sqlx: Adds support for pagination with SQLx for PostgreSQL databases.
mysql-sqlx: Adds support for pagination with SQLx for MySQL databases.
sqlite-sqlx: Adds support for pagination with SQLx for SQLite databases.
The page-hunter library provides two main models to manage pagination:
Page: Represents a page of records with the current page, total pages, previous page, next page, and the items on the current page.
Book: Represents a book of pages with a collection of Page instances.
The library also provides a set of functions to paginate records into a Page model and bind records into a Book model. The following examples show how to use the page-hunter library:
If you need to paginate records and get a specific Page:
use page_hunter::*;
let records: Vec<u32> = vec![1, 2, 3, 4, 5];
let page: usize = 0;
let size: usize = 2;
let pagination_result: PaginationResult<Page<u32>> =
paginate_records(&records, page, size);
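paginate_records returns a PaginationResult, so the caller decides how to handle a possible validation error. Below is a minimal sketch of one way to resolve it, assuming Page and PaginationError implement Debug (verify against the crate docs):
use page_hunter::*;
let records: Vec<u32> = vec![1, 2, 3, 4, 5];
let pagination_result: PaginationResult<Page<u32>> =
    paginate_records(&records, 0, 2);
// Resolve the result explicitly instead of unwrapping it.
match pagination_result {
    Ok(page) => println!("Current page: {:?}", page),
    Err(error) => eprintln!("Pagination failed: {:?}", error),
}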
To create a new instance of a Page from known parameters:
use page_hunter::*;
let items: Vec<u32> = vec![1, 2];
let page: usize = 0;
let size: usize = 2;
let total_elements: usize = 5;
let page_model_result: PaginationResult<Page<u32>> = Page::new(
&items,
page,
size,
total_elements,
);
With the serde feature enabled, you can serialize and deserialize a Page as follows:
use page_hunter::*;
let items: Vec<u32> = vec![1, 2];
let page: usize = 0;
let size: usize = 2;
let total_elements: usize = 5;
let page_model: Page<u32> = Page::new(
&items,
page,
size,
total_elements,
).unwrap_or_else(|error| {
panic!("Error creating page model: {:?}", error);
});
let serialized_page: String = serde_json::to_string(&page_model).unwrap_or_else(|error| {
panic!("Error serializing page model: {:?}", error);
});
let deserialized_page: Page<u32> = serde_json::from_str(&serialized_page).unwrap_or_else(|error| {
panic!("Error deserializing page model: {:?}", error);
});
When you create a new Page instance from the constructor or deserialization, the fields on the page are validated for consistency: the number of pages, the current page index, and the length of the items collection must agree with the declared size and the total number of elements, and the previous and next page references must be None when there is no previous or next page, respectively. If any of these rules are violated, a PaginationError will be returned.
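As an illustration of these checks, the following hedged sketch builds a Page with inconsistent parameters: the declared size is 2, but three items are passed for the first page of a five-element collection, so the constructor is expected to return an error:
use page_hunter::*;
// Three items for a page of size 2 that is not the last page violates the
// validation rules described above, so an Err(PaginationError) is expected
// (assumption; the exact error variant depends on the crate's rules).
let invalid_page: PaginationResult<Page<u32>> = Page::new(&vec![1, 2, 3], 0, 2, 5);
assert!(invalid_page.is_err());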
If you need to bind records into a Book model:
use page_hunter::*;
let records: Vec<u32> = vec![1, 2, 3, 4, 5];
let size: usize = 2;
let book_result: PaginationResult<Book<u32>> =
bind_records(&records, size);
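As with paginate_records, bind_records returns a PaginationResult, so the error case must be resolved before the Book can be used. A short follow-up sketch in the same style as the other examples, assuming Book implements Debug:
use page_hunter::*;
let records: Vec<u32> = vec![1, 2, 3, 4, 5];
let book_result: PaginationResult<Book<u32>> =
    bind_records(&records, 2);
// Panic with a descriptive message on failure, as in the other examples.
let book: Book<u32> = book_result.unwrap_or_else(|error| {
    panic!("Error binding records: {:?}", error);
});
println!("{:?}", book);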
To create a new Book instance from known parameters:
use page_hunter::*;
let sheets: Vec<Page<u32>> = vec![
Page::new(&vec![1, 2], 0, 2, 5).unwrap(),
Page::new(&vec![3, 4], 1, 2, 5).unwrap(),
];
let book: Book<u32> = Book::new(&sheets);
With the serde feature enabled, you can serialize and deserialize a Book as follows:
use page_hunter::*;
let sheets: Vec<Page<u32>> = vec![
Page::new(&vec![1, 2], 0, 2, 5).unwrap(),
Page::new(&vec![3, 4], 1, 2, 5).unwrap(),
];
let book: Book<u32> = Book::new(&sheets);
let serialized_book: String = serde_json::to_string(&book).unwrap_or_else(|error| {
panic!("Error serializing book model: {:?}", error);
});
let deserialized_book: Book<u32> = serde_json::from_str(&serialized_book).unwrap_or_else(|error| {
panic!("Error deserializing book model: {:?}", error);
});
With the utoipa feature enabled, you can generate OpenAPI schemas for the Page and Book models as follows:
use page_hunter::*;
use utoipa::{OpenApi, ToSchema};
use serde::{Deserialize, Serialize};
#[derive(Clone, Serialize, Deserialize, ToSchema)]
pub struct Person {
id: u16,
name: String,
last_name: String,
still_alive: bool,
}
pub type PeoplePage = Page<Person>;
pub type PeopleBook = Book<Person>;
#[derive(OpenApi)]
#[openapi(
components(schemas(PeoplePage, PeopleBook))
)]
pub struct ApiDoc;
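A short usage sketch of the generated document, using utoipa's OpenApi trait; the pretty-printing call and its error handling are plain utoipa and serde_json usage, not part of page-hunter:
// Render the OpenAPI document produced by the derive above as pretty JSON.
let spec: String = ApiDoc::openapi().to_pretty_json().unwrap_or_else(|error| {
    panic!("Error serializing OpenAPI document: {:?}", error);
});
println!("{}", spec);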
Take a look at the examples folder, where you can find practical uses in REST API implementations with some web frameworks.
To paginate records from a PostgreSQL database:
use page_hunter::*;
use sqlx::postgres::{PgPool, Postgres};
use sqlx::{FromRow, QueryBuilder};
use uuid::Uuid;
#[tokio::main]
async fn main() {
#[derive(Clone, Debug, FromRow)]
pub struct Country {
id: Uuid,
name: String,
}
let pool: PgPool = PgPool::connect(
"postgres://username:password@localhost/db"
).await.unwrap_or_else(|error| {
panic!("Error connecting to database: {:?}", error);
});
let query: QueryBuilder<Postgres> = QueryBuilder::new(
"SELECT * FROM db.geo.countries"
);
let page: Page<Country> =
query.paginate(&pool, 0, 10).await.unwrap_or_else(|error| {
panic!("Error paginating records: {:?}", error);
});
}
To paginate records from a MySQL database:
use page_hunter::*;
use sqlx::mysql::{MySqlPool, MySql};
use sqlx::{FromRow, QueryBuilder};
use uuid::Uuid;
#[tokio::main]
async fn main() {
#[derive(Clone, Debug, FromRow)]
pub struct Country {
id: Uuid,
name: String,
}
let pool: MySqlPool = MySqlPool::connect(
"mysql://username:password@localhost/db"
).await.unwrap_or_else(|error| {
panic!("Error connecting to database: {:?}", error);
});
let query: QueryBuilder<MySql> = QueryBuilder::new(
"SELECT * FROM countries"
);
let page: Page<Country> =
query.paginate(&pool, 0, 10).await.unwrap_or_else(|error| {
panic!("Error paginating records: {:?}", error);
});
}
To paginate records from a SQLite database:
use page_hunter::*;
use sqlx::sqlite::{SqlitePool, Sqlite};
use sqlx::{FromRow, QueryBuilder};
use uuid::Uuid;
#[tokio::main]
async fn main() {
#[derive(Clone, Debug, FromRow)]
pub struct Country {
id: Uuid,
name: String,
}
let pool: SqlitePool = SqlitePool::connect(
"sqlite://countries.db"
).await.unwrap_or_else(|error| {
panic!("Error connecting to database: {:?}", error);
});
let query: QueryBuilder<Sqlite> = QueryBuilder::new(
"SELECT * FROM countries"
);
let page: Page<Country> =
query.paginate(&pool, 0, 10).await.unwrap_or_else(|error| {
panic!("Error paginating records: {:?}", error);
});
}
To test page-hunter, follow these recommendations:
Create a local.env file in the workspace folder to store the required environment variables. For example:
DB_HOST=localhost
DB_USER=test
DB_PASSWORD=docker
DB_NAME=test
PG_DB_PORT=5432
MYSQL_DB_PORT=3306
SQLITE_DB_PATH=$PWD/page-hunter/tests/migrations/sqlite/test.db
SQLITE_MIGRATIONS_PATH=page-hunter/tests/migrations/sqlite
MYSQL_MIGRATIONS_PATH=page-hunter/tests/migrations/mysql
PG_MIGRATIONS_PATH=page-hunter/tests/migrations/postgres
Install the required development tools, sqlx-cli and cargo-llvm-cov, with the following command:
make install-tools
Run the PostgreSQL and MySQL databases as Docker containers and create the SQLite database file using the following commands:
make pg-db-docker
make mysql-db-docker
make sqlite-db-local
Apply or revert the database migrations with the following commands:
make run-postgres-migrations
make revert-postgres-migrations
make run-mysql-migrations
make revert-mysql-migrations
make run-sqlite-migrations
make revert-sqlite-migrations
Finally, run the tests, or the tests with an LLVM coverage report:
make test
make test-llvm-cov
The Page Hunter project is open source and therefore any interested software developer can contribute to its improvement. To contribute, take a look at the following recommendations: