| Field | Value |
|-------|-------|
| Crates.io | boss |
| lib.rs | boss |
| version | 0.0.3 |
| source | src |
| created_at | 2019-11-30 14:56:24.457164 |
| updated_at | 2019-12-02 17:06:49.225252 |
| description | Baseball Open Source Software |
| homepage | https://docs.rs/boss |
| repository | |
| max_upload_size | |
| id | 185541 |
| size | 1,669,322 |
A pure Rust baseball data aggregation and analytics library. Supports data aggregation from a number of sources, including the MLB Stats API and MLB GameDay files. Eventually, other sources such as Retrosheet and NCAA will be added.
BOSS is designed from the ground up to be as efficient as possible. All text fields that can be converted to an enum have been carefully mapped. The challenge with baseball data isn't the computational complexity of gathering it; it is the sheer size of the data set.
TODO
```toml
[dependencies]
boss = "0.0.3"
```
- baseballr by Bill Petti
- pitchRx by Carson Sievert
Building a baseball data engine in Rust will enable everyday fans to perform data-intensive workloads, as well as efficient data gathering. Ambitiously, the aim is a baseball data platform that will rival what MLB clubs have internally, from an analytics perspective. MLB clubs will have access to more, and likely better, data; however, by leveraging Rust we should be able to build the most performant baseball data engine in the world.
This project is also a learning project for the author and will likely change significantly as he hones his Rust skills.
BOSS relies on three crates for the bulk of its workload.
Rayon is used to add parallelism. At some point, I'm hoping this evolves into async parallel generators (or something like that), where Rayon is aware of all the yield points in any of its iterations so it can bounce around as needed.
Parallel out of the box. Player bios are memoized (cached) after their first download, drastically reducing the number of network calls.
Captures historical player weight info (XML version only; might be added to the JSON version later).
Flattens out all the data and serializes it to an easy-to-use CSV file that can be imported into Tableau or other tools.
Would love to build a Rust wrapper for the Tableau Hyper API, but I don't know how to do that... yet.
The data pieces that take the most memory are the play descriptions; however, these are very repetitive and should be highly compressible. If we ever build a non-flat materialization, we'll probably want to compress the descriptions.