gpt-model

Crates.io: gpt-model
lib.rs: gpt-model
version: 0.1.0
source: src
created_at: 2024-02-24 01:23:06.146746
updated_at: 2024-02-24 01:23:06.146746
description: Pure-Rust inference wrapper for GPT-2 large language models.
repository: https://gitlab.com/caer/gpt
size: 76,498
owner: Caer (caer)

README

A 100% pure-Rust inference wrapper for the GPT-2 (and possibly later) model family.

Getting a GPT Model

The GPT-2 model packaged within the crate's repository is the original model trained by OpenAI, with minor modifications to support TensorFlow 2.0 and conversion to the ONNX model format.

When getting started with this crate, we recommend using our prebuilt version of the 124M-parameter (smallest) GPT-2 model; the model, encoder, and byte-pair-encoding vocabulary for this model may all be downloaded from here.

Repository Structure

  • src/: Main crate contents, including a pure-Rust implementation of the GPT-2 byte-pair encoder (tokenizer) and a Rust wrapper for loading and invoking an ONNX GPT-2 model.
  • gpt-2-model/: Python scripts and Dockerfiles for downloading and exporting TensorFlow and ONNX versions of the GPT-2 model.
  • gpt-2-model/saved_models/: Exported GPT-2 models. The latest prebuilt version of the 124M (smallest) GPT-2 model is shipped with this repository via Git LFS.
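To illustrate what the byte-pair encoder in src/ does, here is a minimal sketch of the core BPE merge loop: adjacent symbol pairs are repeatedly merged in order of their learned rank until no ranked pair remains. The rank table below is hypothetical (a real GPT-2 vocabulary ships roughly 50,000 ranked merges), and this is not this crate's actual API.

```rust
use std::collections::HashMap;

/// Apply byte-pair-encoding merges to a word: repeatedly merge the
/// adjacent symbol pair with the lowest (highest-priority) rank.
fn bpe(word: &str, ranks: &HashMap<(String, String), usize>) -> Vec<String> {
    let mut symbols: Vec<String> = word.chars().map(|c| c.to_string()).collect();
    loop {
        // Find the adjacent pair with the lowest merge rank, if any.
        let best = symbols
            .windows(2)
            .enumerate()
            .filter_map(|(i, w)| ranks.get(&(w[0].clone(), w[1].clone())).map(|&r| (r, i)))
            .min();
        match best {
            Some((_, i)) => {
                // Replace the pair at position i with its merged symbol.
                let merged = format!("{}{}", symbols[i], symbols[i + 1]);
                symbols.splice(i..i + 2, [merged]);
            }
            None => return symbols,
        }
    }
}

fn main() {
    // Hypothetical merge ranks for demonstration only.
    let mut ranks = HashMap::new();
    ranks.insert(("l".to_string(), "o".to_string()), 0);
    ranks.insert(("lo".to_string(), "w".to_string()), 1);
    println!("{:?}", bpe("lower", &ranks)); // ["low", "e", "r"]
}
```

The real encoder also maps raw bytes to a printable-character alphabet before merging, so that arbitrary UTF-8 input can be tokenized; this sketch skips that step.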

License and Contributions

Except where otherwise noted, this project is Copyright (C) 2022-24 Brandon Sanders [me@caer.cc], and licensed under the AGPL-3.0-only.

The files within the gpt-2-model directory are Copyright (C) 2019 OpenAI and (C) 2022-24 Brandon Sanders, and licensed under an MIT-style license.

Contributions are always welcome!
