llwm

Crates.io: llwm
lib.rs: llwm
version: 0.0.0
source: src
created_at: 2024-12-01 03:59:29.519159
updated_at: 2024-12-01 03:59:29.519159
description: Run LLM inference with WebAssembly.
id: 1467210
size: 16,286
owner: Kante Yin (kerthcet)

README

llama.wasm

Run LLM inference with WebAssembly. This is an experiment.
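For anyone who wants to try the crate despite its experimental status, it can be pulled in as a normal Cargo dependency. A minimal sketch, assuming you depend on the published release (0.0.0 is the only version on crates.io; the crate's API itself is not yet documented, so no usage code is shown):

```toml
# Cargo.toml -- add the experimental llwm crate as a dependency.
# "0.0.0" matches the version listed above; adjust if newer releases appear.
[dependencies]
llwm = "0.0.0"
```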

Commit count: 0
