# ollama-kernel

| field | value |
|---|---|
| version | 0.4.0 |
| created_at | 2024-11-14 21:48:53.716952 |
| updated_at | 2024-12-02 15:30:19.895945 |
| description | Ollama Jupyter Kernel |
| repository | https://github.com/runtimed/runtimed |
| id | 1448384 |
| size | 85,916 |
An experimental Jupyter kernel for Ollama that lets you talk to local models like Llama 3, Qwen, Mistral, and more from Jupyter Notebooks and other Jupyter clients.
```shell
cargo install ollama-kernel
ollama-kernel --install
```
Once installed, Jupyter should automatically detect the kernel.
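Under the hood, a kernel like this presumably forwards each cell's text to Ollama's local HTTP API (by default, `POST /api/generate` on `localhost:11434`). The sketch below shows what such a request might look like; the function name and structure are illustrative assumptions, not the crate's actual implementation:

```python
import json

# Ollama's default generate endpoint (assumes a local `ollama serve`).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build a JSON body for a non-streaming /api/generate call.

    Hypothetical helper for illustration; the ollama-kernel crate may
    structure its requests differently.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

# Sending it would look roughly like this (needs a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=build_generate_request("llama3", "Hello!"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

With `"stream": False`, Ollama returns the full completion in a single JSON object rather than a stream of chunks, which keeps the example simple.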
This was made out of love for Ollama, a local model server for LLMs.