| Field | Value |
|---|---|
| Crates.io | ollama-kernel |
| lib.rs | ollama-kernel |
| version | 0.6.0 |
| created_at | 2024-11-14 21:48:53.716952+00 |
| updated_at | 2025-09-19 21:34:26.513337+00 |
| description | Ollama Jupyter Kernel |
| homepage | |
| repository | https://github.com/runtimed/runtimed |
| max_upload_size | |
| id | 1448384 |
| size | 87,312 |
An experimental Jupyter kernel for Ollama that lets you talk to local models like Llama 3, Qwen, and Mistral from Jupyter Notebooks and other Jupyter clients.
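Because the kernel talks to a local Ollama server, Ollama itself needs to be installed and running with at least one model available. A minimal setup sketch using the standard Ollama CLI (the model name `llama3` is just an example; pull whichever model you prefer):

```sh
# Start the Ollama server if it is not already running
# (the desktop app or a system service may already provide it)
ollama serve &

# Pull a model for the kernel to chat with (example model name)
ollama pull llama3
```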
Install the kernel with Cargo, then register it with Jupyter:

cargo install ollama-kernel
ollama-kernel --install
Once installed, Jupyter should automatically detect the kernel.
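To confirm the registration or run the kernel outside a notebook, the standard Jupyter tooling works; the kernel name below is an assumption, so use whatever name `jupyter kernelspec list` actually reports:

```sh
# List registered kernelspecs; the Ollama kernel should appear here
jupyter kernelspec list

# Start an interactive console against it
# ("ollama" is an assumed kernelspec name; substitute the name from the listing)
jupyter console --kernel ollama
```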
This was made out of love for Ollama, a local model server for LLMs.