# Ollama Kernel

Experimental Jupyter kernel for [Ollama](https://ollama.ai/), allowing you to talk to local models like Llama 3, Qwen, Mistral, and more from Jupyter Notebooks and other Jupyter clients.

## Installation

```bash
cargo install ollama-kernel
ollama-kernel --install
```

## Usage

Once installed, Jupyter should automatically detect the kernel.

This project was made out of love for [Ollama](https://ollama.ai/), a local model server for LLMs.
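Under the hood, `ollama-kernel --install` registers a standard Jupyter kernelspec so that clients can discover the kernel. The sketch below shows what such a `kernel.json` typically looks like (the exact contents and flags written by `ollama-kernel` are assumptions here, not taken from this project; `{connection_file}` is the standard Jupyter placeholder):

```bash
# Hypothetical kernelspec of the kind `ollama-kernel --install` would register.
# The real file would live under Jupyter's kernels directory, e.g.
# ~/.local/share/jupyter/kernels/ollama/kernel.json.
mkdir -p /tmp/ollama-kernelspec-demo
cat > /tmp/ollama-kernelspec-demo/kernel.json <<'EOF'
{
  "argv": ["ollama-kernel", "--connection-file", "{connection_file}"],
  "display_name": "Ollama",
  "language": "text"
}
EOF

# Check that the spec is well-formed JSON
python3 -m json.tool /tmp/ollama-kernelspec-demo/kernel.json
```

After installation you can confirm the kernel is visible with `jupyter kernelspec list`, which prints every registered kernelspec and its directory.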