| Crates.io | vibelang |
| lib.rs | vibelang |
| version | 0.1.1 |
| created_at | 2025-07-25 16:16:18.46193+00 |
| updated_at | 2025-07-28 12:21:29.537797+00 |
| description | Programmatically instantiate Web Agents from Vibelang files |
| homepage | |
| repository | https://github.com/Mec-iS/vibelang-rs |
| max_upload_size | |
| id | 1767802 |
| size | 155,179 |
Programmatically create any agent you need from an annotated payload, using VibeLang.
VibeLang is a format for describing LLM interactions to generic clients, based on *Meaning Typed Prompting*, as presented in this paper.
For now it only works with Ollama, but simple clients for any OpenAI-style API could be implemented.
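Such a client boils down to a single HTTP call. As a purely illustrative sketch (not shipped by vibelang; the endpoint, model name, `OPENAI_API_KEY` variable and the `reqwest`/`serde_json` dependencies are all assumptions), an OpenAI-style chat-completions request looks roughly like this:

```rust
// Hypothetical sketch of an OpenAI-style chat-completions call.
// Assumes reqwest (features: "blocking", "json") and serde_json.
use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?; // assumed env var
    let body = json!({
        "model": "gpt-4o-mini",
        "messages": [{ "role": "user", "content": "Say hello in one word." }]
    });
    let resp: Value = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .json()?;
    // The reply text sits at choices[0].message.content.
    println!("{}", resp["choices"][0]["message"]["content"].as_str().unwrap_or(""));
    Ok(())
}
```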
Quickstart:
1. Write your `.vibe` file (see `examples/` for reference)
2. Run `cargo run -- your_file.vibe`
3. Edit `generated/main.rs` to make the desired calls to the LLM using the pregenerated code
4. `cd generated && cargo run`. Enjoy!

Build the project:
$ cargo build
Run an example:
$ cargo run -- examples/knowledge_retrieval.vibe --output-dir ./generated
Run tests:
$ cargo test
# OR
$ cargo test --test test_unit_extra
For now this only supports Ollama.
$ curl -fsSL https://ollama.com/install.sh | sh
Check localhost:11434 in your browser.
$ ollama pull <MODEL>
$ ollama serve
$ ollama run <MODEL>
Models available: see the Ollama models library.
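If you prefer a programmatic check over the browser, here is a tiny sketch (assuming the `reqwest` crate with its "blocking" feature; not part of vibelang) that pings the server:

```rust
// Confirm the local Ollama server is reachable.
fn main() -> Result<(), reqwest::Error> {
    let body = reqwest::blocking::get("http://localhost:11434")?.text()?;
    // A running server answers with "Ollama is running".
    println!("{body}");
    Ok(())
}
```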
To change the model, set:
export OLLAMA_MODEL=llama3.1
or any other model you have downloaded.
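For orientation, this is roughly the request that ends up hitting Ollama's `/api/generate` endpoint, with the model taken from `OLLAMA_MODEL`. A minimal sketch only, assuming `reqwest` (features "blocking", "json") and `serde_json`; it is not vibelang's actual client code:

```rust
use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fall back to llama3.1 when OLLAMA_MODEL is not set (assumed default).
    let model = std::env::var("OLLAMA_MODEL").unwrap_or_else(|_| "llama3.1".into());
    let body = json!({
        "model": model,
        "prompt": "What is the capital of France? Answer with one word.",
        "stream": false
    });
    let resp: Value = reqwest::blocking::Client::new()
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()?
        .json()?;
    // Ollama returns the completion in the "response" field.
    println!("{}", resp["response"].as_str().unwrap_or(""));
    Ok(())
}
```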