| Crates.io | rig-bailian |
| lib.rs | rig-bailian |
| version | 0.1.5 |
| created_at | 2025-11-06 09:50:28.344117+00 |
| updated_at | 2026-01-14 04:04:35.053439+00 |
| description | Rig adapter for BaiLian: integrates the BaiLian AI service with the Rig ecosystem (request/response types, streaming, error handling). |
| homepage | https://github.com/ooiai/rig-extend |
| repository | |
| max_upload_size | |
| id | 1919373 |
| size | 90,388 |
Rig adapter for Alibaba BaiLian (DashScope). This crate integrates BaiLian’s OpenAI‑compatible APIs into the Rig ecosystem with a consistent, strongly‑typed interface for:
- chat completions (agents)
- embeddings
- reranking
Use this adapter to swap BaiLian in and out with other Rig-supported providers with minimal code changes (see the provider-agnostic sketch after the chat example below).
Documentation: https://docs.rs/rig-bailian
Main entry points:
- Client::from_env() and Client::builder(...)
- .agent(model), .embeddings(model), and rerank model helpers
- derive(Embed) for embedding your own document types

Key constants:
- QWEN3_MAX: a convenience model id for chat completions
- TEXT_EMBEDDING_V4: a convenience model id for embeddings
- GTE_RERANK_V2: a convenience rerank model id
- BAILIAN_API_BASE_URL: the default base URL (https://dashscope.aliyuncs.com/compatible-mode/v1)

From crates.io (recommended):
[dependencies]
rig-bailian = "0.1"
rig-core = "0.28.0" # Rig core
rig-derive = "0.1.10" # Optional: for derive macros like Embed
From a workspace/path (if you’re developing locally):
[dependencies]
rig-bailian = { path = "../rig-bailian" }
rig-core = "0.28.0"
rig-derive = "0.1.10"
Environment variables:
- BAILIAN_API_KEY (required): your DashScope API key.
- BAILIAN_BASE_URL (optional): overrides the API base URL. Defaults to https://dashscope.aliyuncs.com/compatible-mode/v1.

Example:
export BAILIAN_API_KEY="sk-xxxxxxxx"
# export BAILIAN_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
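If you would rather not rely on environment variables, the crate also exposes Client::builder(...) for explicit configuration. A minimal sketch, assuming the builder mirrors other Rig provider client builders with an API key argument plus base_url and build methods (those method names are an assumption, not confirmed; see docs.rs/rig-bailian for the exact surface):

use rig_bailian::{Client, BAILIAN_API_BASE_URL};

fn main() {
    // Sketch only: .base_url(..) and .build() are assumed method names,
    // mirroring other Rig provider client builders.
    let _client = Client::builder("sk-xxxxxxxx")
        .base_url(BAILIAN_API_BASE_URL)
        .build();
}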
Below are minimal snippets for chat, embeddings, and reranking.
use rig::completion::Prompt;
use rig::prelude::*;
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Configure via env: BAILIAN_API_KEY, optional BAILIAN_BASE_URL
    let client = rig_bailian::Client::from_env();

    // Build an agent with static context, then prompt it
    let agent = client
        .agent(rig_bailian::QWEN3_MAX)
        .context("You are a concise, helpful assistant.")
        .build();

    let response = agent.prompt("Say hello in one sentence.").await?;
    println!("BaiLian agent: {response}");
    Ok(())
}
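Because the agent built above implements Rig's Prompt trait, the rest of your application can stay provider-agnostic: write helpers against the trait and decide on BaiLian (or any other Rig provider) in one place. A minimal sketch that depends only on rig-core; the ask helper is illustrative, not part of this crate:

use rig::completion::Prompt;

/// Works with any Rig agent, regardless of provider, because it only
/// depends on the `Prompt` trait.
async fn ask(agent: &impl Prompt, question: &str) -> anyhow::Result<String> {
    Ok(agent.prompt(question).await?)
}

With the agent from the previous example, this is just ask(&agent, "Say hello in one sentence.").await?.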
use rig::Embed;
use rig::prelude::*;
use rig_derive::Embed;
#[derive(Embed, Debug)]
struct Doc {
    #[embed]
    text: String,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = rig_bailian::Client::from_env();

    // Embed two documents with the TEXT_EMBEDDING_V4 model
    let embeddings = client
        .embeddings(rig_bailian::TEXT_EMBEDDING_V4)
        .document(Doc { text: "Hello, world!".into() })?
        .document(Doc { text: "Goodbye, world!".into() })?
        .build()
        .await?;

    println!("{embeddings:?}");
    Ok(())
}
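The builder returns each document paired with its embedding vector(s). A common next step is similarity search; the sketch below is plain cosine similarity over two already-extracted vectors (how you pull a Vec<f64> out of rig-core's Embedding type is left to you, and for more than a handful of documents a Rig vector store integration is the better fit):

/// Cosine similarity between two embedding vectors of equal length.
/// Returns a value in [-1.0, 1.0]; higher means more similar.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let norm_b: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}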
use rig::prelude::*;
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = rig_bailian::Client::from_env();

    // Candidate documents to be reordered by relevance
    let docs = vec![
        "Transformers are attention-based architectures.".to_string(),
        "Reranking orders documents by relevance.".to_string(),
    ];

    // Create a rerank model handle for GTE_RERANK_V2
    let model = client.rerank_model(rig_bailian::GTE_RERANK_V2, None);

    // Score the candidate documents against the query
    let results = model
        .rerank("what is a transformer?", &docs, Some(2), true)
        .await?;

    for r in results {
        println!("#{}/{} => {}", r.index, r.relevance_score, r.text);
    }
    Ok(())
}
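Rerank scores are most useful as a cut-off before assembling a RAG prompt. A small self-contained sketch (map the results above into (relevance_score, text) pairs first; the threshold you pick depends on your corpus):

/// Keep only (score, text) pairs at or above `threshold`, highest score first.
fn filter_by_score(mut scored: Vec<(f64, String)>, threshold: f64) -> Vec<(f64, String)> {
    scored.retain(|(score, _)| *score >= threshold);
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap_or(std::cmp::Ordering::Equal));
    scored
}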
More end‑to‑end samples are available in the examples directory of this crate:
- agent_wirh_bailian.rs
- bailian_embeddings.rs
- bailian_rereank.rs

Run from this crate directory:
# Make sure BAILIAN_API_KEY is set
cargo run --example agent_wirh_bailian
cargo run --example bailian_embeddings
cargo run --example bailian_rereank
Keep your rig-core crate version aligned with the one this adapter targets for compatibility (the examples use rig-core = "0.28.0").

License: MIT. See the LICENSE file (or package metadata) for details.