| Crates.io | ceylon-next |
| lib.rs | ceylon-next |
| version | 0.1.1 |
| created_at | 2025-11-19 07:21:36.1226+00 |
| updated_at | 2025-11-19 07:47:38.414227+00 |
| description | A powerful AI agent framework with goal-oriented capabilities, memory management, and tool integration |
| homepage | |
| repository | https://github.com/ceylonai/next |
| max_upload_size | |
| id | 1939568 |
| size | 675,368 |
A powerful and flexible Rust framework for building AI agents with goal-oriented capabilities, memory management, and tool integration.
Add Ceylon to your `Cargo.toml`:

```toml
[dependencies]
ceylon-next = "0.1.1"
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```
```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    // Create a new agent
    let mut agent = Agent::new("MyAssistant", "openai::gpt-4");

    // Create a task
    let task = TaskRequest::new("What is the capital of France?");

    // Run the agent
    let response = agent.run(task).await;
    println!("Response: {:?}", response.result());
}
```
Set your API key as an environment variable:

```bash
export OPENAI_API_KEY="your-api-key-here"
```
Extend your agent's capabilities with custom tools:

```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;
use ceylon_next::tools::ToolTrait;
use serde_json::json;

// Define a custom tool
struct CalculatorTool;

impl ToolTrait for CalculatorTool {
    fn name(&self) -> String {
        "calculator".to_string()
    }

    fn description(&self) -> String {
        "Performs basic arithmetic operations".to_string()
    }

    fn input_schema(&self) -> serde_json::Value {
        json!({
            "type": "object",
            "properties": {
                "operation": {"type": "string", "enum": ["add", "subtract", "multiply", "divide"]},
                "a": {"type": "number"},
                "b": {"type": "number"}
            },
            "required": ["operation", "a", "b"]
        })
    }

    fn execute(&self, input: serde_json::Value) -> serde_json::Value {
        let op = input["operation"].as_str().unwrap();
        let a = input["a"].as_f64().unwrap();
        let b = input["b"].as_f64().unwrap();
        let result = match op {
            "add" => a + b,
            "subtract" => a - b,
            "multiply" => a * b,
            "divide" => a / b,
            _ => 0.0,
        };
        json!({"result": result})
    }
}

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("Calculator Agent", "openai::gpt-4");
    agent.add_tool(CalculatorTool);

    let task = TaskRequest::new("What is 15 multiplied by 7?");
    let response = agent.run(task).await;
    println!("{:?}", response.result());
}
```
Agents automatically maintain conversation history:

```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("MemoryAgent", "openai::gpt-4");

    // First conversation
    let task1 = TaskRequest::new("My name is Alice");
    agent.run(task1).await;

    // Second conversation - the agent remembers context
    let task2 = TaskRequest::new("What is my name?");
    let response = agent.run(task2).await;
    // The agent should respond with "Alice"

    // Search memory
    let memories = agent.search_memory("Alice").await;
    println!("Found {} relevant conversations", memories.len());
}
```
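The backend behind `search_memory` depends on which memory features are enabled (in-memory, SQLite, or vector, per the feature flags below); conceptually it retrieves stored conversation entries relevant to a query. As a toy, std-only mental model (not the crate's actual implementation), a substring-based retrieval might look like:

```rust
// Toy illustration of conversation-memory search: store entries and
// filter by case-insensitive substring match. Real backends may use
// vector embeddings instead.
fn search_memory(entries: &[String], query: &str) -> Vec<String> {
    let q = query.to_lowercase();
    entries
        .iter()
        .filter(|e| e.to_lowercase().contains(&q))
        .cloned()
        .collect()
}
```

Vector-backed memory replaces the substring test with an embedding similarity score, which is why several `vector-*` features exist.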
Ceylon supports 13+ LLM providers out of the box:

| Provider | Example Model String | API Key Env Var |
|---|---|---|
| OpenAI | `openai::gpt-4` | `OPENAI_API_KEY` |
| Anthropic | `anthropic::claude-3-5-sonnet-20241022` | `ANTHROPIC_API_KEY` |
| Ollama | `ollama::llama3.2` | (local) |
| DeepSeek | `deepseek::deepseek-coder` | `DEEPSEEK_API_KEY` |
| X.AI (Grok) | `xai::grok-beta` | `XAI_API_KEY` |
| Google Gemini | `google::gemini-pro` | `GOOGLE_API_KEY` |
| Groq | `groq::mixtral-8x7b-32768` | `GROQ_API_KEY` |
| Azure OpenAI | `azure::gpt-4` | `AZURE_OPENAI_API_KEY` |
| Cohere | `cohere::command` | `COHERE_API_KEY` |
| Mistral | `mistral::mistral-large-latest` | `MISTRAL_API_KEY` |
| Phind | `phind::Phind-CodeLlama-34B-v2` | `PHIND_API_KEY` |
| OpenRouter | `openrouter::anthropic/claude-3-opus` | `OPENROUTER_API_KEY` |
| ElevenLabs | `elevenlabs::eleven_monolingual_v1` | `ELEVENLABS_API_KEY` |
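Switching providers is a matter of changing the model string passed to `Agent::new` and exporting the matching environment variable (variable names as in the table above; the key values below are placeholders):

```shell
# Hosted providers need their API key exported before the agent runs:
export ANTHROPIC_API_KEY="your-anthropic-key"   # for anthropic::... models
export GROQ_API_KEY="your-groq-key"             # for groq::... models

# Ollama models talk to a local server, so no API key is required.
```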
Ceylon uses Cargo features to enable optional functionality:

```toml
[dependencies]
# Default: std features, vector memory, and CLI runner
ceylon-next = "0.1.1"

# Minimal installation (no tokio, no LLM, suitable for WASM)
ceylon-next = { version = "0.1.1", default-features = false }

# With specific vector providers
ceylon-next = { version = "0.1.1", features = ["vector-openai"] }
ceylon-next = { version = "0.1.1", features = ["vector-huggingface-local"] }

# All vector providers
ceylon-next = { version = "0.1.1", features = ["full-vector"] }
```
- `std` (default): Standard features including tokio, LLM support, SQLite memory, and MessagePack serialization
- `vector`: Base vector memory functionality
- `vector-openai`: OpenAI embeddings for vector memory
- `vector-ollama`: Ollama embeddings for vector memory
- `vector-huggingface`: HuggingFace API embeddings
- `vector-huggingface-local`: Local HuggingFace embeddings using Candle
- `full-vector`: All vector providers
- `runner`: Interactive CLI runner
- `wasm`: WebAssembly support

Create agents that can break down complex tasks:
```rust
use ceylon_next::agent::Agent;
use ceylon_next::goal::Goal;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("ProjectManager", "openai::gpt-4");

    // Create a goal with success criteria
    let mut goal = Goal::new(
        "Launch Product",
        "Successfully launch the new product to market",
    );
    goal.add_criterion("Product is tested and bug-free");
    goal.add_criterion("Marketing materials are ready");
    goal.add_criterion("Launch event is scheduled");

    // Add sub-goals
    goal.add_sub_goal(Goal::new("Development", "Complete development"));
    goal.add_sub_goal(Goal::new("Marketing", "Create marketing campaign"));
    goal.add_sub_goal(Goal::new("Launch", "Execute launch"));

    // Track progress
    println!("Progress: {}%", goal.get_progress());
}
```
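The progress figure is presumably derived from how many criteria and sub-goals are complete; the crate's actual weighting is not documented here. A toy, std-only sketch of one plausible computation (an illustration, not ceylon-next's real `Goal` logic):

```rust
// Hypothetical model of goal progress: the percentage of criteria
// marked complete. ceylon-next's real Goal may also weigh sub-goals.
struct ToyGoal {
    criteria: Vec<(String, bool)>, // (description, completed)
}

impl ToyGoal {
    fn progress(&self) -> f64 {
        if self.criteria.is_empty() {
            return 0.0;
        }
        let done = self.criteria.iter().filter(|(_, c)| *c).count();
        done as f64 / self.criteria.len() as f64 * 100.0
    }
}
```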
The repository includes numerous examples. Run them from a checkout:

```bash
# Clone the repository
git clone https://github.com/ceylonai/next.git
cd next

# Run an example
cargo run --example 01_basic_agent --manifest-path ceylon/Cargo.toml
```
Ceylon is organized into several core modules:
- `agent`: Core agent implementation and lifecycle management
- `tools`: Tool system and built-in tools
- `memory`: Memory backends (in-memory, SQLite, vector)
- `llm`: LLM provider integrations and abstractions
- `goal`: Goal-oriented task management
- `runner`: Interactive CLI runner
- `tasks`: Task definitions and execution

We welcome contributions! Please see our GitHub repository for more information.
Licensed under either of:
at your option.
Ceylon is built on top of the excellent `llm` crate for LLM provider integrations.