| Crates.io | letta |
| lib.rs | letta |
| version | 0.1.2 |
| created_at | 2025-07-02 01:14:05.340274+00 |
| updated_at | 2025-07-02 01:14:05.340274+00 |
| description | A robust Rust client for the Letta REST API |
| homepage | |
| repository | https://github.com/orual/letta-rs |
| max_upload_size | |
| id | 1734314 |
| size | 875,417 |
A Rust client library for the Letta REST API, providing idiomatic Rust bindings for building stateful AI agents with persistent memory and context.
Unlike the Letta-provided TypeScript and Python libraries, this client was not generated from the OpenAPI spec but implemented by hand (with substantial LLM assistance). As such, it exposes some things in slightly different, mildly opinionated ways and includes a number of Rust-oriented affordances.
Add this to your Cargo.toml:
[dependencies]
letta = "0.1.2"
The letta crate includes an optional CLI tool for interacting with Letta servers:
# Install from crates.io
cargo install letta --features cli
# Or build from source
git clone https://github.com/orual/letta-rs
cd letta-rs
cargo install --path . --features cli
After installation, the letta-client command will be available in your PATH.
Set your API key (for cloud deployments):
export LETTA_API_KEY=your-api-key
Or specify the base URL for local servers:
export LETTA_BASE_URL=http://localhost:8283
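When constructing the Rust client directly, the same environment variables can be read by hand. A minimal sketch using only `std::env` (the `resolve_base_url` helper is illustrative, not part of the crate's API):

```rust
use std::env;

/// Return the configured base URL, falling back to the default
/// local-server address used throughout this README.
fn resolve_base_url(env_value: Option<String>) -> String {
    env_value.unwrap_or_else(|| "http://localhost:8283".to_string())
}

fn main() {
    // Read LETTA_BASE_URL if set; otherwise use the local default.
    let url = resolve_base_url(env::var("LETTA_BASE_URL").ok());
    println!("connecting to {url}");
}
```

The resolved URL can then be passed to `ClientConfig::new` as shown in the examples below.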
# Check server health
letta-client health
# List all agents
letta-client agent list
# Create a new agent
letta-client agent create -n "My Assistant" -m letta/letta-free
# Send a message to an agent (with streaming)
letta-client message send -a <agent-id> "Hello, how are you?"
# View agent memory
letta-client memory view -a <agent-id>
# Upload a document to a source
letta-client sources create -n "docs" -e letta/letta-free
letta-client sources files upload <source-id> -f document.pdf
# Get help for any command
letta-client --help
letta-client agent --help
The CLI supports multiple output formats:
- `--output summary` (default) - Human-readable format
- `--output json` - JSON output for scripting
- `--output pretty` - Pretty-printed JSON

| letta client | letta server |
|---|---|
| 0.1.2 | 0.8.8 |
| 0.1.0-0.1.1 | 0.8.x |
use futures::StreamExt;
use serde_json::json;
use letta::{ClientConfig, LettaClient};
use letta::types::{AgentType, CreateAgentRequest, EmbeddingEndpointType, MessageChunk, ModelEndpointType};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create client for local Letta server
    let config = ClientConfig::new("http://localhost:8283")?;
    let client = LettaClient::new(config)?;

    // Create an agent
    let agent_request = CreateAgentRequest {
        name: "My Assistant".to_string(),
        agent_type: Some(AgentType::MemGPT),
        llm_config: Some(json!({
            "model_endpoint_type": ModelEndpointType::Openai,
            "model_endpoint": "https://api.openai.com/v1",
            "model": "gpt-4",
        })),
        embedding_config: Some(json!({
            "embedding_endpoint_type": EmbeddingEndpointType::Openai,
            "embedding_endpoint": "https://api.openai.com/v1",
            "embedding_model": "text-embedding-ada-002",
        })),
        ..Default::default()
    };
    let agent = client.agents().create(agent_request).await?;
    println!("Created agent: {}", agent.id);

    // Send a message to the agent
    let response = client
        .messages()
        .send(&agent.id, "Hello! How are you today?", None)
        .await?;

    // Stream responses
    let mut stream = response.into_stream();
    while let Some(chunk) = stream.next().await {
        match chunk? {
            MessageChunk::AssistantMessage(msg) => {
                print!("{}", msg.message);
            }
            MessageChunk::FunctionCall(call) => {
                println!("Function: {} with args: {}", call.name, call.arguments);
            }
            _ => {}
        }
    }

    Ok(())
}
use futures::StreamExt;
use letta::types::{AgentType, Block, CreateAgentRequest, LLMConfig};

// Create agent using builder pattern
let request = CreateAgentRequest::builder()
    .name("My Assistant")
    .agent_type(AgentType::MemGPT)
    .description("A helpful coding assistant")
    .model("letta/letta-free") // Shorthand for LLM config
    .embedding("letta/letta-free") // Shorthand for embedding config
    .build();
let agent = client.agents().create(request).await?;

// Create custom memory blocks with builder
let human_block = Block::human("Name: Alice\nRole: Software Engineer")
    .label("human");
let persona_block = Block::persona("You are a helpful coding assistant.")
    .label("persona");

// Add to archival memory
client
    .memory()
    .insert_archival_memory(&agent.id, "Important fact: Rust is memory safe")
    .await?;

// Search archival memory
let memories = client
    .memory()
    .search_archival_memory(&agent.id, "Rust safety", Some(10))
    .await?;
for memory in memories {
    println!("Found: {}", memory.text);
}

// Get paginated list of agents
let mut agent_stream = client
    .agents()
    .paginated()
    .limit(10)
    .build();
while let Some(agent) = agent_stream.next().await {
    let agent = agent?;
    println!("Agent: {} ({})", agent.name, agent.id);
}
use letta::types::{CreateToolRequest, Tool};

// Create a custom tool
// Note: this example is simplified; see the tool documentation for details.
let tool = CreateToolRequest {
    name: "get_weather".to_string(),
    description: Some("Get current weather for a location".to_string()),
    source_code: r#"
def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"The weather in {location} is sunny and 72°F"
"#.to_string(),
    source_type: Some("python".to_string()),
    ..Default::default()
};
let created_tool = client.tools().create(tool).await?;

// Add tool to agent
client
    .agents()
    .add_tool(&agent.id, &created_tool.id)
    .await?;
// No authentication required for local server
let config = ClientConfig::new("http://localhost:8283")?;
let client = LettaClient::new(config)?;

// Use API key for cloud deployment
let config = ClientConfig::new("https://api.letta.com")?
    .with_api_key("your-api-key");
let client = LettaClient::new(config)?;

// Add custom headers like X-Project
let config = ClientConfig::new("http://localhost:8283")?
    .with_header("X-Project", "my-project")?;
The library provides comprehensive error handling with detailed context:
use letta::error::LettaError;

match client.agents().get(&agent_id).await {
    Ok(agent) => println!("Found agent: {}", agent.name),
    Err(LettaError::Api { status, message, .. }) => {
        eprintln!("API error {}: {}", status, message);
    }
    Err(e) => eprintln!("Other error: {}", e),
}
# Clone the repository
git clone https://github.com/orual/letta-rs
cd letta-rs
# Build the library
cargo build
# Run tests
cargo test
# Build documentation
cargo doc --open
# Start local Letta server for testing
cd local-server
docker compose up -d
# Run integration tests
cargo test
This project is licensed under the MIT License - see the LICENSE file for details.