| Crates.io | ferriclink-core |
| lib.rs | ferriclink-core |
| version | 0.2.0 |
| created_at | 2025-10-02 13:49:28.229708+00 |
| updated_at | 2025-10-03 09:48:18.555707+00 |
| description | A Rust library for building AI applications, inspired by LangChain |
| homepage | https://ferrum-labs.github.io |
| repository | https://github.com/ferrum-labs/ferriclink |
| max_upload_size | |
| id | 1864403 |
| size | 408,556 |
Core abstractions for the FerricLink ecosystem, inspired by LangChain Core. This crate provides the fundamental building blocks for AI applications that use language models, tools, vector stores, and more.
Add FerricLink Core to your `Cargo.toml`:

```toml
[dependencies]
ferriclink-core = { version = "0.2", features = ["all"] }
```
Here is a minimal example using the built-in mock chat model:

```rust
use ferriclink_core::{
    messages::AnyMessage,
    language_models::{mock_chat_model, GenerationConfig},
    runnables::Runnable,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a mock chat model
    let chat_model = mock_chat_model("gpt-4o-mini");

    // Create a conversation
    let messages = vec![AnyMessage::human("Hello, how are you?")];

    // Generate a response
    let response = chat_model
        .generate_chat(
            messages,
            Some(GenerationConfig::new().with_temperature(0.7)),
            None,
        )
        .await?;

    println!("Response: {}", response.text());

    Ok(())
}
```
### Messages (`messages`)

Conversation handling with different message types:
- `HumanMessage` - User input
- `AIMessage` - AI responses
- `SystemMessage` - System instructions
- `ToolMessage` - Tool outputs
- `AnyMessage` - Union type for all messages
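For illustration, the sketch below assembles a short conversation. `AnyMessage::human` appears in the quick-start example above; the `system` constructor is assumed here to follow the same naming pattern and is not confirmed API.

```rust
use ferriclink_core::messages::AnyMessage;

fn example_conversation() -> Vec<AnyMessage> {
    vec![
        // Assumed constructor, mirroring `AnyMessage::human` from the quick start.
        AnyMessage::system("You are a concise assistant."),
        // Confirmed by the quick-start example above.
        AnyMessage::human("What does FerricLink Core provide?"),
    ]
}
```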
### Language Models (`language_models`)

Abstractions for language models:

- `BaseLanguageModel` - Core language model trait
- `BaseLLM` - Text generation models
- `BaseChatModel` - Chat/conversation models
- `GenerationConfig` - Configuration for text generation
### Runnables (`runnables`)

Composable execution system:

- `Runnable<Input, Output>` - Core runnable trait
- `RunnableConfig` - Configuration for runs
- `RunnableSequence` - Chain multiple runnables
- `RunnableParallel` - Run multiple runnables in parallel
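As a rough sketch of the execution model, the function below runs any string-to-string runnable once. The `invoke` method name, its `Option<RunnableConfig>` argument, and the error type are assumptions for illustration, not confirmed API.

```rust
use ferriclink_core::runnables::{Runnable, RunnableConfig};

// Hypothetical sketch: run a single step with an optional per-run config.
async fn run_once<R>(step: &R, input: String) -> Result<String, Box<dyn std::error::Error>>
where
    R: Runnable<String, String>,
{
    // `invoke` and `RunnableConfig::default()` are assumed names.
    let output = step.invoke(input, Some(RunnableConfig::default())).await?;
    Ok(output)
}
```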
### Vector Stores (`vectorstores`)

Embedding storage and search:

- `VectorStore` - Core vector store trait
- `InMemoryVectorStore` - In-memory implementation
- `VectorSearchResult` - Search results with similarity scores
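A minimal end-to-end sketch pairing the in-memory store with the mock embeddings from the `embeddings` module. The constructor and method names (`new`, `add_documents`, `similarity_search`) are assumptions for illustration; the real signatures may differ.

```rust
use ferriclink_core::{
    documents::Document,
    embeddings::MockEmbeddings,
    vectorstores::InMemoryVectorStore,
};

// Hypothetical sketch: index one document, then run a similarity search.
async fn search_example() -> Result<(), Box<dyn std::error::Error>> {
    let mut store = InMemoryVectorStore::new(MockEmbeddings::new());

    store
        .add_documents(vec![Document::new("FerricLink is inspired by LangChain.")])
        .await?;

    // Each `VectorSearchResult` pairs a document with a similarity score.
    let results = store.similarity_search("What inspired FerricLink?", 1).await?;
    println!("{results:?}");
    Ok(())
}
```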
### Tools (`tools`)

Function calling system:

- `BaseTool` - Core tool trait
- `Tool` - Executable tools
- `ToolCall` - Tool invocation
- `ToolResult` - Tool execution results
- `ToolCollection` - Manage multiple tools
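The sketch below shows the intended flow of a model-issued tool call being executed against a collection. `ToolCollection::execute` and the `ToolCall`/`ToolResult` shapes are assumed names for illustration only, not confirmed API.

```rust
use ferriclink_core::tools::{ToolCall, ToolCollection, ToolResult};

// Hypothetical sketch: dispatch one tool call and return its result.
async fn run_tool_call(
    tools: &ToolCollection,
    call: ToolCall,
) -> Result<ToolResult, Box<dyn std::error::Error>> {
    // `execute` is an assumed method name on `ToolCollection`.
    let result = tools.execute(call).await?;
    Ok(result)
}
```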
### Callbacks (`callbacks`)

Monitoring and tracing:

- `CallbackHandler` - Event handling trait
- `ConsoleCallbackHandler` - Console output
- `MemoryCallbackHandler` - In-memory storage
- `CallbackManager` - Manage multiple handlers
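A small sketch of wiring a console handler into a manager so run events get printed. `CallbackManager::new`, `add_handler`, and `ConsoleCallbackHandler::new` are assumed names, not confirmed API.

```rust
use ferriclink_core::callbacks::{CallbackManager, ConsoleCallbackHandler};

// Hypothetical sketch: register a console handler for tracing runs.
fn example_callbacks() -> CallbackManager {
    let mut manager = CallbackManager::new();
    manager.add_handler(Box::new(ConsoleCallbackHandler::new()));
    manager
}
```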
### Documents (`documents`)

Text processing:

- `Document` - Text with metadata
- `DocumentCollection` - Multiple documents
- `ToDocument` - Convert to documents
- `FromDocument` - Convert from documents
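For example, a document can carry metadata alongside its text. `Document::new` and the `with_metadata` builder below are assumed names for illustration, not confirmed API.

```rust
use ferriclink_core::documents::Document;

// Hypothetical sketch: attach metadata to a document before indexing it.
fn example_document() -> Document {
    Document::new("FerricLink Core provides LangChain-style abstractions in Rust.")
        .with_metadata("source", "README")
}
```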
### Embeddings (`embeddings`)

Text embedding abstractions:

- `Embeddings` - Core embedding trait
- `Embedding` - Vector representation
- `MockEmbeddings` - Testing implementation
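The mock implementation is handy in tests. `MockEmbeddings::new` and `embed_query` in the sketch below are assumed names, not confirmed API.

```rust
use ferriclink_core::embeddings::{Embeddings, MockEmbeddings};

// Hypothetical sketch: embed a single query with the mock implementation.
async fn example_embedding() -> Result<(), Box<dyn std::error::Error>> {
    let embeddings = MockEmbeddings::new();
    // The returned `Embedding` is assumed to be printable for debugging.
    let embedding = embeddings.embed_query("hello world").await?;
    println!("{embedding:?}");
    Ok(())
}
```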
### Retrievers (`retrievers`)

Document retrieval:

- `BaseRetriever` - Core retriever trait
- `VectorStoreRetriever` - Vector-based retrieval
- `MultiRetriever` - Combine multiple retrievers
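As a final sketch, a vector-store-backed retriever can be queried for relevant documents. The `get_relevant_documents` method and its arguments are assumptions for illustration, not confirmed API.

```rust
use ferriclink_core::retrievers::{BaseRetriever, VectorStoreRetriever};

// Hypothetical sketch: fetch the top documents for a query.
async fn example_retrieval(
    retriever: &VectorStoreRetriever,
) -> Result<(), Box<dyn std::error::Error>> {
    let docs = retriever.get_relevant_documents("vector stores", 3).await?;
    for doc in docs {
        println!("{doc:?}");
    }
    Ok(())
}
```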
This crate is part of the FerricLink workspace. See the main README for development instructions.

```bash
# Build the crate
cargo build

# Build in release mode
cargo build --release

# Run tests
cargo test

# Check code
cargo check

# Run clippy
cargo clippy
```
This crate is currently at version 0.2.0 and is under active development. The API may change between minor versions until we reach 1.0.0.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Contributions are welcome! Please see the main README for contribution guidelines.