| Crates.io | text-to-cypher |
| lib.rs | text-to-cypher |
| version | 0.1.9 |
| created_at | 2025-12-23 06:04:40.745081+00 |
| updated_at | 2026-01-18 13:49:34.835964+00 |
| description | A library and REST API for translating natural language text to Cypher queries using AI models |
| homepage | https://github.com/FalkorDB/text-to-cypher |
| repository | https://github.com/FalkorDB/text-to-cypher |
| max_upload_size | |
| id | 2000792 |
| size | 565,640 |
A high-performance Rust library and API service that translates natural language text to Cypher queries for graph databases, featuring integration with genai and FalkorDB. Use as a library in your Rust applications or deploy the all-in-one Docker solution with integrated FalkorDB database, web browser interface, text-to-cypher API, and Model Context Protocol (MCP) server support!
Library Support: Now available as a Rust library! Use text-to-cypher directly in your Rust applications without the REST API overhead.
All-in-One Docker Solution: Our Docker image includes everything you need in a single container:
- FalkorDB graph database
- FalkorDB web browser interface
- Text-to-Cypher REST API
- Model Context Protocol (MCP) server
- Configuration via .env file with fallback to request parameters

Add text-to-cypher to your Cargo.toml:
[dependencies]
# For library usage only (without REST server)
text-to-cypher = { version = "0.1", default-features = false }
# For full server capabilities
text-to-cypher = "0.1"
Basic Example:
use text_to_cypher::{TextToCypherClient, ChatRequest, ChatMessage, ChatRole};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Create a client
let client = TextToCypherClient::new(
"gpt-4o-mini",
"your-api-key",
"falkor://127.0.0.1:6379"
);
// Create a chat request
let request = ChatRequest {
messages: vec![
ChatMessage {
role: ChatRole::User,
content: "Find all actors who appeared in movies released after 2020".to_string(),
}
]
};
// Convert text to Cypher and execute
let response = client.text_to_cypher("movies", request).await?;
println!("Generated query: {}", response.cypher_query.unwrap());
println!("Result: {}", response.cypher_result.unwrap());
println!("Answer: {}", response.answer.unwrap());
Ok(())
}
More Examples:
See the library usage example for comprehensive examples including:
- TextToCypherClient construction and usage

Run the example:
# Ensure FalkorDB is running
docker run -d -p 6379:6379 falkordb/falkordb:latest
# Set your API key
export OPENAI_API_KEY=your-key-here
# Run the example (library mode - no server dependencies)
cargo run --example library_usage --no-default-features
See TypeScript Usage Guide for detailed instructions on using text-to-cypher from TypeScript/JavaScript applications via REST API, Node.js native bindings, or WebAssembly.
See Python Usage Guide for detailed instructions on using text-to-cypher from Python applications via REST API or PyO3 bindings.
The easiest way to get started is using our all-in-one Docker image that includes FalkorDB database, web browser interface, text-to-cypher API, and MCP server:
# Run the complete stack with all services
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
-e DEFAULT_MODEL=gpt-4o-mini -e DEFAULT_KEY=your-api-key \
falkordb/text-to-cypher:latest
# Or using environment file
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
--env-file .env \
falkordb/text-to-cypher:latest
# Or mounting .env file for full MCP server functionality
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
-v $(pwd)/.env:/app/.env:ro \
falkordb/text-to-cypher:latest
# Custom ports using environment variables
docker run -p 6379:6379 -p 3000:3000 -p 9090:9090 -p 4001:4001 \
-e REST_PORT=9090 -e MCP_PORT=4001 \
-e DEFAULT_MODEL=gpt-4o-mini -e DEFAULT_KEY=your-api-key \
falkordb/text-to-cypher:latest
Once running, access the services at:
- FalkorDB database: localhost:6379 (Redis protocol)
- FalkorDB web interface: http://localhost:3000 (interactive graph database browser)
- Text-to-Cypher API: http://localhost:8080 (REST API)
- Swagger UI: http://localhost:8080/swagger-ui/ (API documentation)
- MCP server: localhost:3001 (Model Context Protocol server)
- OpenAPI spec: http://localhost:8080/api-doc/openapi.json

If you prefer to run locally without Docker:
# Prerequisites: You'll need FalkorDB running separately
docker run -d -p 6379:6379 falkordb/falkordb:latest
# Install Rust (if not already installed)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# Clone and run the text-to-cypher service
git clone https://github.com/FalkorDB/text-to-cypher.git
cd text-to-cypher
cp .env.example .env # Edit with your configuration
cargo run
The local development setup requires a Rust toolchain, a running FalkorDB instance, and a .env file with your configuration.
The API includes comprehensive Swagger UI documentation available at /swagger-ui/ when running the server.
The application supports flexible configuration via environment variables or .env file:
- DEFAULT_MODEL: Default AI model to use (e.g., "openai:gpt-4")
- DEFAULT_KEY: Default API key for the AI service
- REST_PORT: REST API server port (default: 8080)
- MCP_PORT: MCP server port for AI assistant integrations (default: 3001); the SSE endpoint is served at /sse on this port
- FALKORDB_CONNECTION: FalkorDB connection string (default: "falkor://127.0.0.1:6379")

Create a .env file from the provided example:
cp .env.example .env
# Edit .env with your preferred default model and API key
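As an illustration of how these settings and their defaults fit together, here is a minimal Rust sketch (not the crate's actual code) that reads the variables above and falls back to the documented defaults; the helper name port_from_env is made up for this example.

use std::env;

// Hypothetical helper for this sketch: read a port from the environment,
// falling back to the documented default when the variable is unset or invalid.
fn port_from_env(var: &str, default_port: u16) -> u16 {
    env::var(var)
        .ok()
        .and_then(|v| v.parse().ok())
        .unwrap_or(default_port)
}

fn main() {
    // DEFAULT_MODEL and DEFAULT_KEY have no built-in defaults; both must be set
    // (for example via the .env file) for the MCP server to start.
    let model = env::var("DEFAULT_MODEL").ok();
    let key = env::var("DEFAULT_KEY").ok();

    let rest_port = port_from_env("REST_PORT", 8080);
    let mcp_port = port_from_env("MCP_PORT", 3001);

    println!("model and key configured: {}", model.is_some() && key.is_some());
    println!("REST API on port {rest_port}, MCP server on port {mcp_port}");
}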
Important: The MCP server will only start if:
- DEFAULT_MODEL and DEFAULT_KEY are configured
- A .env file physically exists (not just environment variables)

For Docker deployments:
- Use --env-file .env or -e flags for the HTTP server only (the MCP server also starts if both MODEL and KEY are provided)
- Use -v $(pwd)/.env:/app/.env:ro to ensure the MCP server starts with a mounted .env file

The integrated Docker solution runs four concurrent services:
- FalkorDB database (port 6379)
- FalkorDB web browser interface (port 3000)
- Text-to-Cypher HTTP API (port 8080) with Swagger UI at http://localhost:8080/swagger-ui/ and OpenAPI spec at http://localhost:8080/api-doc/openapi.json
- MCP server (port 3001) exposing the text_to_cypher tool for natural language to Cypher conversion, started when DEFAULT_MODEL and DEFAULT_KEY are configured

The project provides an all-in-one Docker image that includes FalkorDB database, web browser interface, text-to-cypher API, and MCP server:
# Pull the latest image
docker pull ghcr.io/falkordb/text-to-cypher:latest
# Option 1: Complete stack with all services (recommended)
docker run -d \
--name text-to-cypher-stack \
-p 6379:6379 \
-p 3000:3000 \
-p 8080:8080 \
-p 3001:3001 \
-e DEFAULT_MODEL=gpt-4o-mini \
-e DEFAULT_KEY=your-api-key \
--restart unless-stopped \
ghcr.io/falkordb/text-to-cypher:latest
# Option 2: Using environment file
docker run -d \
--name text-to-cypher-stack \
-p 6379:6379 \
-p 3000:3000 \
-p 8080:8080 \
-p 3001:3001 \
--env-file .env \
--restart unless-stopped \
ghcr.io/falkordb/text-to-cypher:latest
# Option 3: Mount .env file for full MCP functionality
docker run -d \
--name text-to-cypher-stack \
-p 6379:6379 \
-p 3000:3000 \
-p 8080:8080 \
-p 3001:3001 \
-v $(pwd)/.env:/app/.env:ro \
--restart unless-stopped \
ghcr.io/falkordb/text-to-cypher:latest
# View logs from all services
docker logs -f text-to-cypher-stack
| Method | All Services | Use Case |
|---|---|---|
| -e DEFAULT_MODEL=... -e DEFAULT_KEY=... | ✅ | Environment-based config |
| --env-file .env | ✅ | File-based configuration |
| -v $(pwd)/.env:/app/.env:ro | ✅ | Mounted configuration file |
Note: All four services (FalkorDB database, web interface, text-to-cypher API, and MCP server) will start when both DEFAULT_MODEL and DEFAULT_KEY are configured, regardless of how the environment variables are provided.
| Service | Port | Description |
|---|---|---|
| FalkorDB Database | 6379 | Redis protocol access to graph database |
| FalkorDB Web Interface | 3000 | Interactive web browser for graph exploration |
| Text-to-Cypher HTTP API | 8080 | REST API with Swagger documentation |
| MCP Server | 3001 | Model Context Protocol server for AI integrations |
Configure the application using environment variables or .env file:
- DEFAULT_MODEL: Default AI model (e.g., "gpt-4o-mini", "anthropic:claude-3")
- DEFAULT_KEY: Default API key for the AI service
- FALKOR_URL: FalkorDB connection URL (default: "falkor://127.0.0.1:6379")

The MCP server provides a standardized interface for AI assistants to convert natural language questions into Cypher queries. This enables seamless integration with AI tools that support the Model Context Protocol.
To test and interact with the MCP server, you can use the MCP Inspector:
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
-e DEFAULT_MODEL=gpt-4o-mini -e DEFAULT_KEY=your-api-key \
ghcr.io/falkordb/text-to-cypher:latest
npx -y @modelcontextprotocol/inspector
Connect MCP Inspector to the server:
- Open the inspector in your browser (http://localhost:6274)
- Transport type: stdio
- Command: nc
- Arguments: ["localhost", "3001"]

Or if using a direct connection:
- Transport type: sse
- URL: http://localhost:3001/sse

Available Tools: The MCP server exposes the following tool:

text_to_cypher: Converts natural language questions into Cypher queries for graph databases.
Parameters:
- graph_name (required): Name of the graph database to query
- question (required): Natural language question to convert to Cypher

Example Usage in MCP Inspector:
{
"graph_name": "movies",
"question": "Find all actors who appeared in movies released after 2020"
}
Example Workflow:
1. Call the text_to_cypher tool in MCP Inspector
2. graph_name: "social_network"
3. question: "Who are the friends of John with more than 5 mutual connections?"

Pro Tip: You can also interact with FalkorDB directly through the web interface at http://localhost:3000 to create and explore graphs visually!
The MCP server enables AI assistants to convert natural language questions into Cypher queries against your FalkorDB graphs.

To run it, either start FalkorDB separately (docker run -d -p 6379:6379 falkordb/falkordb:latest) or use the complete integrated stack:

# Run the complete integrated stack
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
-e DEFAULT_MODEL=gpt-4o-mini -e DEFAULT_KEY=your-api-key \
ghcr.io/falkordb/text-to-cypher:latest
# Start FalkorDB separately
docker run -d -p 6379:6379 falkordb/falkordb:latest
# Run text-to-cypher service
cargo run
Once running, access the services at:
- FalkorDB web interface: http://localhost:3000 (interactive graph browser)
- REST API: http://localhost:8080
- Swagger UI: http://localhost:8080/swagger-ui/
- OpenAPI spec: http://localhost:8080/api-doc/openapi.json
- FalkorDB database: localhost:6379 (Redis protocol)
- MCP server: localhost:3001 (Model Context Protocol)

# Build locally using the build script
./docker-build.sh
# Or build manually
docker build -t text-to-cypher:latest .
# Pull from GitHub Container Registry
docker pull ghcr.io/falkordb/text-to-cypher:latest
# Available tags: latest, v1.0.0, v0.1.0-beta.x, etc.
cargo build --release
The library includes comprehensive unit tests with 33+ test cases covering:
- TextToCypherClient construction, request/response serialization, and chat types

Run all tests:
# Run library tests only
cargo test --lib
# Run all tests including integration tests
cargo test
# Run with output
cargo test -- --nocapture
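For illustration, a serialization test of the kind described above might look like the sketch below; it assumes ChatMessage and ChatRole implement serde's Serialize and Deserialize traits (plausible given the JSON REST API, but an assumption here, not a documented guarantee).

use text_to_cypher::{ChatMessage, ChatRole};

// Sketch of a round-trip serialization test; serde support on these types is assumed.
#[test]
fn chat_message_round_trips_through_json() {
    let msg = ChatMessage {
        role: ChatRole::User,
        content: "Find all actors".to_string(),
    };

    let json = serde_json::to_string(&msg).expect("serialize");
    let back: ChatMessage = serde_json::from_str(&json).expect("deserialize");

    // Compare the content field, which only requires String equality.
    assert_eq!(back.content, msg.content);
}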
The project maintains high code quality standards:
# Format code
cargo fmt
# Run linting (with pedantic and nursery clippy rules)
cargo clippy -- -W clippy::pedantic -W clippy::nursery -W clippy::cargo -A clippy::missing-errors-doc -A clippy::missing-panics-doc -A clippy::multiple-crate-versions
# Run tests
cargo test
text-to-cypher/
├── src/
│ ├── main.rs # Main application and HTTP server
│ ├── chat.rs # Chat message types and handling
│ ├── error.rs # Error types and handling
│ ├── formatter.rs # Query result formatting
│ ├── mcp/ # Model Context Protocol server
│ ├── schema/ # Graph schema discovery
│ └── template.rs # Template engine for prompts
├── templates/ # AI prompt templates
│ ├── system_prompt.txt # System prompt for AI
│ ├── user_prompt.txt # User query template
│ └── last_request_prompt.txt # Final response template
├── Dockerfile # All-in-one Docker image with FalkorDB
├── supervisord.conf # Process management configuration
├── entrypoint.sh # Docker container startup script
├── .dockerignore # Docker build context filtering
└── docker-build.sh # Convenient Docker build script
curl -X POST "http://localhost:8080/text_to_cypher" \
-H "Content-Type: application/json" \
-d '{
"graph_name": "movies",
"chat_request": {
"messages": [
{
"role": "User",
"content": "Find all actors who appeared in movies released after 2020"
}
]
},
"model": "gpt-4o-mini",
"key": "your-api-key"
}'
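The same request can be issued from Rust. The sketch below uses the reqwest, tokio, and serde_json crates (assumed dependencies, not provided by text-to-cypher) and simply prints the raw response body, since the exact response schema is documented in the Swagger UI.

// Hedged sketch: calling the REST endpoint with reqwest (json feature enabled).
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let body = serde_json::json!({
        "graph_name": "movies",
        "chat_request": {
            "messages": [
                { "role": "User", "content": "Find all actors who appeared in movies released after 2020" }
            ]
        },
        "model": "gpt-4o-mini",
        "key": "your-api-key"
    });

    let response = reqwest::Client::new()
        .post("http://localhost:8080/text_to_cypher")
        .json(&body)
        .send()
        .await?
        .text()
        .await?;

    // Print the raw JSON; see /swagger-ui/ for the full response schema.
    println!("{response}");
    Ok(())
}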
To explore graphs visually, open http://localhost:3000 in your browser.

The API supports streaming responses for real-time progress updates:
// EventSource cannot send a POST body, so use fetch (inside an async function
// or ES module) and read the SSE stream directly.
const response = await fetch('/text_to_cypher', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    graph_name: "social_network",
    chat_request: {
      messages: [{ role: "User", content: "Who are John's friends?" }]
    }
  })
});

const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // Each chunk carries one or more "data: ..." SSE lines with progress updates.
  console.log('Progress:', decoder.decode(value));
}
# 1. Start the complete stack
docker run -p 6379:6379 -p 3000:3000 -p 8080:8080 -p 3001:3001 \
-e DEFAULT_MODEL=gpt-4o-mini -e DEFAULT_KEY=your-api-key \
ghcr.io/falkordb/text-to-cypher:latest
# 2. Create a graph using FalkorDB web interface (http://localhost:3000)
# Add some sample data: people, relationships, etc.
# 3. Query using natural language via the API
curl -X POST "http://localhost:8080/text_to_cypher" \
-H "Content-Type: application/json" \
-d '{
"graph_name": "social_network",
"chat_request": {
"messages": [
{
"role": "User",
"content": "Find all people who have more than 3 friends"
}
]
},
"model": "gpt-4o-mini",
"key": "your-api-key"
}'
# 4. Use MCP server for AI assistant integrations (port 3001)
# Connect your AI assistant to http://localhost:3001
This library is designed to be published to crates.io, making it easy to use in any Rust project.
To publish a new version to crates.io:
Ensure you have a crates.io account and are logged in:
# First time only: Create account at https://crates.io/ and get API token
cargo login
Update the version in Cargo.toml following Semantic Versioning:
[package]
version = "0.1.1" # Increment as needed
Update the CHANGELOG (if one exists) with version changes and release notes.
Ensure all tests pass (including doc tests):
cargo test
cargo test --doc
Run code quality checks:
# Format code
cargo fmt
# Run clippy with pedantic lints
cargo clippy --lib -- -W clippy::pedantic -W clippy::nursery -D warnings
Build and test both library and server modes:
# Test library-only mode (minimal dependencies)
cargo build --lib --no-default-features
# Test with server features (default)
cargo build
# Test the example
cargo run --example library_usage --no-default-features
Do a dry-run publish to verify package contents:
cargo publish --dry-run
Review the output to ensure the packaged files and metadata are what you expect.
Create a git tag for the version:
git tag -a v0.1.1 -m "Release version 0.1.1"
git push origin v0.1.1
Publish to crates.io:
cargo publish
Note: Publishing is permanent - you cannot delete or replace a published version.
Verify the published crate:
# Check on crates.io
open https://crates.io/crates/text-to-cypher
# Test installing from crates.io
cargo install text-to-cypher --version 0.1.1
Once published, users can easily add text-to-cypher to their projects:
[dependencies]
# Library-only usage (no REST server)
text-to-cypher = { version = "0.1", default-features = false }
# With REST server capabilities
text-to-cypher = "0.1"
The library is published with:
Services not starting:
- Verify DEFAULT_MODEL and DEFAULT_KEY are properly configured
- Check the logs: docker logs -f <container-name>

MCP Server not starting:
- Ensure the DEFAULT_MODEL and DEFAULT_KEY environment variables are set
- Confirm a .env file exists in the working directory

FalkorDB connection issues:
- Verify FalkorDB is reachable at localhost:6379 (Redis protocol)

Web interface not accessible:
- Confirm port 3000 is published with -p 3000:3000
- Try opening http://localhost:3000 directly

Additional tips:
- API documentation: http://localhost:8080/swagger-ui/
- Web interface: http://localhost:3000 for graph exploration
- Logs: docker logs -f <container-name> to view all service logs

This project implements best practices from current research and industry leaders.
This project is licensed under the MIT License.