| Crates.io | openserve |
| lib.rs | openserve |
| version | 2.0.3 |
| created_at | 2025-06-08 22:49:08.18629+00 |
| updated_at | 2025-06-09 02:17:58.376688+00 |
| description | A modern, high-performance, AI-enhanced file server built in Rust |
| homepage | https://github.com/nikjois/openserve-rs |
| repository | https://github.com/nikjois/openserve-rs |
| max_upload_size | |
| id | 1705314 |
| size | 335,836 |
OpenServe is a modern, high-performance, AI-enhanced file server built in Rust. It combines the speed and safety of Rust with intelligent features powered by OpenAI, providing a comprehensive solution for file management, search, and collaboration.
Install from crates.io:

```shell
cargo install openserve
openserve --help
```

Or build from source:

```shell
git clone https://github.com/nikjois/openserve-rs.git
cd openserve-rs
cargo build --release
```

Or run with Docker Compose:

```shell
git clone https://github.com/nikjois/openserve-rs.git
cd openserve-rs
docker-compose up -d
```
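The repository's `docker-compose.yml` is not reproduced in this README; a minimal, hypothetical service definition consistent with the environment variables documented below might look like this (the build context, port mapping, and volume mounts are assumptions to adapt):

```yaml
services:
  openserve:
    build: .
    ports:
      - "8080:8080"
    volumes:
      - ./files:/app/files   # directory served to clients
      - ./data:/app/data     # SQLite database location
    environment:
      - OPENSERVE_HOST=0.0.0.0
      - OPENSERVE_PORT=8080
      - OPENSERVE_SERVE_PATH=/app/files
      - RUST_LOG=info
```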
```shell
# Start the server
openserve --port 8080 --host 0.0.0.0 /path/to/files

# With AI features enabled
openserve --ai --openai-api-key your-key-here /path/to/files

# With custom configuration
openserve --config config.yml /path/to/files
```
```shell
# Server Configuration
OPENSERVE_HOST=0.0.0.0
OPENSERVE_PORT=8080
OPENSERVE_SERVE_PATH=/app/files

# AI Configuration
OPENAI_API_KEY=your-api-key-here
OPENSERVE_AI_ENABLED=true

# Database Configuration
DATABASE_URL=sqlite:///app/data/openserve.db
REDIS_URL=redis://localhost:6379

# Logging
RUST_LOG=info
```
```yaml
server:
  host: "0.0.0.0"
  port: 8080
  serve_path: "./files"
  max_upload_size: 104857600  # 100MB
  enable_tls: false

ai:
  enabled: true
  api_key: "your-openai-api-key"
  model: "gpt-4o-mini"
  max_tokens: 2048
  temperature: 0.7

auth:
  enabled: true
  jwt_secret: "your-secret-key"
  session_timeout: 3600
  allow_registration: false

storage:
  database_url: "sqlite://./data.db"
  redis_url: "redis://localhost:6379"
  cache_size: 1000
  index_path: "./index"

telemetry:
  log_level: "info"
  log_format: "json"
  metrics_enabled: true
  tracing_enabled: false
```
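As a quick sanity check, the settings above can be written to a file and spot-checked from the shell before starting the server (the file path below is arbitrary; note that 104857600 is exactly 100 * 1024 * 1024 bytes):

```shell
# Write a minimal config and verify a couple of fields before use.
cat > /tmp/openserve-config.yml <<'EOF'
server:
  host: "0.0.0.0"
  port: 8080
  serve_path: "./files"
  max_upload_size: 104857600  # 100MB
EOF

# Confirm the upload limit really is 100 MB expressed in bytes.
echo $((100 * 1024 * 1024))                       # 104857600

# Spot-check a field made it into the file.
grep -c 'port: 8080' /tmp/openserve-config.yml    # 1
```

The file can then be passed to the server with `openserve --config /tmp/openserve-config.yml /path/to/files`.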
```text
# List directory contents
GET /api/files?path=/some/path

# Upload file
POST /api/files/upload
Content-Type: multipart/form-data

# Download file
GET /api/files/download?path=/file.txt

# Delete file
DELETE /api/files?path=/file.txt

# Get file metadata
GET /api/files/metadata?path=/file.txt

# Search files
GET /api/search?q=query&limit=10

# Semantic search
GET /api/search?q=query&semantic=true

# Get search statistics
GET /api/search/stats
```
```text
# Analyze file content
POST /api/ai/analyze
{
  "path": "/document.txt",
  "options": {
    "summarize": true,
    "extract_entities": true,
    "generate_tags": true
  }
}

# Chat with files
POST /api/ai/chat
{
  "message": "What is this document about?",
  "context_paths": ["/document.txt"]
}
```
```text
# Login
POST /api/auth/login
{
  "username": "user",
  "password": "password"
}

# Register (if enabled)
POST /api/auth/register
{
  "username": "user",
  "email": "user@example.com",
  "password": "password"
}
```
OpenServe can also be used as a library in your Rust projects:
```rust
use openserve::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = Config::default();
    let server = Server::new(config).await?;
    server.run().await?;
    Ok(())
}
```
```shell
# Run all tests
cargo test

# Run with coverage
cargo tarpaulin --out Html

# Run benchmarks
cargo bench

# Integration tests
cargo test --test integration
```
```text
openserve-rs/
├── src/
│   ├── ai/           # AI service integration
│   ├── config/       # Configuration management
│   ├── error/        # Error handling
│   ├── handlers/     # HTTP request handlers
│   ├── middleware/   # Custom middleware
│   ├── models/       # Data models
│   ├── services/     # Business logic
│   ├── utils/        # Utility functions
│   ├── lib.rs        # Library root
│   └── main.rs       # Application entry point
├── tests/            # Integration tests
├── benches/          # Benchmarks
├── docker/           # Docker configuration
└── docs/             # Documentation
```
```shell
# Development build
cargo build

# Release build
cargo build --release

# With all features
cargo build --all-features

# Cross-compilation
cargo build --target x86_64-unknown-linux-musl
```
To contribute, create a feature branch, commit your changes, and push:

```shell
git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature
```

```shell
# Build and deploy
docker-compose up -d

# Scale services
docker-compose up -d --scale openserve=3

# View logs
docker-compose logs -f openserve
```
```shell
# Install service
sudo cp scripts/openserve.service /etc/systemd/system/
sudo systemctl enable openserve
sudo systemctl start openserve
```
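The contents of `scripts/openserve.service` are not shown in this README; a minimal unit along these lines would work (the binary path, user, and serve path are assumptions to adapt):

```ini
[Unit]
Description=OpenServe file server
After=network.target

[Service]
ExecStart=/usr/local/bin/openserve --config /etc/openserve/config.yml /srv/files
Environment=RUST_LOG=info
Restart=on-failure
User=openserve

[Install]
WantedBy=multi-user.target
```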
```shell
# Health endpoint
curl http://localhost:8080/health

# Metrics endpoint
curl http://localhost:8080/metrics

# Ready endpoint
curl http://localhost:8080/ready
```
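The health endpoint lends itself to scripted checks. A small helper along these lines (the function name and URL are assumptions, not part of OpenServe) reports either state:

```shell
# Report "healthy" if the endpoint responds, "unhealthy" otherwise.
check_health() {
  if curl -fsS --max-time 2 "$1" > /dev/null 2>&1; then
    echo "healthy"
  else
    echo "unhealthy"
  fi
}

check_health "http://localhost:8080/health"
```

Because the helper only inspects curl's exit status, it works equally well in cron jobs or container health checks.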
Port already in use:

```shell
# Find process using port
lsof -i :8080

# Kill process
kill -9 <PID>
```

Permission denied:

```shell
# Check file permissions
ls -la /path/to/files

# Fix permissions
chmod -R 755 /path/to/files
```

AI features not working:

```shell
# Check API key
echo $OPENAI_API_KEY

# Test API connection
curl -H "Authorization: Bearer $OPENAI_API_KEY" https://api.openai.com/v1/models
```

```shell
# View logs
docker-compose logs -f openserve

# Increase log level
RUST_LOG=debug cargo run

# Structured logging
RUST_LOG=info,openserve=debug cargo run
```
OpenServe is designed for high performance. Security is a top priority.
This project is licensed under the MIT License - see the LICENSE file for details.
Made with Rust