| Crates.io | filesystem-mcp-rs |
| lib.rs | filesystem-mcp-rs |
| version | 0.1.11 |
| created_at | 2025-11-24 10:02:33.238356+00 |
| updated_at | 2026-01-20 06:03:30.057814+00 |
| description | Rust port of the official MCP filesystem server - fast, safe, protocol-compatible file operations. |
| homepage | https://github.com/ssoj13/memory-mcp-rs |
| repository | https://github.com/ssoj13/memory-mcp-rs |
| max_upload_size | |
| id | 1947596 |
| size | 726,556 |
v0.1.10+: Added MurmurHash3 and SpookyHash algorithms, partial hashing with offset/length, extended search parameters, HTTP, S3, Screenshot tools. See CHANGELOG.md for details.
v0.1.9+: Major feature release with 16 new tools for file hashing, comparison, archives, PDF reading, process management, and more.
v0.1.8+: This version makes it possible to use this MCP with Gemini and Qwen (and possibly other clients). They validate against an older JSON Schema draft, so this version slightly rewrites the tool JSON schemas to make them compatible.
v0.1.5+: The server now provides explicit instructions to LLMs to PREFER these tools over built-in alternatives. Tool descriptions highlight the advantages (pagination, UTF-8 safety, structured JSON output), so LLMs should now automatically choose this MCP for file operations. You can also insert the following line into the system CLAUDE.md: "### MANDATORY: ALWAYS USE FILESYSTEM MCP, NEVER use any other code editing tool! ONLY use filesystem MCP tools for ALL code modifications! It's optimized for LLM file IO much better than your native tools! This is a hard requirement, not a suggestion!"
Rust port of the official JavaScript filesystem MCP server. Same MCP tool surface, rebuilt in Rust for speed and safety, while preserving protocol compatibility and path protections.
Tools:
- read_text_file (head/tail/offset/limit/max_chars), read_media_file, read_multiple_files, read_json (JSONPath), read_pdf
- write_file, edit_file (diff + dry-run), edit_lines (line-based edits), bulk_edits (mass search/replace)
- extract_lines (cut lines), extract_symbols (cut characters)
- read_binary, write_binary, extract_binary, patch_binary (all base64)
- create_directory, move_file, copy_file (files/dirs, overwrite), delete_path (recursive)
- file_hash (MD5/SHA1/SHA256/SHA512/XXH64/Murmur3/Spooky + offset/length), file_hash_multiple (batch + comparison)
- compare_files (binary diff), compare_directories (tree diff)
- archive_extract (ZIP/TAR/TAR.GZ), archive_create
- tail_file (follow mode), watch_file (change events)
- file_stats (size/count by extension), find_duplicates
- list_directory, list_directory_with_sizes, get_file_info, directory_tree (depth/size/hash)
- search_files (glob + type/size/time filters), grep_files (regex + exclude + invert/count modes), grep_context (context-aware), list_allowed_directories
- run_command (cwd/env/timeout/background), kill_process, list_processes, search_processes - cross-platform
- http_request, http_request_batch, http_download, http_download_batch
- s3_list_buckets, s3_list, s3_stat, s3_get, s3_put, s3_delete, s3_copy, s3_presign, batch ops
- screenshot_list_monitors, screenshot_list_windows, screenshot_capture_screen, screenshot_capture_window, screenshot_capture_region, screenshot_copy_to_clipboard

Symlink escapes are opt-in via --allow-symlink-escape. HTTP/S3/screenshot tools are enabled by default; to disable, build with --no-default-features.
cargo build
HTTP/S3 tools require allowlists at runtime (CLI flags or env vars):
--http-allowlist-domain example.com --http-allowlist-domain "*.example.org"
--s3-allowlist-bucket my-bucket
Alternatively via env vars (comma/semicolon/whitespace separated):
FS_MCP_HTTP_ALLOW_LIST=example.com,*.example.org (use * to allow all)
FS_MCP_S3_ALLOW_LIST=my-bucket;other-bucket (use * to allow all)

Tools: screenshot_list_monitors, screenshot_list_windows, screenshot_capture_screen, screenshot_capture_window, screenshot_capture_region, screenshot_copy_to_clipboard
Examples:
// List monitors
{"tool": "screenshot_list_monitors", "arguments": {}}
// List windows with title filter
{"tool": "screenshot_list_windows", "arguments": {"title_filter": "Chrome"}}
// Capture primary monitor to a file
{"tool": "screenshot_capture_screen", "arguments": {"output": "file", "path": "C:/temp/screen.png"}}
// Capture a window by title to base64
{"tool": "screenshot_capture_window", "arguments": {"title": "Terminal", "output": "base64"}}
// Capture a region on monitor 0
{"tool": "screenshot_capture_region", "arguments": {"monitor_id": 0, "x": 100, "y": 100, "width": 800, "height": 600, "output": "file", "path": "C:/temp/region.png"}}
// Copy an existing PNG to clipboard
{"tool": "screenshot_copy_to_clipboard", "arguments": {"path": "C:/temp/region.png"}}
edit_lines - Line-Based Surgical Edits
Precise editing by line numbers (1-indexed). Perfect when you know exact locations:
Operations: replace, insert_before, insert_after, delete
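A minimal sketch of a call (the edits/op/line/text field names here are illustrative assumptions; check the tool's input schema for the exact shape):
// Replace line 42, then delete line 50, in one call
{"path": "src/main.rs", "edits": [{"op": "replace", "line": 42, "text": "let retries = 3;"}, {"op": "delete", "line": 50}]}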
bulk_edits - Mass Search/Replace Across Files
Apply the same edits to multiple files at once. More efficient than editing files individually:
- File patterns support globs (*.rs, **/*.txt, src/**/*.js)
- isRegex: true enables regex patterns with capture groups ($1, $2, etc.)
- replaceAll: true replaces ALL occurrences, not just the first one
Examples:
// Literal replace all occurrences
{"oldText": "use crate::foo", "newText": "use crate::bar::foo", "replaceAll": true}
// Regex with capture groups (refactor imports)
{"oldText": "use crate::(cache_man|event_bus|workers)", "newText": "use crate::core::$1", "isRegex": true, "replaceAll": true}
// Rename function across codebase
{"oldText": "old_function_name", "newText": "new_function_name", "replaceAll": true}
// Update version in all Cargo.toml
{"oldText": "version = \"0\\.1\\.\\d+\"", "newText": "version = \"0.2.0\"", "isRegex": true}
grep_files - Content Search
Search for text/regex patterns inside file contents (not filenames):
No need to shell out to rg/grep via run_command; use grep_files or search_files instead.
Example:
{
"path": ".",
"pattern": "TODO|FIXME",
"filePattern": "**/*.rs",
"excludePatterns": ["target/**", "**/*.generated.rs"]
}
grep_context - Context-Aware Search
Find a pattern only when specific terms appear nearby:
- nearbyPatterns list (literal by default, regex if nearbyIsRegex is true)
- Window size via nearbyWindowWords and/or nearbyWindowChars
- nearbyDirection = before/after/both
- nearbyMatchMode = any/all
Example:
{
"path": ".",
"pattern": "error",
"nearbyPatterns": ["timeout", "retry"],
"nearbyWindowWords": 6,
"nearbyDirection": "before",
"filePattern": "**/*.log"
}
read_text_file - Pagination for Large Files
Read large files with flexible pagination options:
- head: first N lines (like Unix head)
- tail: last N lines (like Unix tail)
- offset + limit: read N lines starting from line M (1-indexed pagination)
- max_chars: truncate output to N characters (UTF-8 safe)
- Returns totalLines in metadata for pagination planning
Examples:
// Read 100 lines starting at line 100 (lines 100-199)
{"path": "large.txt", "offset": 100, "limit": 100}
// First 50 lines
{"path": "large.txt", "head": 50}
// Last 20 lines
{"path": "large.txt", "tail": 20}
// Limit output size (useful for token limits)
{"path": "large.txt", "max_chars": 50000}
// Combine pagination with truncation
{"path": "large.txt", "offset": 1, "limit": 100, "max_chars": 10000}
extract_lines - Cut Lines by Number
Remove lines from a file and optionally return the extracted content:
Parameters: path, line (1-indexed), endLine (optional), dryRun, returnExtracted
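For example, using the parameters above:
// Cut lines 10-20, returning the removed text
{"path": "notes.txt", "line": 10, "endLine": 20, "returnExtracted": true}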
extract_symbols - Cut Characters by Position
Remove characters from a file by Unicode position:
Parameters: path, start (0-indexed), end or length, dryRun, returnExtracted

All binary tools use base64 encoding for data transfer.
read_binary - Read Bytes
Read bytes from a binary file at a specified offset:
Parameters: path, offset, length

write_binary - Write Bytes
Write bytes to a binary file:
Parameters: path, offset, data (base64), mode (replace/insert)

extract_binary - Cut Bytes
Remove bytes from a binary file and return them:
Parameters: path, offset, length, dryRun

patch_binary - Find/Replace Binary Patterns
Search and replace binary patterns in a file:
Parameters: path, find (base64), replace (base64), all
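For example, using the parameters above (the base64 strings are just the ASCII bytes "foo" and "bar"):
// Replace every match of the pattern
{"path": "data.bin", "find": "Zm9v", "replace": "YmFy", "all": true}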
file_hash - Hash a File
Compute the hash of a file with various algorithms:
Parameters: path, algorithm, offset, length
Returns: {hash, size, algorithm, offset, length}
Examples:
// Hash entire file with SHA256
{"path": "file.bin"}
// Hash with fast non-crypto algorithm
{"path": "large.bin", "algorithm": "xxh64"}
// Hash first 1KB only
{"path": "file.bin", "offset": 0, "length": 1024}
// Hash from position 512 to end
{"path": "file.bin", "offset": 512}
file_hash_multiple - Hash Multiple Files
Hash multiple files and check whether they match:
Parameters: paths[], algorithm
Returns: {results[], all_match}
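For example, verifying that two build artifacts match:
{"paths": ["build/a.bin", "build/b.bin"], "algorithm": "sha256"}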
compare_files - Binary File Comparison
Compare two files byte-by-byte with detailed analysis:
Parameters: path1, path2, offset1, offset2, length, max_diffs, context_bytes
Returns: {identical, size1, size2, hash1, hash2, first_diff_offset, total_diff_regions, match_percentage, diff_samples[]}
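For example, a capped diff report:
// Compare two files, reporting at most 10 differing regions
{"path1": "v1.bin", "path2": "v2.bin", "max_diffs": 10}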
compare_directories - Directory Tree Comparison
Compare two directory trees recursively:
Parameters: path1, path2, recursive, compareContent (hash-based), ignorePatterns[]
Returns: {identical, only_in_first[], only_in_second[], different[], same_count, diff_count}

tail_file - Read End of File
Read the last N lines or bytes of a file:
Parameters: path, lines, bytes, follow, timeout_ms
Returns: {content, lines_returned, file_size, truncated}
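For example, tailing a log with follow mode:
{"path": "app.log", "lines": 100, "follow": true, "timeout_ms": 5000}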
watch_file - Wait for File Changes
Block until a file changes or a timeout expires:
Parameters: path, timeout_ms, events[] (modify/create/delete)
Returns: {changed, event, new_size, elapsed_ms}
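For example, blocking on a config change:
{"path": "config.toml", "timeout_ms": 30000, "events": ["modify"]}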
read_json - Read JSON with Query
Read and query JSON files using JSONPath:
Parameters: path, query (JSONPath like $.store.book[0].title), pretty
Returns: {result, query_matched, pretty}
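For example, pulling one subtree out of a larger document:
{"path": "package.json", "query": "$.dependencies", "pretty": true}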
read_pdf - Extract PDF Text
Extract text content from PDF files:
Parameters: path, pages (e.g., "1-5", "1,3,5"), max_chars
Returns: {text, pages_count, pages_extracted[], truncated}

archive_extract - Extract Archives
Extract ZIP, TAR, or TAR.GZ archives:
Parameters: path, destination, format (auto-detected by extension), files[] (optional filter)
Returns: {extracted_count, files[]}

archive_create - Create Archives
Create ZIP or TAR.GZ archives:
Parameters: paths[], destination, format (zip/tar.gz)
Returns: {path, size, file_count}
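For example, zipping two directories into one archive:
{"paths": ["src", "docs"], "destination": "backup.zip", "format": "zip"}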
file_stats - File/Directory Statistics
Get detailed statistics about files and directories:
Parameters: path, recursive
Returns: {total_files, total_dirs, total_size, total_size_human, by_extension{}, largest_files[]}

find_duplicates - Find Duplicate Files
Find files with identical content:
Parameters: path, min_size, by_content (hash-based or size-only)
Returns: {duplicate_groups[], total_wasted_space}

run_command - Execute Shell Commands
Run commands with full control over the execution environment. Cross-platform (Windows/macOS/Linux):
Parameters: command, args[], cwd, env{}, timeout_ms, kill_after_ms, stdout_file, stderr_file, stdin_file, stdout_tail, stderr_tail, background
Returns: {exit_code, stdout, stderr, pid, killed, timed_out, duration_ms, background}
Examples:
// Run Python script
{"command": "python", "args": ["script.py"]}
// With timeout (60 seconds)
{"command": "cargo", "args": ["build"], "timeout_ms": 60000}
// Background process
{"command": "npm", "args": ["start"], "background": true}
// With environment
{"command": "node", "args": ["app.js"], "env": {"NODE_ENV": "production"}}
// Tail output (last 50 lines)
{"command": "cargo", "args": ["test"], "stdout_tail": 50}
kill_process - Kill Process by PID
Terminate a running process using the native API via the sysinfo crate. Cross-platform:
Parameters: pid, force (SIGKILL on Unix, TerminateProcess on Windows)
Returns: {success, message}
Note: returns Ok(false) for access-denied errors (e.g., killing system processes)
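For example (the PID is a placeholder):
// Force-kill a process
{"pid": 12345, "force": true}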
list_processes - List Background Processes
List processes started by this server with run_command(background: true):
Parameters: filter (optional command name filter)
Returns: {processes[]}

search_processes - Search System Processes
Search for running processes by name or command-line regex. Cross-platform via the sysinfo crate:
Parameters: name_pattern (regex), cmdline_pattern (regex)
Returns: {processes[{pid, name, command_line, exe_path, memory_bytes, cpu_percent, status, user}], count}
Examples:
{"name_pattern": "chrome"}
{"cmdline_pattern": "--port=3000"}
{"name_pattern": "python", "cmdline_pattern": "script\\.py"}

http_request - General HTTP/HTTPS
Send requests with headers, cookies, query params, and a body:
{
"method": "POST",
"url": "https://api.example.com/v1/items",
"headers": { "Authorization": "Bearer TOKEN", "Content-Type": "application/json" },
"cookies": { "session": "abc123" },
"query": { "page": "1" },
"body": "{\"name\":\"demo\"}",
"accept": "json",
"timeoutMs": 20000
}
http_request_batch
Run multiple requests in one call:
{
"requests": [
{ "id": "a", "method": "GET", "url": "https://example.com/a" },
{ "id": "b", "method": "GET", "url": "https://example.com/b" }
]
}
http_download / http_download_batch
Download files to local paths:
{ "url": "https://example.com/file.zip", "path": "downloads/file.zip" }
s3_list_buckets - List Buckets
{}
s3_list - List Objects
{ "bucket": "my-bucket", "prefix": "reports/", "maxKeys": 100 }
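s3_stat is not shown separately here; by analogy with s3_get it presumably takes a bucket and key (an assumption, check the tool schema):
{ "bucket": "my-bucket", "key": "reports/2025.csv" }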
s3_get / s3_put
{ "bucket": "my-bucket", "key": "reports/2025.csv", "outputPath": "reports/2025.csv" }
{ "bucket": "my-bucket", "key": "uploads/log.txt", "path": "logs/log.txt", "contentType": "text/plain" }
s3_delete / s3_copy / s3_presign
{ "bucket": "my-bucket", "key": "old/file.txt" }
{ "sourceBucket": "my-bucket", "sourceKey": "a.txt", "destBucket": "my-bucket", "destKey": "b.txt" }
{ "bucket": "my-bucket", "key": "uploads/file.bin", "method": "GET", "expiresInSeconds": 600 }
cargo build --release
Some clients (qwen code, gemini-cli) validate tool schemas with Draft 7 only, while rmcp generates JSON Schema 2020-12 by default. This causes errors like:
no schema with key or ref "https://json-schema.org/draft/2020-12/schema"
Fix applied here: rewrite tool input schemas to Draft 7 at startup. This is done once when building the tool router (see src/main.rs) and includes:
- Set $schema to http://json-schema.org/draft-07/schema#
- Rename $defs -> definitions
- Rewrite $ref paths #/$defs/... -> #/definitions/...
This removes the Draft 2020-12 dependency from tool schemas so Draft 7 validators succeed. This is a per-server fix; other MCP servers will still need the same rewrite if they emit 2020-12.
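For illustration, a minimal before/after of the rewrite (the schema body itself is a made-up example):
// Emitted by rmcp (Draft 2020-12)
{"$schema": "https://json-schema.org/draft/2020-12/schema", "$defs": {"Mode": {"type": "string"}}, "properties": {"mode": {"$ref": "#/$defs/Mode"}}}
// After the Draft 7 rewrite
{"$schema": "http://json-schema.org/draft-07/schema#", "definitions": {"Mode": {"type": "string"}}, "properties": {"mode": {"$ref": "#/definitions/Mode"}}}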
filesystem-mcp-rs supports dual-mode transport:
Local MCP clients (Claude Desktop, Cursor, Codex):
stdio transport (default), optional file logging with -l

Remote access, web integrations, cloud deployments:
HTTP stream mode (-s) serving the /mcp endpoint plus a /health check, optional file logging (-l)

filesystem-mcp-rs --help
filesystem-mcp-rs -V # version
# Basic
filesystem-mcp-rs /projects /tmp
# With logging (writes to filesystem-mcp-rs.log)
filesystem-mcp-rs -l /projects
# Custom log file
filesystem-mcp-rs -l /var/log/mcp.log /projects
Log location: Current working directory or specified path
# Local (http://127.0.0.1:8000)
filesystem-mcp-rs -s
# Custom port
filesystem-mcp-rs -s -p 9000
# Network accessible
filesystem-mcp-rs -s -b 0.0.0.0 -p 8000
# With file logging
filesystem-mcp-rs -s -l server.log
# Production setup
filesystem-mcp-rs -s -b 0.0.0.0 -p 8000 -l /var/log/mcp-server.log
Check health:
curl http://localhost:8000/health
# Returns: OK
Logs: Console by default, file with -l flag
Usage: filesystem-mcp-rs [OPTIONS] [DIRS...]
Arguments:
[DIRS...] Allowed directories
Options:
--allow-symlink-escape Follow symlinks outside allowed dirs
-s, --stream HTTP mode (default: stdio)
-p, --port <PORT> HTTP port [default: 8000]
-b, --bind <ADDR> Bind address [default: 127.0.0.1]
-l, --log [<FILE>] Log to file [default: filesystem-mcp-rs.log]
-h, --help Print help
-V, --version Print version
cargo test # All tests (unit + integration + HTTP transport)
cargo test --test http_transport # HTTP transport only
Source and test layout:
src/
├── main.rs - Entry point, CLI args, transport modes, MCP tools
├── core/
│ ├── allowed.rs - Directory allowlist/validation
│ ├── logging.rs - Transport-aware logging (stdio/stream)
│ ├── path.rs - Path resolution, escape protection
│ └── format.rs - Schema utilities
├── tools/
│ ├── fs_ops.rs - File read/head/tail
│ ├── edit.rs - Text-based edits + unified diff
│ ├── line_edit.rs - Line-based surgical edits
│ ├── bulk_edit.rs - Mass search/replace
│ ├── search.rs - Glob search with excludes + type/size/time filters
│ ├── grep.rs - Regex content search + invert/count modes
│ ├── binary.rs - Binary file operations (read/write/extract/patch)
│ ├── hash.rs - File hashing (MD5/SHA1/SHA256/SHA512/XXH64)
│ ├── compare.rs - File and directory comparison
│ ├── watch.rs - Tail file and watch for changes
│ ├── json_reader.rs - JSON reading with JSONPath queries
│ ├── pdf_reader.rs - PDF text extraction
│ ├── archive.rs - ZIP/TAR/TAR.GZ archive handling
│ ├── http_tools.rs - HTTP/HTTPS requests + batch
│ ├── s3_tools.rs - AWS S3 operations + batch
│ ├── stats.rs - File/directory statistics
│ ├── duplicates.rs - Duplicate file detection
│ └── process.rs - Process execution and management
tests/
├── integration.rs - MCP tool integration tests
└── http_transport.rs - HTTP server tests
HTTP tests spawn a server subprocess and verify its endpoints:
#[tokio::test]
async fn test_http_server_health_check() {
// Start server on random port
// Poll /health until ready
// Assert response
}
- rmcp::transport::stdio() - no stderr logging by default
- StreamableHttpService + LocalSessionManager - SSE streaming
- rmcp 0.9.0 - MCP SDK (features: transport-io, server, transport-streamable-http-server)
- axum 0.8 - HTTP server framework
- tokio - Async runtime

Important: Claude Code on Windows requires git-bash. If git is installed but bash is not in PATH, set the environment variable:
# PowerShell (run as user, not admin)
[Environment]::SetEnvironmentVariable('CLAUDE_CODE_GIT_BASH_PATH', 'C:\Program Files\Git\bin\bash.exe', 'User')
Or if git is installed elsewhere, find it with:
where git.exe
# Example output: C:\Programs\Git\bin\git.exe
# Then set: C:\Programs\Git\bin\bash.exe
Restart your terminal after setting the variable.
Build and install the binary:
cargo build --release
# Or install globally:
cargo install --path .
Unix/Linux:
claude mcp add filesystem -- filesystem-mcp-rs /projects /tmp /home/user/work
Windows (using full path):
claude mcp add filesystem -- "C:/path/to/filesystem-mcp-rs/target/release/filesystem-mcp-rs.exe" "C:/projects"
Important: Do NOT use --log-level or other flags when adding via claude mcp add - they are not supported by the executable. Only pass directory paths.
Edit ~/.config/claude-code/config.json (Unix/Linux) or C:\Users\<username>\.config\claude-code\config.json (Windows):
stdio mode (default):
{
"mcpServers": {
"filesystem": {
"command": "filesystem-mcp-rs",
"args": ["/projects", "/tmp"]
}
}
}
stdio with logging:
{
"mcpServers": {
"filesystem": {
"command": "filesystem-mcp-rs",
"args": ["-l", "mcp-server.log", "/projects"]
}
}
}
HTTP stream mode:
{
"mcpServers": {
"filesystem-http": {
"command": "filesystem-mcp-rs",
"args": ["-s", "-p", "8000", "-b", "127.0.0.1"]
}
}
}
HTTP with custom port and logging:
{
"mcpServers": {
"filesystem-http": {
"command": "filesystem-mcp-rs",
"args": ["-s", "-p", "9000", "-l", "http-server.log"]
}
}
}
Check that the server is connected:
claude mcp list
# Should show: filesystem: ... - ✓ Connected
For Claude Desktop, use the same format in claude_desktop_config.json.
Install the binary:
cargo install --path .
Edit ~/.codex/config.toml (Unix/Linux) or C:\Users\<username>\.codex\config.toml (Windows):
stdio mode (default):
[mcp_servers.filesystem]
command = "filesystem-mcp-rs"
args = ["/projects", "/tmp"]
stdio with logging:
[mcp_servers.filesystem]
command = "filesystem-mcp-rs"
args = ["-l", "codex-mcp.log", "/projects"]
HTTP stream mode:
[mcp_servers.filesystem_http]
command = "filesystem-mcp-rs"
args = ["-s", "-p", "8000"]
HTTP with custom settings:
[mcp_servers.filesystem_http]
command = "filesystem-mcp-rs"
args = ["-s", "-b", "0.0.0.0", "-p", "9000", "-l", "http-codex.log"]
Note: Use forward slashes (C:/path) or double backslashes (C:\\path) in TOML strings on Windows.
--allow-symlink-escape: if a symlink itself is inside the allowlist, operations may follow it even if the target is outside.

Code map:
- src/main.rs — MCP server + tools
- src/core/path.rs — path validation/escape protection
- src/tools/fs_ops.rs — read/head/tail
- src/tools/edit.rs, src/tools/diff.rs — text-based edits + unified diff
- src/tools/line_edit.rs — line-based surgical edits
- src/tools/bulk_edit.rs — mass search/replace across files
- src/tools/search.rs — glob search with type/size/time filters
- src/tools/grep.rs — regex content search with invert/count modes
- src/tools/binary.rs — binary file operations (read/write/extract/patch)
- src/tools/hash.rs — file hashing (MD5/SHA1/SHA256/SHA512/XXH64)
- src/tools/compare.rs — file and directory comparison
- src/tools/watch.rs — tail file and watch for changes
- src/tools/json_reader.rs — JSON reading with JSONPath queries
- src/tools/pdf_reader.rs — PDF text extraction
- src/tools/archive.rs — ZIP/TAR/TAR.GZ archive handling
- src/tools/http_tools.rs — HTTP/HTTPS tools (feature)
- src/tools/s3_tools.rs — S3 tools (feature)
- src/tools/stats.rs — file/directory statistics
- src/tools/duplicates.rs — duplicate file detection
- tests/integration.rs — per-tool integration coverage

Open to extensions (non-follow symlink mode, extra tools).
This is a Rust port of the official Model Context Protocol filesystem server.
For the JavaScript version, see: https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem