The rrlmgraph-mcp companion package exposes the project knowledge graph via the Model Context Protocol (MCP), enabling any MCP-capable editor or AI agent to query graph context without running R code directly.
## Architecture
```
┌─────────────────┐    JSON-RPC / stdio    ┌─────────────────┐
│ VS Code /       │ ◄────────────────────► │ rrlmgraph-mcp   │
│ GitHub Copilot  │                        │ (Node.js MCP    │
│ (MCP client)    │                        │  server)        │
└─────────────────┘                        └────────┬────────┘
                                                    │ SQLite
                                           ┌────────▼────────┐
                                           │ .rrlmgraph/     │
                                           │ graph.sqlite    │
                                           └─────────────────┘
```
The SQLite database is written by `export_to_sqlite()` in rrlmgraph and read at query time by the MCP server; no R process needs to be running during IDE sessions.
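Because the export is a plain SQLite file, any language can open it directly. As a minimal sketch (assuming nothing about rrlmgraph's schema, which is not documented here), this Python helper enumerates whatever tables the export contains:

```python
# Sketch: inspect an rrlmgraph SQLite export without an R session.
# The schema is an implementation detail of rrlmgraph, so we only
# list table names rather than assume any particular layout.
import sqlite3

def list_graph_tables(db_path: str) -> list[str]:
    """Return the names of all tables in the exported database."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]
```

For example, `list_graph_tables(".rrlmgraph/graph.sqlite")` shows what the MCP server has available to query.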
## Installation
Install the MCP server globally via npm:

```sh
npm install -g rrlmgraph-mcp
```

Or use npx for a one-off run:

```sh
npx rrlmgraph-mcp --db-path path/to/.rrlmgraph/graph.sqlite
```

## Exporting the graph to SQLite
```r
library(rrlmgraph)

graph <- build_rrlm_graph("path/to/your/project")
export_to_sqlite(graph, "path/to/your/project/.rrlmgraph/graph.sqlite")
```

## Configuring VS Code
Add the following to `.vscode/mcp.json` (or your user-level `settings.json` under `github.copilot.chat.mcp.servers`):
```json
{
  "servers": {
    "rrlmgraph": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "rrlmgraph-mcp",
        "--db-path",
        "${workspaceFolder}/.rrlmgraph/graph.sqlite"
      ]
    }
  }
}
```

After saving, GitHub Copilot Chat will list rrlmgraph as an available MCP tool. Ask it questions like:

- "Which functions call `prepare_data()`?"
- "What context is relevant for the data validation pipeline?"
## Available MCP tools
| Tool | Description |
|---|---|
| `query_context` | Relevance-guided BFS context window for a coding task |
| `get_node_info` | Full metadata, callers, callees, and source for a named node |
| `list_functions` | Functions ranked by PageRank, with optional file-path filter |
| `find_callers` | All functions that call a given function (incoming CALLS edges) |
| `find_callees` | All functions called by a given function (outgoing CALLS edges) |
| `search_nodes` | Full-text keyword search across node names, bodies, and docs |
| `add_task_trace` | Record accepted/rejected task outcome to improve future retrieval |
| `rebuild_graph` | Trigger an Rscript rebuild of the project graph in-process |
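Under the hood, each tool invocation is an MCP `tools/call` request sent over stdio. The sketch below builds the JSON-RPC 2.0 message for `query_context`; the `method`/`params` shape follows the MCP specification, but the argument key `task` is an assumption here, so consult the tool's advertised input schema for the real parameter names:

```python
import json

# Hypothetical tools/call request for the query_context tool.
# Only the envelope (jsonrpc/method/params.name/params.arguments)
# is prescribed by MCP; the "task" argument is illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_context",
        "arguments": {"task": "refactor the data validation pipeline"},
    },
}
print(json.dumps(request, indent=2))
```

MCP clients such as VS Code construct and send these messages for you; the sketch is only useful for debugging the server by hand.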
## Keeping the database fresh
Re-run `export_to_sqlite()` after editing R source files, or, for large projects, use `update_graph_incremental()` followed by `export_to_sqlite()`:
```r
graph <- update_graph_incremental(
  graph,
  changed_files = "R/my_function.R"
)
export_to_sqlite(graph, "path/to/your/project/.rrlmgraph/graph.sqlite")
```

A targets pipeline or a Git pre-commit hook that runs this two-step update keeps the graph in sync with every commit.
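One possible shape for such a pre-commit hook, saved as `.git/hooks/pre-commit` and made executable, is sketched below. The hook body is an assumption, not something shipped with rrlmgraph, and it presumes `Rscript` is on `PATH` and the database lives at the default `.rrlmgraph/graph.sqlite` location:

```shell
#!/bin/sh
# Sketch: re-export the knowledge graph whenever staged R files change.
# Skips quietly when there are no staged .R files or Rscript is missing.
sync_graph() {
  changed=$(git diff --cached --name-only -- '*.R' 2>/dev/null)
  [ -z "$changed" ] && return 0
  command -v Rscript >/dev/null 2>&1 || return 0
  Rscript -e 'library(rrlmgraph)
    graph <- build_rrlm_graph(".")
    export_to_sqlite(graph, ".rrlmgraph/graph.sqlite")' || return 1
  # Stage the refreshed database so it travels with the commit.
  git add .rrlmgraph/graph.sqlite
}
sync_graph
```

Rebuilding the full graph on every commit is the simple option; swapping in the incremental update shown above is a further optimization for large projects.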