The Nx MCP server is a Model Context Protocol implementation that connects AI agents to systems they can't easily reach on their own: Nx Cloud CI pipelines, self-healing fixes, task monitoring, and up-to-date Nx documentation.
For an overview of how Nx enhances AI assistants, along with powerful use cases, see the Enhance Your LLM guide.
Installation
There are a few ways to set up the Nx MCP server.
Via .mcp.json

For Nx >= 21.4:

```json
{
  "servers": {
    "nx-mcp": {
      "type": "stdio",
      "command": "npx",
      "args": ["nx", "mcp"]
    }
  }
}
```

For older versions:

```json
{
  "servers": {
    "nx-mcp": {
      "type": "stdio",
      "command": "npx",
      "args": ["nx-mcp@latest"]
    }
  }
}
```

Via Nx Console Extension
If you're using Cursor or VS Code, install the Nx Console extension, which automatically manages the MCP server for you.
Client-Specific Setup
If your preferred client is not listed, configure your client to run `npx nx mcp` (Nx >= 21.4) or `npx nx-mcp@latest` (older versions) to start the Nx MCP server.
Claude Code
```shell
# Nx >= 21.4
claude mcp add nx-mcp npx nx mcp

# Older versions
claude mcp add nx-mcp npx nx-mcp@latest
```

VS Code
```shell
# Nx >= 21.4
code --add-mcp '{"name":"nx-mcp","command":"npx","args":["nx","mcp"]}'

# Older versions
code --add-mcp '{"name":"nx-mcp","command":"npx","args":["nx-mcp"]}'
```

Alternatively, configure it in your VS Code settings or use the Nx Console extension for automatic setup.
Cursor
- Install Nx Console from the marketplace
- You'll receive a notification to "Improve Copilot/AI agent with Nx-specific context"
- Click "Yes" to configure the MCP server
If you miss the notification, run the nx.configureMcpServer command from the command palette (Ctrl/Cmd + Shift + P).
JetBrains IDEs
- Install Nx Console from the marketplace
- You'll receive a notification to "Improve Copilot/AI agent with Nx-specific context"
- Click "Yes" to configure the MCP server
If you miss the notification, run the Nx: Setup MCP Server command from the command palette (Ctrl/Cmd + Shift + A).
Command-Line Options
The `nx mcp` command (or `npx nx-mcp` for older versions) accepts the following options:

| Option | Alias | Description |
|---|---|---|
| `[workspacePath]` | `-w` | Path to the Nx workspace root. Defaults to the current working directory if not provided. |
| `--transport <type>` | | Transport protocol to use: `stdio` (default), `sse`, or `http` |
| `--port <port>` | `-p` | Port to use for the HTTP/SSE server (default: 9921). Only valid with `--transport sse` or `--transport http`. |
| `--tools <patterns...>` | `-t` | Filter which tools are enabled. Accepts glob patterns including negation (e.g., `"*"`, `"!nx_docs"`, `"cloud_*"`) |
| `--minimal` | | Hide workspace analysis tools (default: true). Use `--no-minimal` to expose all tools. See Minimal Mode. |
| `--disableTelemetry` | | Disable sending of telemetry data |
| `--debugLogs` | | Enable debug logging |
| `--help` | | Display help information |
HTTP Transport
If you want to host the server instead of communicating via stdio, use the `--transport` and `--port` flags:
```shell
# Nx >= 21.4
npx nx mcp --transport sse --port 9921

# Older versions
npx nx-mcp@latest --transport sse --port 9921
```

The HTTP transport supports multiple concurrent connections, allowing different clients to connect simultaneously with independent sessions.
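Once the server is hosted, clients connect by URL instead of launching a process. As an illustrative sketch only (the configuration keys and the endpoint path are assumptions that depend on your MCP client; check its documentation for the exact shape), a remote-server entry might look like:

```json
{
  "servers": {
    "nx-mcp": {
      "type": "sse",
      "url": "http://localhost:9921/sse"
    }
  }
}
```

The port in the URL should match whatever you passed to `--port`.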
Filtering Tools
You can limit which tools are available using the `--tools` option with glob patterns:
```shell
# Enable all tools (default)
npx nx mcp --tools "*"

# Disable specific tools
npx nx mcp --tools "*" "!nx_docs"

# Enable only cloud analytics tools
npx nx mcp --tools "cloud_*"

# Enable workspace tools only
npx nx mcp --tools "nx_workspace" "nx_project_details" "nx_docs"
```

For older Nx versions, replace `npx nx mcp` with `npx nx-mcp@latest` in the commands above.

This is useful when you want to restrict the LLM's capabilities or reduce noise in the tool list.
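The exact matching semantics live in the server, but as a rough mental model, a sketch of order-sensitive include/negate glob filtering might look like this (this is an illustrative approximation, not the actual Nx implementation; `matches` and `isEnabled` are hypothetical helpers):

```typescript
// Convert a simple glob (only "*" wildcards) into a RegExp and test a name.
function matches(pattern: string, name: string): boolean {
  const escaped = pattern
    .split("*")
    .map((s) => s.replace(/[.+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp("^" + escaped + "$").test(name);
}

// A tool starts disabled; patterns are applied in order, and a leading "!"
// negates, so the last matching pattern wins.
function isEnabled(tool: string, patterns: string[]): boolean {
  let enabled = false;
  for (const p of patterns) {
    if (p.startsWith("!")) {
      if (matches(p.slice(1), tool)) enabled = false;
    } else if (matches(p, tool)) {
      enabled = true;
    }
  }
  return enabled;
}
```

Under this model, `--tools "*" "!nx_docs"` enables everything and then carves out `nx_docs`, while `--tools "cloud_*"` enables only the cloud analytics tools.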
Minimal Mode (Default)
By default, the MCP server runs in minimal mode (`--minimal`), which hides workspace analysis and generator tools. These capabilities are now handled more efficiently by agent skills, which provide domain knowledge as incrementally-loaded instructions rather than tool-call-based data dumps. This reduces token usage and keeps the tool list focused on what MCP does best: connectivity to Nx Cloud and running processes.
To restore all tools (e.g., for clients that don't support skills), pass --no-minimal:
```shell
# Nx >= 21.4
npx nx mcp --no-minimal

# Older versions
npx nx-mcp@latest --no-minimal
```

When the Nx Console extension manages the MCP server in VS Code or Cursor, it automatically enables minimal mode if it detects that agent skills are installed in the workspace.
Available Tools
Documentation Tools
Section titled “Documentation Tools”| Tool | Description |
|---|---|
| `nx_docs` | Returns documentation sections relevant to user queries about Nx configuration and best practices |
Task Monitoring Tools
| Tool | Description |
|---|---|
| `nx_current_running_tasks_details` | Lists currently running Nx TUI processes and their task statuses |
| `nx_current_running_task_output` | Returns terminal output for specific running tasks |
Visualization Tools
| Tool | Description |
|---|---|
| `nx_visualize_graph` | Opens interactive project or task graph visualizations (requires a running IDE instance with Nx Console) |
Nx Cloud CI Tools
These tools enable AI agents to interact with CI pipelines and self-healing capabilities:
| Tool | Description |
|---|---|
| `ci_information` | Retrieves CI pipeline execution information from Nx Cloud for the current branch. Returns pipeline status, failed tasks, and self-healing status. |
| `update_self_healing_fix` | Applies or rejects a self-healing CI fix from Nx Cloud. Records the decision on the suggested fix. |
Nx Cloud Analytics Tools
These tools provide analytics and insights into your CI/CD data:
| Tool | Description |
|---|---|
| `cloud_analytics_pipeline_executions_search` | Analyzes historical pipeline execution data to identify trends and patterns in CI/CD workflows |
| `cloud_analytics_pipeline_execution_details` | Retrieves detailed data for a specific pipeline execution to investigate performance bottlenecks |
| `cloud_analytics_runs_search` | Analyzes historical run data to track performance trends and team productivity patterns |
| `cloud_analytics_run_details` | Retrieves detailed data for a specific run to investigate command execution performance |
| `cloud_analytics_tasks_search` | Analyzes aggregated task performance statistics including success rates and cache hit rates |
| `cloud_analytics_task_executions_search` | Analyzes individual task execution data to investigate performance trends over time |
Extended Tools (--no-minimal)
The following tools are hidden by default in minimal mode. They're available when running with `--no-minimal`, which is useful for AI clients that don't support skills or instruction files.
Workspace Tools
| Tool | Description |
|---|---|
| `nx_workspace` | Returns a readable representation of the project graph and nx.json configuration. Also returns any project graph errors if present. |
| `nx_workspace_path` | Returns the path to the Nx workspace root |
| `nx_project_details` | Returns complete project configuration in JSON format for a specific project, including targets, dependencies, and metadata |
| `nx_available_plugins` | Lists available Nx plugins from the core team and local workspace plugins |
Generator Tools
| Tool | Description |
|---|---|
| `nx_generators` | Returns a list of available code generators in your workspace |
| `nx_generator_schema` | Returns the detailed JSON schema for a specific Nx generator, including all available options |
| `nx_run_generator` | Opens the Nx Console Generate UI with prefilled options (requires a running IDE instance with Nx Console) |
Available Resources
When connected to an Nx Cloud-enabled workspace, the Nx MCP server automatically exposes recent CI Pipeline Executions (CIPEs) as MCP resources. Resources appear in your AI tool's resource picker, allowing the LLM to access detailed information about CI runs including:
- Failed tasks and their error messages
- Terminal output from task executions
- Affected files in the pipeline run
- Build timing and performance data
Tool Usage Examples
Accessing Documentation
Get accurate guidance without hallucinations:
```
How do I configure Nx release for conventional commits?
What are the caching options for tasks?
```

The `nx_docs` tool retrieves relevant, up-to-date documentation based on your query.
Analyzing CI Failures
When a CI build fails, your AI assistant can:
```
What failed in my last CI run?
Help me fix the build error
```

Using the Nx Cloud tools, the AI can:
- Access detailed information about the failed build
- Retrieve terminal output from failed tasks
- Understand what changed and suggest fixes
Autonomous CI Monitoring
With the `ci_information` and `update_self_healing_fix` tools, AI agents can monitor CI pipelines and interact with self-healing. For a ready-to-use implementation, see the CI monitoring skill in Claude Code:

```
Monitor CI and fix any failures
```

The AI agent can:

- Poll CI pipeline status via `ci_information`
- Receive failure context and self-healing suggestions
- Apply verified fixes via `update_self_healing_fix`
- Continue iterating until CI passes
Workspace Exploration and Code Generation
For workspace exploration, code generation, and task execution, agents use skills rather than MCP tools. Skills teach agents how to use the Nx CLI directly, running commands like `nx show projects`, `nx show project myapp`, and `nx generate`, which is more token-efficient than fetching large JSON payloads via MCP.
If your AI client doesn't support skills, pass `--no-minimal` to expose the full set of workspace and generator tools via MCP.