How to Easily Use the New MCP Server and Client Nodes in n8n

n8n just released native support for MCP server and MCP client nodes—bringing the Model Context Protocol (MCP) into your workflows. In this post, I’ll break down what MCP is and show you how to set up the new MCP server and client nodes in n8n.


What is MCP?

MCP stands for Model Context Protocol, introduced by Anthropic, the company behind the Claude family of models (including Claude 3.7 Sonnet). It’s a new standard for how large language models (LLMs) interact with external tools and systems. Think of it as an intelligent, structured way for your LLM to know what’s available, what it can do, and how to do it.

Since its release in late 2024, MCP has gained traction across the AI industry, with OpenAI, various SaaS platforms, and desktop tools jumping on board. The goal is to create a more standardized, scalable method for integrating LLMs into applications, rather than reinventing the wheel with custom APIs or brittle prompt engineering.


Key Concepts in MCP

  • MCP Host: The LLM-powered app that needs context or tool access. Example: Claude Desktop.
  • MCP Server: Provides a set of tools (functions, workflows, etc.) that the host can access.
  • MCP Client: Connects the host to the server, facilitating tool execution.

The MCP server acts like an API with documentation, telling the LLM what tools it can use, how to use them, and making them available in real time.
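
Under the hood, MCP messages are JSON-RPC. As a rough sketch (not n8n’s actual output; the tool name and schema below are placeholders), the result of a tools/list request from a host might look something like this:

  {
    "tools": [
      {
        "name": "calculator",
        "description": "Evaluates a math expression",
        "inputSchema": {
          "type": "object",
          "properties": { "input": { "type": "string" } },
          "required": ["input"]
        }
      }
    ]
  }

The host passes this listing to the LLM, which can then decide when and how to call each tool.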


Setting Up the MCP Server in n8n

Let’s walk through creating an MCP server inside n8n.

  1. Add the MCP Server Node
    In a new workflow, search for and add the MCP Server Trigger node. This serves as the entry point for any host that wants to use your tools. For now, it has no configuration options—just add and activate the workflow.
  2. Add a Tool
    For this example, let’s add a simple calculator function. LLMs struggle with arithmetic, so tools like this are a great fit. Add the Calculator node to the workflow and connect it to the MCP Server Trigger.
  3. Get Your Webhook URL
    Once the workflow is activated, copy the production webhook URL from the trigger node (not the test URL). This is what the MCP host (e.g. Claude Desktop) will use to reach your server. A quick way to check the endpoint is shown just after these steps.
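
Before pointing a host at it, you can sanity-check that the endpoint is live. Assuming the server uses the SSE transport, one simple way is to open the URL with curl; the -N flag disables buffering so the event stream stays visible:

  curl -N YOUR_WEBHOOK_URL

If the workflow is active, the connection should stay open and stream SSE events; an immediate error usually means the workflow isn’t activated or you copied the test URL instead of the production one.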


Connecting Claude Desktop

To let Claude use your n8n tools:

  1. Install Claude Desktop
    You’ll need:
    • Claude Desktop (from Anthropic)
    • Node.js installed
    • The supergateway npm package, which bridges SSE to stdio (since Claude Desktop doesn’t natively support SSE connections yet)
  2. Enable Developer Mode
    In Claude Desktop, enable Developer Mode from the Help menu.
  3. Configure MCP Servers
    • Click Edit Config in the Developer Settings.
    • Add your MCP server details using the JSON below.
    • Replace the webhook URL with the one from your n8n MCP Server Trigger.
    { "mcpServers": { "n8n": { "command": "npx", "args": [ "-y", "supergateway", "--sse", "REPLACE-ME-WITH-N8N-WEBHOOK-URL" ] } } }
  4. Pro tip: If Claude doesn’t seem to connect, try running supergateway manually:
    npx -y supergateway --sse YOUR_WEBHOOK_URL
    This often reveals Node-related permission issues that Claude Desktop hides.

Testing the Tool

Once configured, Claude should show your MCP server and available tools—in this case, the calculator. Try a test query like 50 * 10, and Claude should use the calculator tool and return the correct result.
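
For context on what happens over the wire, the host ends up issuing a tools/call request through the MCP client. A simplified sketch of that request (the tool and parameter names are assumptions; they depend on how n8n exposes your calculator) might look like:

  {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
      "name": "calculator",
      "arguments": { "input": "50 * 10" }
    }
  }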

You can confirm it worked by checking the workflow execution logs in n8n.


Setting Up the MCP Client in n8n

Now let’s reverse the flow. What if your AI agent (the host) needs to connect out to an external MCP server?

  1. Add the MCP Client Node
    Create a new workflow. Add the MCP Client node.
  2. Connect to the MCP Server
    In the client node, create a credential with the SSE endpoint (the webhook URL from your MCP server workflow).
  3. Select Tools
    Once connected, you’ll see the tools offered by that server. Select the one you want—e.g., the calculator—and send a test query.

Again, you can confirm everything is working by checking the logs on both the client and server workflows.
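
If you’re curious what an MCP client is doing behind the scenes, here’s a minimal TypeScript sketch using the official @modelcontextprotocol/sdk package. It assumes the SSE transport and a tool named "calculator" that takes a single "input" string, both of which may differ from what your server actually exposes; inside n8n, the MCP Client node handles all of this for you.

  // Sketch: connect to an MCP server over SSE, list its tools, and call one.
  // Assumes: npm install @modelcontextprotocol/sdk
  import { Client } from "@modelcontextprotocol/sdk/client/index.js";
  import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

  async function main() {
    // The production webhook URL from your n8n MCP Server Trigger (placeholder).
    const transport = new SSEClientTransport(new URL("YOUR_WEBHOOK_URL"));
    const client = new Client({ name: "demo-client", version: "1.0.0" });

    await client.connect(transport);

    // Discover the tools the server advertises.
    const { tools } = await client.listTools();
    console.log(tools.map((t) => t.name));

    // Call one of them. The name and argument shape are assumptions for this example.
    const result = await client.callTool({
      name: "calculator",
      arguments: { input: "50 * 10" },
    });
    console.log(result);

    await client.close();
  }

  main().catch(console.error);

The point of the node is that none of this code is necessary: you only supply the SSE endpoint as a credential and pick the tools you want to expose to your agent.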


Final Thoughts

This new MCP integration in n8n opens up some powerful possibilities:

  • Build tools in n8n that LLMs can invoke in real time.
  • Use n8n as a backend for Claude or other AI agents.
  • Easily expose secure, custom tools—without writing traditional APIs.

This is still in beta, and the n8n team is actively looking for feedback. If you try it out, head over to the n8n community and share your experience.

