Stop Building in n8n: How to Build Workflows with Claude Code
The most efficient way to build n8n workflows in 2025 isn't the drag-and-drop editor—it's building them programmatically with Claude Code combined with the n8n MCP server. This setup lets you prompt your way to fully fleshed-out agents, giving an AI permission to directly create, edit, and fix nodes inside your n8n instance via the API.
I've been testing this extensively, and here is the reality: manual building is dead. When you build with Claude Code, you aren't just making a JSON file in a vacuum. You are creating a foundation where the AI knows your frontend, your backend, your database, and your payments. This is how we move from simple automations to real AI products.
Here is exactly how to set up this machine and start prompting your way to complex agents.
Why You Should Stop Building Inside the n8n Editor
Most people get this wrong. They view n8n as a canvas where you drag nodes around until something works. That's fine for a simple email trigger, but it falls apart when you try to build robust AI systems.
When we build using Claude Code (Anthropic's CLI tool), we are creating an ecosystem. Since Claude stores the context of your project, it will know every workflow, node, and connection you've built. This means later, when you want to add a React frontend or a Stripe payment integration, Claude already knows exactly how your backend logic works.
We are no longer dealing with isolated JSON files. We are building scalable software. To do this, we need three specific tools to work together:
- Claude Code: The CLI interface for the AI.
- The n8n MCP Server: Acts as the bridge allowing Claude to "see" and control your n8n instance.
- n8n Skills: Custom instructions that teach Claude exactly how to use the MCP server effectively.
How to Set Up the n8n MCP Server
This isn't an official release from n8n yet; it's a community tool created by developer Czlonkowski (check his repo). It gives Claude access to over 500 nodes, their documentation, and 2,700+ workflow templates.
When I tell Claude to "build a workflow," it utilizes this server to look up the node schemas and real-world examples before writing a single line of code.
Step 1: Initialize the Configuration
First, open your Claude Code project folder (e.g., n8n-cc-tutorial). You need to create a configuration file named .mcp.json in the project root.
Paste the following configuration into that file (you can grab the raw code from the GitHub repo, but here is the structure):
```json
{
  "mcpServers": {
    "n8n": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n"
      ],
      "env": {
        "N8N_API_KEY": "YOUR_KEY_HERE",
        "N8N_HOST": "YOUR_n8n_URL"
      }
    }
  }
}
```
Important for Windows Users: If you are on Windows (not WSL, just standard Windows), change the command from npx to cmd and prepend /c npx to the arguments, so that cmd launches npx for you. Mac users can leave it as is.
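For reference, the Windows variant of the same file might look like the sketch below. The package name and environment variables are carried over from the block above (verify them against the repo's README); the cmd /c wrapper is needed because Windows cannot launch npx as a direct executable.

```json
{
  "mcpServers": {
    "n8n": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "@modelcontextprotocol/server-n8n"],
      "env": {
        "N8N_API_KEY": "YOUR_KEY_HERE",
        "N8N_HOST": "YOUR_n8n_URL"
      }
    }
  }
}
```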
Step 2: Connect Your Credentials
You need to give the MCP server permission to talk to your instance.
- API URL: Copy your n8n URL (everything before /home). If you are hosting on cloud, it looks like https://your-instance.app.n8n.cloud. Make sure there is no trailing slash.
- API Key: Go to your n8n dashboard → Settings → n8n API → Create API Key. Copy that key.
Paste these into your .mcp.json file and save it.
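The URL rules above (drop the editor path, no trailing slash) are the most common source of connection failures, so here is a tiny Python sketch that encodes them. The helper name is hypothetical, not part of any tool—it just shows the normalization to apply before pasting the URL into your MCP config:

```python
# Sketch: normalize an n8n URL copied from the browser before using it
# as the API URL. normalize_n8n_host is a hypothetical helper; it just
# encodes the two rules: drop the editor path, drop the trailing slash.

def normalize_n8n_host(url: str) -> str:
    """Return the bare instance URL the MCP server expects."""
    # Cut off the editor UI path if the URL was copied from the browser.
    for marker in ("/home", "/workflow"):
        idx = url.find(marker)
        if idx != -1:
            url = url[:idx]
    # A trailing slash is a common cause of silent connection failures.
    return url.rstrip("/")


print(normalize_n8n_host("https://your-instance.app.n8n.cloud/home/workflows"))
# https://your-instance.app.n8n.cloud
```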
Step 3: Verify the Connection
Restart Claude Code (type /exit and start it again). Once it loads, type /mcp. You should see a status message confirming that the n8n-mcp is connected. If you see that, you have successfully bridged the gap between the AI and your automation engine.
Installing n8n Custom Skills
If the MCP server is the tool, Skills are the instruction manual.
Skills are custom instructions for Claude Code that do not live in your system prompt, meaning they don't bloat your context window (saving you tokens and money).
I recommend doing a manual installation to ensure they work globally across all your projects:
- Clone the n8n Skills repository into your project folder.
- Tell Claude: "I need to install the skills from this repo. Here are the manual instructions..." (Paste the instructions from the repo readme).
- Once installed, tell Claude: "Go ahead and remove the repo from my project. I just need the skills globally available."
This is crucial. By making them global, you can spin up a new project next week for a completely different client, and Claude will still know exactly how to architect n8n workflows without you reinstalling everything.
Case Study: Building a Multimodal Nutrition Tracker
To prove this works, I had Claude build a nutrition tracker from scratch.
The Goal: A system where I can send a voice note or text to Telegram saying "I ate two slices of pizza," and the AI calculates macros, logs it to a database (Supabase), and confirms it back to me.
The Prompt:
"I want a multimodal meal tracking system via Telegram. Use Plan Mode. I want to tell it what I ate, have it estimate calories/macros, and log it to Supabase."
The Result:
Claude used the MCP server to search for relevant nodes (search_node). It proposed a flow:
- Telegram Trigger: Checks if the input is voice or text.
- Whisper Model: Transcribes audio if necessary.
- OpenAI: Analyzes the text for nutritional data.
- Supabase: Logs the JSON data.
- Telegram Output: Sends a confirmation.
It didn't just give me the JSON code—it actually created the workflow inside my n8n instance automatically via the API.
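Under the hood, what Claude pushes through the API is standard n8n workflow JSON. A heavily simplified sketch of the shape (node names and parameters here are illustrative, not Claude's exact output):

```json
{
  "name": "Telegram Meal Tracker",
  "nodes": [
    {
      "name": "Telegram Trigger",
      "type": "n8n-nodes-base.telegramTrigger",
      "typeVersion": 1,
      "position": [0, 0],
      "parameters": { "updates": ["message"] }
    },
    {
      "name": "Log to Supabase",
      "type": "n8n-nodes-base.supabase",
      "typeVersion": 1,
      "position": [220, 0],
      "parameters": {}
    }
  ],
  "connections": {
    "Telegram Trigger": {
      "main": [[{ "node": "Log to Supabase", "type": "main", "index": 0 }]]
    }
  }
}
```

Note that the connections object is keyed by node name. This is why Claude's ability to read the live workflow through the MCP server matters: it always knows the real node names, so it can wire new nodes in without breaking existing links.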
Troubleshooting with Context
This is where it gets interesting.
When I first ran the workflow, the Supabase node failed. In the old days, I would have to screenshot the error, paste it into ChatGPT, copy the fix, and manually update the node.
With this setup, I just pasted the error log into the terminal. Because Claude has the context and the MCP connection, it recognized that the Supabase node was missing the data fields. It edited the node directly without me touching the UI.
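For illustration, the kind of edit Claude made looks roughly like this: a Supabase node's parameters with the missing field mappings filled in. The table and column names are hypothetical, and the exact parameter schema depends on your node version—this is a sketch of the shape, not the literal fix:

```json
{
  "parameters": {
    "operation": "create",
    "tableId": "meals",
    "dataToSend": "defineBelow",
    "fieldsUi": {
      "fieldValues": [
        { "fieldId": "meal", "fieldValue": "={{ $json.meal }}" },
        { "fieldId": "calories", "fieldValue": "={{ $json.calories }}" }
      ]
    }
  }
}
```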
I ran the test again: "I had one slice of pizza." Result: Success. Logged to Supabase, confirmation received on Telegram.
Frequently Asked Questions
What is the n8n MCP Server?
MCP stands for Model Context Protocol. Think of the server as a universal translator that lets AI models (like Claude) understand and interact with external systems (like your n8n instance) in a safe, structured way.
Can Claude actually edit my workflows?
Yes. If you configure the API key correctly in your .mcp.json, Claude can create new workflows and update existing nodes. It can fix bugs directly in your instance without you needing to drag and drop anything.
Do I need to pay for Claude Code?
The Claude Code CLI itself is free to install; you pay for the tokens you use, through API credits or a Claude subscription. Because Skills keep instructions out of the main context window, this method is actually more token-efficient than pasting massive JSON files into a standard chat window.
The Takeaway
We are moving past the era of manual drag-and-drop automation. By setting up the n8n MCP server and Skills, you aren't just saving time on setup—you are building a scalable ecosystem where your AI understands the full context of your product.
This is just step one. In the next guide, I'll show you how to take this backend and attach a real frontend and payment processing to it.
If you want to dive deeper into these builds, join the free Chase AI community for templates, prompts, and live breakdowns.


