This video shows how to connect *n8n MCP* (Model Context Protocol) to *Gemini CLI* so an AI model can create, update, and validate n8n workflows automatically. You'll learn how to run n8n-MCP locally with Docker, connect it to your n8n instance using the n8n API, and give Gemini structured access to n8n nodes, schemas, operations, and real workflow templates. This setup lets AI generate *valid* n8n workflows instead of guessing JSON or breaking node configurations. If you're trying to use AI to build n8n automations, this is the missing piece. Minimal command and configuration sketches for the main steps are included at the end of this description.

## What this tutorial covers

* What n8n MCP is and how it works
* Why AI fails at n8n workflow generation without MCP
* Running n8n MCP locally with Docker
* Connecting Gemini CLI to n8n MCP
* Generating an n8n API key
* Full MCP configuration for automatic workflow creation
* Creating an n8n workflow using AI
* Automatically pushing workflows into n8n
* Fixing and iterating on workflows with AI

This is a hands-on technical tutorial, not a high-level overview.

## Files & Resources

* n8n-MCP repository: https://github.com/czlonkowski/n8n-mcp
* Gemini CLI documentation: https://geminicli.com/docs/

## Chapters

00:00 – Introduction
00:17 – Gemini CLI setup (Docker + local install)
01:22 – n8n-MCP setup (Docker)
02:28 – Connecting Gemini CLI to n8n-MCP
04:38 – Automatic workflow creation inside n8n
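
## Configuration sketches

The snippets below are minimal sketches of the steps covered in the video, not verbatim copies of what's shown on screen; names, ports, and placeholder values are assumptions you should adjust to your setup. For the Gemini CLI setup, the local install is a global npm install of the `@google/gemini-cli` package (see the Gemini CLI docs for the Docker option).

```bash
# Sketch: install Gemini CLI locally via npm, then launch the interactive CLI.
npm install -g @google/gemini-cli
gemini
```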
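
To run n8n-MCP locally with Docker, the sketch below assumes the image published by the n8n-mcp repository (`ghcr.io/czlonkowski/n8n-mcp`) and the environment variables its README documents (`MCP_MODE`, `N8N_API_URL`, `N8N_API_KEY`); check the repository for the current names. `host.docker.internal` lets the container reach an n8n instance running on the host (on Linux, add `--add-host=host.docker.internal:host-gateway`).

```bash
# Sketch: run n8n-MCP as a stdio MCP server in Docker.
# Image name and env vars follow the n8n-mcp README; verify them in the repo.
docker run -i --rm \
  -e MCP_MODE=stdio \
  -e N8N_API_URL=http://host.docker.internal:5678 \
  -e N8N_API_KEY=<your-n8n-api-key> \
  ghcr.io/czlonkowski/n8n-mcp:latest
```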
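
To connect Gemini CLI to n8n-MCP, Gemini CLI reads MCP servers from its `settings.json` (user-level config lives in `~/.gemini/settings.json`) under an `mcpServers` key. The entry below is a sketch that wraps the same Docker command; the server name `n8n-mcp` is arbitrary.

```json
{
  "mcpServers": {
    "n8n-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "MCP_MODE=stdio",
        "-e", "N8N_API_URL=http://host.docker.internal:5678",
        "-e", "N8N_API_KEY=<your-n8n-api-key>",
        "ghcr.io/czlonkowski/n8n-mcp:latest"
      ]
    }
  }
}
```

After restarting Gemini CLI, the `/mcp` command lists the configured servers and their tools, which is a quick way to confirm the connection.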
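
The n8n API key is created inside n8n under Settings → n8n API. Before wiring it into the MCP config, you can verify the key and URL directly against n8n's public REST API, which authenticates with the `X-N8N-API-KEY` header; the sketch below assumes a default local n8n on port 5678.

```bash
# Sketch: confirm the API key works against a local n8n instance.
curl -s \
  -H "X-N8N-API-KEY: <your-n8n-api-key>" \
  http://localhost:5678/api/v1/workflows
```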