I'm currently exploring ways to build a fully automated, self-hosted setup where I can describe workflows in natural language and have them created in n8n via MCP (Model Context Protocol). I know there are great Claude integrations out there, but I'd like to avoid Claude and focus on other options, such as GPT (ChatGPT Plus), Gemini CLI, or even OpenAI Codex.
My goals:
• Generate n8n workflows via natural language (e.g. "Every morning at 7, get all new AI tools → save to Airtable → send email → publish blog post"; see the sketch after this list)
• Keep the stack low-cost or free (max. 20 CHF/month)
• Have it running 24/7 and accessible from anywhere (not just on local Wi-Fi)
• Use tools like ChatGPT Plus, Gemini CLI, and the SuperAssistant MCP (already tested)
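
Whichever LLM sits in the loop, the text it produces ultimately has to land in n8n's own REST API as a workflow definition. Here is a minimal TypeScript sketch of that last step, assuming the instance has the public API enabled and an API key configured; the workflow object and its node parameters are illustrative placeholders for whatever the LLM or MCP server would actually generate, and the exact node parameter shape may differ between n8n versions:

```typescript
// Sketch: push an LLM-generated workflow definition into n8n via its REST API.
// Assumes N8N_URL / N8N_API_KEY are set and the public API is enabled.
const N8N_URL = process.env.N8N_URL ?? "http://localhost:5678";
const N8N_API_KEY = process.env.N8N_API_KEY ?? "";

// In the real setup this object would come from the LLM, not be hard-coded.
const workflow = {
  name: "Morning AI tools digest",
  nodes: [
    {
      name: "Every morning at 7",
      type: "n8n-nodes-base.scheduleTrigger",
      typeVersion: 1,
      position: [0, 0],
      // Illustrative cron rule (07:00 daily); adjust to your n8n version.
      parameters: {
        rule: { interval: [{ field: "cronExpression", expression: "0 7 * * *" }] },
      },
    },
    // ...Airtable, email and blog-post nodes would follow here...
  ],
  connections: {},
  settings: {},
};

async function createWorkflow() {
  const res = await fetch(`${N8N_URL}/api/v1/workflows`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-N8N-API-KEY": N8N_API_KEY, // API key created in the n8n UI
    },
    body: JSON.stringify(workflow),
  });
  if (!res.ok) throw new Error(`n8n API error: ${res.status} ${await res.text()}`);
  console.log("Created workflow:", (await res.json()).id);
}

createWorkflow().catch(console.error);
```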
I'm still unsure about the infrastructure:
• Oracle Cloud Free Tier, a cheap VPS, or maybe just a dedicated laptop?
• Has anyone tried this with Hostinger or other shared hosting services?
• What are the gotchas when running n8n and an MCP server together?
I'm also comparing GitHub projects like:
• czlonkowski/n8n-mcp
• salacoste/mcp-n8n-workflow-builder
• Others that let you generate or update workflows directly via an LLM
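
For orientation, this is roughly how any MCP-capable client (ChatGPT, Gemini CLI, or your own glue code) would drive one of these servers. A sketch using the official TypeScript MCP SDK; the launch command, environment variable names, tool name, and argument shape are assumptions for illustration, not facts about those specific repos, so check each project's README:

```typescript
// Sketch: generic MCP client connecting to an n8n MCP server over stdio.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["n8n-mcp"], // hypothetical launch command; see the repo's README
    env: {
      N8N_API_URL: process.env.N8N_API_URL ?? "http://localhost:5678",
      N8N_API_KEY: process.env.N8N_API_KEY ?? "",
    },
  });

  const client = new Client({ name: "workflow-builder", version: "0.1.0" });
  await client.connect(transport);

  // Discover what the server actually offers before calling anything.
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  // Hypothetical tool call; the real tool name/arguments depend on the server.
  const result = await client.callTool({
    name: "create_workflow",
    arguments: {
      description:
        "Every morning at 7, fetch new AI tools, save to Airtable, send an email, publish a blog post",
    },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```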
If you've built something similar, or have strong opinions about the best way to structure such a setup, I'd love to hear your thoughts:
• What tools are you using?
• Where do you host it?
• What's your favorite combo of LLM + n8n + MCP?
Any advice or shared setups would be incredibly helpful.
Thank you, Tanisha Singh!