ComfyUI's architecture can seem opaque at first glance, particularly for those approaching it from another platform or without a deep dive into its internals. This has led to some overly complex integration attempts. In reality, however, ComfyUI offers a clean, extensible interface. Every running instance acts as a full-featured API server - in fact, the ComfyUI frontend is essentially just a visual client layered on top of this API.
Architecture Overview
ComfyUI is fundamentally a node-based interface and inference engine for generative AI, designed with a clear separation between frontend and backend. Under the hood, every ComfyUI process spins up a Python HTTP server that exposes both REST and WebSocket endpoints, meaning the familiar graphical canvas you interact with is simply one of many clients talking to this API-first backend.
Because the UI itself is just a "front end" for these same endpoints, you can swap it out entirely or layer additional tooling on top without touching the core. For example, the backend defines routes like `/ws` (a WebSocket endpoint for real-time status and currently-executing-node updates) and `/prompt` (a REST endpoint to enqueue a workflow), which are the exact primitives any programmatic client or custom UI uses to interact with ComfyUI.
Beyond the core node types, ComfyUI's architecture is highly extensible via "API Nodes," special nodes that connect to closed-source or third-party AI services through external HTTP APIs. These API Nodes behave just like any other node in a workflow - once installed, they’re automatically discoverable through the same server endpoints, underscoring the system’s API-first design and seamless plugin support.
The API
There are two primary usage patterns for interacting with ComfyUI at the API level:
Editor-like Interface
In this mode, you explicitly build and manipulate workflow graphs before executing them. ComfyUI's GitHub repository includes a `script_examples` folder with Python scripts that demonstrate how to:

- Discover all available node types and their parameters.
- Assemble a workflow graph as a JSON object (mapping node IDs to `class_type`, `inputs`, and optional metadata).
- Save or load these JSON graphs to and from disk.
- Submit the graph to the server via HTTP (e.g. to `/prompt`) for execution.
Because everything - node definitions, graph structures, and execution commands - is pure JSON over HTTP, you can implement a full-featured visual editor or headless pipeline in any language or framework that supports HTTP and JSON.
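To make this concrete, here is a minimal sketch of the editor-like pattern using only the Python standard library. It assumes a local instance on the default port and a graph previously exported from the GUI (e.g. via its "Save (API Format)" export); `workflow_api.json` is just a placeholder file name.

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8188"  # assumes a local ComfyUI instance

# 1. Discover all available node types and their parameters.
with urllib.request.urlopen(f"{BASE}/object_info") as resp:
    node_catalogue = json.loads(resp.read())
print(f"{len(node_catalogue)} node classes available")

# 2. Load a graph exported from the GUI, or assemble the same
#    {node_id: {"class_type": ..., "inputs": ...}} structure as a plain dict.
with open("workflow_api.json") as f:
    workflow = json.load(f)

# 3. Enqueue it for execution.
payload = json.dumps({"prompt": workflow, "client_id": "my-tool"}).encode()
request = urllib.request.Request(
    f"{BASE}/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as resp:
    print(json.loads(resp.read()))  # e.g. {"prompt_id": "<uuid>", ...}
```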
Runtime-like Interface
This simpler mode treats ComfyUI as a pure inference engine. Typically, you maintain a library of pre-built workflow JSON templates, and at runtime:
Step 1. POST your chosen graph JSON to `http://<host>:<port>/prompt`:

```
POST /prompt
Content-Type: application/json

{
  "prompt": { …complete workflow JSON… },
  "client_id": "optional-client-id"
}
```

On success, you receive a `prompt_id` to track the job.
Step 2. Track progress by opening a WebSocket connection to `ws://<host>:<port>/ws`. This provides messages about which node is currently executing, along with overall status updates - ideal for real-time visibility into long-running workflows.
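A minimal sketch of this step, assuming the third-party `websocket-client` package (the same dependency the official script examples use):

```python
import json
import websocket  # pip install websocket-client

ws = websocket.WebSocket()
ws.connect("ws://127.0.0.1:8188/ws?clientId=my-tool")

# Print status messages until interrupted; binary frames
# (e.g. live preview images) are skipped.
while True:
    frame = ws.recv()
    if not isinstance(frame, str):
        continue
    message = json.loads(frame)
    print(message["type"], message.get("data"))
```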
Step 3. Retrieve results (e.g. output image filenames and metadata) via `GET /history/<prompt_id>`. This returns a JSON payload listing saved images, their directories, and any workflow metadata needed to fetch or display the outputs.
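For illustration, the output filenames can be pulled out of the history payload like this (the payload shape matches the full example in the appendix; the `prompt_id` is whatever `/prompt` returned):

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8188"
prompt_id = "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc"  # from the /prompt response

with urllib.request.urlopen(f"{BASE}/history/{prompt_id}") as resp:
    history = json.loads(resp.read())[prompt_id]

# Each output node id maps to the artifacts it saved.
for node_id, output in history["outputs"].items():
    for image in output.get("images", []):
        print(node_id, image["filename"], image["subfolder"], image["type"])
```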
This runtime-style pattern is ideal for integrating ComfyUI into production applications, batch-processing pipelines, or serverless functions, as it requires only minimal client logic.
Primary endpoints in use:

- `http://127.0.0.1:8188/prompt` - send a prompt into the execution queue
- `http://127.0.0.1:8188/history` - list execution history
- `http://127.0.0.1:8188/history/{prompt_id}` - returns results when complete
- `http://127.0.0.1:8188/view` - retrieve images
- `http://127.0.0.1:8188/queue` - view queued or running jobs
For deeper insight, studying the server source code is highly instructive:
- https://github.com/comfyanonymous/ComfyUI/blob/master/server.py
- https://github.com/comfyanonymous/ComfyUI/blob/master/main.py#L221
Summary
To summarize, working with ComfyUI programmatically is straightforward - simply use the `/prompt` endpoint to submit workflows, as illustrated below.
Step 1. Get the node catalogue (optional but useful): `GET /object_info` returns the entire node class library, including inputs, outputs, default values, and documentation - perfect for editor autocompletion or scripting.
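For example, assuming the per-class variant of the endpoint (`GET /object_info/<class_name>`), you can inspect a single node's input schema without downloading the whole catalogue:

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8188"

# Fetch the schema for one node class rather than the full library.
with urllib.request.urlopen(f"{BASE}/object_info/KSampler") as resp:
    info = json.loads(resp.read())["KSampler"]

# Required inputs and their type specs (name -> [type, config]).
for name, spec in info["input"]["required"].items():
    print(name, spec)
```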
Step 2. Build or load a workflow JSON:

- Create it using the GUI, or
- Assemble it programmatically (see `script_examples/build_json.py`).
Step 3. Submit the workflow:

```
curl -X POST http://127.0.0.1:8188/prompt \
  -H 'Content-Type: application/json' \
  -d '{"prompt": YOUR_WORKFLOW_JSON, "client_id": "my-tool"}'
```
On success, the server returns `{"prompt_id": "<uuid>"}`.
Step 4. Track execution progress (real-time): open a WebSocket to `ws://127.0.0.1:8188/ws?clientId=my-tool` and listen for messages:

```
{ "type": "executing", "data": {"node": 42} }
```

Useful for logging, progress bars, or live UIs.
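The conventional completion check, used by the official WebSocket example script, is to wait for an `executing` message whose `node` is `null` for your prompt id. A sketch, again assuming `websocket-client`:

```python
import json
import websocket  # pip install websocket-client

def wait_for_completion(prompt_id: str, host: str = "127.0.0.1:8188") -> None:
    """Block until the given prompt finishes executing."""
    ws = websocket.WebSocket()
    ws.connect(f"ws://{host}/ws?clientId=my-tool")
    try:
        while True:
            frame = ws.recv()
            if not isinstance(frame, str):
                continue  # ignore binary preview frames
            message = json.loads(frame)
            if message["type"] == "executing":
                data = message["data"]
                # node == None signals the whole graph has finished.
                if data.get("node") is None and data.get("prompt_id") == prompt_id:
                    return
    finally:
        ws.close()
```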
Step 5. Fetch results: once execution is complete, request `GET /history/<prompt_id>` to retrieve a list of output files and metadata. To download or stream an image, use `/view?filename=...`.
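Tying steps 3-5 together, here is a small sketch that downloads every image listed in the history via `/view`, passing the same query parameters shown in the appendix:

```python
import json
import urllib.parse
import urllib.request

BASE = "http://127.0.0.1:8188"

def download_outputs(prompt_id: str) -> None:
    """Save every output image of a finished prompt to the current directory."""
    with urllib.request.urlopen(f"{BASE}/history/{prompt_id}") as resp:
        history = json.loads(resp.read())[prompt_id]
    for output in history["outputs"].values():
        for image in output.get("images", []):
            query = urllib.parse.urlencode({
                "filename": image["filename"],
                "subfolder": image["subfolder"],
                "type": image["type"],
            })
            with urllib.request.urlopen(f"{BASE}/view?{query}") as img:
                with open(image["filename"], "wb") as out:
                    out.write(img.read())
            print("saved", image["filename"])
```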
This flow - build JSON → POST → (optional WebSocket) → GET history - is versatile enough to power both editor-based prototyping and production-scale batch jobs.
Quick Reference
| Verb | Endpoint | Purpose |
|---|---|---|
| POST | /prompt | Queue a workflow |
| GET | /history/<id> | Fetch results and metadata |
| GET | /object_info | Explore available nodes |
| GET | /queue | View queued or running jobs |
| WS | /ws | Live status updates |
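The one endpoint from this table not exercised above is `/queue`. A short sketch for polling it - note that the response field names (`queue_running`, `queue_pending`) are an assumption based on the server source, not a documented contract:

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8188"

with urllib.request.urlopen(f"{BASE}/queue") as resp:
    queue = json.loads(resp.read())

# Field names assumed from server.py; verify against your ComfyUI version.
print("running:", len(queue.get("queue_running", [])))
print("pending:", len(queue.get("queue_pending", [])))
```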
References
- Hosting a ComfyUI Workflow via API by 9elements
- Official ComfyUI API Example
- Discussion on ComfyUI Prompt API Endpoints: Github Issue 2110 and Github Issue 6607
- A video walkthrough on ComfyUI API by Methodox: Divooka Visual Programming
- Use ComfyUI in Divooka: Methodox Wiki
Appendix - Requests
Because full request/response examples are very useful and rarely documented in this level of detail, I am providing them here for quick reference.

Requests and responses:
```
# POST http://146.190.245.196:8188/prompt
{
  "prompt": {
    "3": {
      "inputs": {
        "seed": 1044669037100678,
        "steps": 20,
        "cfg": 8,
        "sampler_name": "euler",
        "scheduler": "normal",
        "denoise": 1,
        "model": ["4", 0],
        "positive": ["6", 0],
        "negative": ["7", 0],
        "latent_image": ["5", 0]
      },
      "class_type": "KSampler",
      "_meta": { "title": "KSampler" }
    },
    "4": {
      "inputs": { "ckpt_name": "v1-5-pruned-emaonly-fp16.safetensors" },
      "class_type": "CheckpointLoaderSimple",
      "_meta": { "title": "Load Checkpoint" }
    },
    "5": {
      "inputs": { "width": 512, "height": 512, "batch_size": 1 },
      "class_type": "EmptyLatentImage",
      "_meta": { "title": "Empty Latent Image" }
    },
    "6": {
      "inputs": {
        "text": "beautiful scenery nature glass bottle landscape, , purple galaxy bottle,",
        "clip": ["4", 1]
      },
      "class_type": "CLIPTextEncode",
      "_meta": { "title": "CLIP Text Encode (Prompt)" }
    },
    "7": {
      "inputs": { "text": "text, watermark", "clip": ["4", 1] },
      "class_type": "CLIPTextEncode",
      "_meta": { "title": "CLIP Text Encode (Prompt)" }
    },
    "8": {
      "inputs": { "samples": ["3", 0], "vae": ["4", 2] },
      "class_type": "VAEDecode",
      "_meta": { "title": "VAE Decode" }
    },
    "9": {
      "inputs": { "filename_prefix": "ComfyUI", "images": ["8", 0] },
      "class_type": "SaveImage",
      "_meta": { "title": "Save Image" }
    }
  }
}

# Response
{
  "prompt_id": "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc",
  "number": 1,
  "node_errors": {}
}
```
```
# GET http://146.190.245.196:8188/history/26bc628d-7ab2-43e6-8a67-d6a793e7fbcc
{
  "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc": {
    "prompt": [
      1,
      "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc",
      {
        "3": {
          "inputs": {
            "seed": 1044669037100678,
            "steps": 20,
            "cfg": 8.0,
            "sampler_name": "euler",
            "scheduler": "normal",
            "denoise": 1.0,
            "model": ["4", 0],
            "positive": ["6", 0],
            "negative": ["7", 0],
            "latent_image": ["5", 0]
          },
          "class_type": "KSampler",
          "_meta": { "title": "KSampler" }
        },
        "4": {
          "inputs": { "ckpt_name": "v1-5-pruned-emaonly-fp16.safetensors" },
          "class_type": "CheckpointLoaderSimple",
          "_meta": { "title": "Load Checkpoint" }
        },
        "5": {
          "inputs": { "width": 512, "height": 512, "batch_size": 1 },
          "class_type": "EmptyLatentImage",
          "_meta": { "title": "Empty Latent Image" }
        },
        "6": {
          "inputs": {
            "text": "beautiful scenery nature glass bottle landscape, , purple galaxy bottle,",
            "clip": ["4", 1]
          },
          "class_type": "CLIPTextEncode",
          "_meta": { "title": "CLIP Text Encode (Prompt)" }
        },
        "7": {
          "inputs": { "text": "text, watermark", "clip": ["4", 1] },
          "class_type": "CLIPTextEncode",
          "_meta": { "title": "CLIP Text Encode (Prompt)" }
        },
        "8": {
          "inputs": { "samples": ["3", 0], "vae": ["4", 2] },
          "class_type": "VAEDecode",
          "_meta": { "title": "VAE Decode" }
        },
        "9": {
          "inputs": { "filename_prefix": "ComfyUI", "images": ["8", 0] },
          "class_type": "SaveImage",
          "_meta": { "title": "Save Image" }
        }
      },
      {},
      ["9"]
    ],
    "outputs": {
      "9": {
        "images": [
          { "filename": "ComfyUI_00002_.png", "subfolder": "", "type": "output" }
        ]
      }
    },
    "status": {
      "status_str": "success",
      "completed": true,
      "messages": [
        ["execution_start", { "prompt_id": "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc", "timestamp": 1745105352150 }],
        ["execution_cached", { "nodes": ["4", "5", "6", "7"], "prompt_id": "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc", "timestamp": 1745105352151 }],
        ["execution_success", { "prompt_id": "26bc628d-7ab2-43e6-8a67-d6a793e7fbcc", "timestamp": 1745105352778 }]
      ]
    },
    "meta": {
      "9": { "node_id": "9", "display_node": "9", "parent_node": null, "real_node_id": "9" }
    }
  }
}
```
```
# GET http://146.190.245.196:8188/view?filename=ComfyUI_00002_.png&subfolder=&type=output
# Returns the raw image bytes.
```
`POST http://127.0.0.1:8188/upload/image` - multipart form data with an `image` field. Response:

```
{
  "name": "Test.jpg",
  "subfolder": "",
  "type": "input"
}
```
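A sketch of this upload in Python, using the third-party `requests` library for the multipart encoding (the `image` form field is the one shown above):

```python
import requests  # pip install requests

BASE = "http://127.0.0.1:8188"

# Upload a local file; the server stores it in its input directory.
with open("Test.jpg", "rb") as f:
    resp = requests.post(f"{BASE}/upload/image", files={"image": f})

# The returned name/subfolder/type can then be referenced by a
# LoadImage node inside a workflow.
print(resp.json())  # e.g. {"name": "Test.jpg", "subfolder": "", "type": "input"}
```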