Open Source · ComfyUI Custom Node
$ prompt-iterator --version 1.0.0

EVERY
PROMPT.
AUTOMATED.

A ComfyUI node that automatically steps through every prompt in a file on each generation run. Increment, decrement, randomize, or hold fixed. No rewiring. No manual switching. Queue and walk away.

Installation

01 · Clone into custom_nodes
cd ComfyUI/custom_nodes
git clone https://github.com/OATH-Studio/comfy-Prompt-iterator

02 · Create your prompt file
# Create prompts.txt in any location.
# One prompt per line, no special formatting:
cat > prompts.txt <<'EOF'
a beautiful landscape at sunset
cyberpunk city rain night
portrait studio soft light
EOF

03 · Restart ComfyUI
# Find the node under loaders/text → Prompt Iterator
# Wire: Prompt Iterator → KSampler (positive input)

04 · Set control_after_generate
# increment  →  walks every prompt in order
# randomize  →  random prompt each run
# Queue N runs and walk away.

Workflow Connection

CheckpointLoader → Prompt Iterator → KSampler → SaveImage

Wire the positive_prompt output into your KSampler positive input. Each generated image is automatically named after the prompt that created it.

Features

01 · Prompt List File

Load prompts from a simple text file with one prompt per line. Supports unlimited variations — just update the file and requeue.

02 · Increment Mode

Steps forward one prompt after each generation. Queue 12 runs and all 12 prompts run automatically — perfect for A/B/C testing variations.

03 · Decrement Mode

Steps backward through the list. Useful when you want to re-test in reverse order or walk back from a known good result.

04 · Randomize Mode

Picks a random prompt from the file on each run. Good for discovery runs where you want unpredictable variation across generations.

05 · Fixed Mode

Locks to the current selection and never advances. ComfyUI caches the patched model so it does not reload unnecessarily between runs.

06 · IS_CHANGED Aware

Uses IS_CHANGED correctly — non-fixed modes always re-execute so the index actually advances. Fixed mode returns a stable hash so caching still works.
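A minimal sketch of what this pattern typically looks like in a ComfyUI node (class and argument names here are illustrative, not the node's actual source):

```python
import hashlib
import random


class PromptIteratorSketch:
    @classmethod
    def IS_CHANGED(cls, prompt_file, control_after_generate, **kwargs):
        # Non-fixed modes must re-execute every run so the index advances:
        # returning a fresh random value defeats ComfyUI's output cache.
        if control_after_generate != "fixed":
            return random.random()
        # Fixed mode returns a stable hash of the inputs, so ComfyUI sees
        # "nothing changed" and reuses the cached output.
        key = f"{prompt_file}:{control_after_generate}"
        return hashlib.sha256(key.encode()).hexdigest()
```

Returning a stable value only for fixed mode is what lets caching and iteration coexist in the same node.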

Mode Behaviour

Each mode determines which prompt is selected on the next run. State is tracked per node instance so multiple nodes in the same graph iterate independently.

Mode      | Run 1    | Run 2    | Run 3    | Run 4    | Wrap behaviour
fixed     | Prompt A | Prompt A | Prompt A | Prompt A | Never advances. Caching stays active.
increment | Prompt A | Prompt B | Prompt C | Prompt D | Wraps to the first prompt after the last.
decrement | Prompt D | Prompt C | Prompt B | Prompt A | Wraps to the last prompt after index 0.
randomize | Prompt F | Prompt B | Prompt E | Prompt C | Independent random pick each run.

Node Inputs

Input                  | Type   | Default      | Description
model                  | MODEL  | -            | Base diffusion model passthrough.
clip                   | CLIP   | -            | Base CLIP encoder passthrough.
prompt_file            | STRING | prompts.txt  | Path to a text file with one prompt per line.
positive_prompt        | STRING | first prompt | Currently selected positive prompt. Used as the starting point for iteration.
negative_prompt        | STRING | -            | Optional negative prompt. Can also be iterated via a separate iterator node.
control_after_generate | LIST   | increment    | fixed · increment · decrement · randomize. Applied after each run.

Node Outputs

Output          | Type   | Description
model           | MODEL  | Patched model with the selected prompt applied.
clip            | CLIP   | Patched CLIP with the selected prompt applied.
positive_prompt | STRING | Positive prompt text used this run. Wire into your KSampler input.
current_index   | INT    | Zero-based index of the prompt used this run.
total_prompts   | INT    | Total prompts available in the selected file.

Prompt File Format

Create a simple text file with one prompt per line. The node reads the file and builds an index for iteration; blank lines and comments (lines starting with #) are automatically skipped.

Example File (prompts.txt)

# Variation 1 - Soft lighting
a beautiful landscape at sunset with golden hour lighting

# Variation 2 - Dramatic clouds
cyberpunk city rain night with dramatic storm clouds

# Variation 3 - Portrait style
portrait studio soft light professional photography

# Variation 4 - Abstract
abstract fluid art vibrant colors swirling patterns

Result

prompts.txt · 4 prompts loaded

The node loads all non-blank, non-comment lines and presents them as an iterable list. Edit the file anytime — changes take effect on next generation run.
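The loading rules above (keep non-blank lines, drop # comments) can be sketched in a few lines; `load_prompts` is an illustrative name, not the node's actual function:

```python
from pathlib import Path


def load_prompts(path):
    """Return all non-blank, non-comment lines from a prompt file."""
    prompts = []
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip blank lines and # comments
        prompts.append(stripped)
    return prompts
```

Run on the example file above, this yields the four prompt lines and discards the four comment headers.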

FAQ

Can I have multiple Prompt Iterator nodes in one graph?


Yes. Each node instance tracks its own index independently using ComfyUI's unique_id hidden input. Two nodes set to increment will walk through their respective prompt files separately without interfering with each other.
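A minimal sketch of that per-instance bookkeeping, assuming state is held in a module-level dict keyed by unique_id (names are illustrative, not the node's actual source):

```python
# Survives between runs because the node's module stays loaded
# for the lifetime of the ComfyUI process.
_STATE = {}


def advance(unique_id, total):
    """Advance and return this instance's index, independent of other nodes."""
    index = _STATE.get(unique_id, -1)
    index = (index + 1) % total
    _STATE[unique_id] = index
    return index
```

Because each instance looks up its own key, two increment nodes in the same graph step through their files without touching each other's position.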

What happens when it reaches the last prompt?


It wraps. Increment goes back to index 0, decrement goes back to the last index. You can queue more runs than you have prompts and it will cycle through again from the start.

How do I format my prompt file?


Simple text file with one prompt per line. Blank lines are ignored. Comments starting with # are skipped. No special formatting required — just clean text.

Can I iterate negative prompts too?


Yes, add another Prompt Iterator node and connect its positive_prompt output to your KSampler negative input. Each node maintains independent state.

Does fixed mode reload the file every run?


No. Fixed mode returns a stable hash from IS_CHANGED so ComfyUI sees no change and uses its cached output. The prompt is not re-read on every run, which is the correct behaviour for a static selection.

OATH Studio

Need Custom
AI Tooling?

This node is a small example of what we build. We design and develop custom AI pipelines, local inference tooling, ComfyUI integrations, and production workflows for studios and independent creators who want control over their stack.

  • Custom ComfyUI nodes and workflow automation
  • Local LLM integration and prompt engineering
  • vLLM / Ollama deployment and optimisation
  • End-to-end AI image and video pipelines
  • On-premise — your hardware, your data
Get In Touch

Technical details

Category       | loaders/text
Prompt loader  | BasicTextNode
File reader    | Python os.path / pathlib
Caching hook   | IS_CHANGED
Instance state | UNIQUE_ID hidden input
Dependencies   | none

Free · Open Source · MIT License

QUEUE IT.
WALK AWAY.

Built by OATH Studio. We make open tools for AI artists and studios, and take on custom development work for teams who need something specific. No cloud dependencies. No subscriptions.