Understanding Workflows

Last updated: Jan 2026

Overview

Understanding how workflows are structured and executed is essential for building effective automations. This guide covers the key aspects of workflow architecture in ORCFLO.

We'll explore how data moves between nodes, the differences between execution modes, and how to use the visual canvas effectively.

Workflow Structure

Every workflow has the same basic structure: an entry point, processing steps, and an exit point. This structure is consistent whether your workflow has 3 nodes or 30.

Essential Components

  • Input Node: Entry point that defines input fields (text, files, numbers)
  • Processing Nodes: LLM Steps and Decision Points that transform data
  • Output Node: Exit point that formats and returns the final result

Minimal Workflow

The simplest possible workflow consists of just Input and Output. However, most workflows include at least one LLM Step to process data with AI.
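To make the structure concrete, here is a minimal sketch of a workflow as plain data. These dataclasses and field names are illustrative only, not the ORCFLO API:

```python
# Conceptual sketch only: Node and Workflow here are hypothetical
# stand-ins, not ORCFLO types.
from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    kind: str  # "input", "llm_step", "decision", or "output"


@dataclass
class Workflow:
    nodes: list[Node] = field(default_factory=list)
    edges: list[tuple[str, str]] = field(default_factory=list)  # (source, target)


# The minimal workflow: Input connected straight to Output.
minimal = Workflow(
    nodes=[Node("in", "input"), Node("out", "output")],
    edges=[("in", "out")],
)

# A more typical workflow inserts an LLM Step between them.
typical = Workflow(
    nodes=[Node("in", "input"), Node("summarize", "llm_step"), Node("out", "output")],
    edges=[("in", "summarize"), ("summarize", "out")],
)
```

Whatever the real representation, the invariant is the same: every workflow is a set of nodes plus directed edges from an entry point to an exit point.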

Execution Modes

ORCFLO supports two execution modes that control how nodes are processed. Choose the mode that best fits your workflow's requirements.

1. Sequential Mode

Nodes execute one at a time in connection order. Simple and predictable. Use when steps depend on each other's output.

2. DAG Mode (Directed Acyclic Graph)

Independent nodes run in parallel. Faster execution for workflows with branches that don't depend on each other.
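The difference between the two modes can be sketched with Python's standard-library `graphlib`. The graph and the "wave" scheduler below are illustrative, not ORCFLO internals:

```python
# Conceptual sketch: how the two modes could schedule the same graph.
# Example graph with two independent branches: in -> (a, b) -> out.
from graphlib import TopologicalSorter

deps = {"in": set(), "a": {"in"}, "b": {"in"}, "out": {"a", "b"}}

# Sequential mode: one node at a time, in a valid connection order.
sequential_order = list(TopologicalSorter(deps).static_order())

# DAG mode: every node whose dependencies are satisfied runs in the
# same "wave"; nodes within a wave could execute in parallel.
ts = TopologicalSorter(deps)
ts.prepare()
waves = []
while ts.is_active():
    ready = list(ts.get_ready())
    waves.append(sorted(ready))
    ts.done(*ready)

# Sequential visits four nodes one by one; DAG collapses the two
# independent branches (a and b) into a single parallel wave.
```

Here `waves` comes out as `[["in"], ["a", "b"], ["out"]]`: the independent branches `a` and `b` share a wave, which is exactly the speedup DAG mode provides.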

When to Use Each Mode

  • Sequential: Linear pipelines, debugging, and workflows where order matters
  • DAG: Independent branches, parallel processing, performance-critical workflows

Start with DAG

ORCFLO defaults to DAG mode when you build a new workflow. Switch to Sequential only when your steps must run in a strict order, or when you want to debug a pipeline one node at a time.

Data Flow

Data flows automatically between connected nodes in ORCFLO. When you connect nodes, the output of the source node becomes available as input to the target node.

How Data Flows

  1. Input Collection: The Input node gathers all defined fields
  2. Automatic Passing: Connected nodes receive all upstream data automatically
  3. Processing: Each node transforms or adds to the data
  4. Output: Final data can be collected in the Output node from multiple sources
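The four steps above can be sketched as a tiny interpreter. The `run` function and the `produce` mapping are hypothetical names for illustration, not ORCFLO's API:

```python
# Conceptual sketch of automatic data passing: each node receives the
# merged outputs of everything directly upstream as its context.
def run(order, edges, produce):
    """order: nodes in execution order; edges: (source, target) pairs;
    produce: node -> fn(context) -> dict of new fields."""
    outputs = {}
    for node in order:
        # Gather all upstream data for this node's context.
        context = {}
        for src, dst in edges:
            if dst == node:
                context.update(outputs[src])
        # Each node passes its context through, plus whatever it adds.
        outputs[node] = {**context, **produce[node](context)}
    return outputs


edges = [("input", "step"), ("step", "output")]
produce = {
    "input": lambda ctx: {"topic": "workflows"},
    "step": lambda ctx: {"summary": f"about {ctx['topic']}"},
    "output": lambda ctx: {},
}
result = run(["input", "step", "output"], edges, produce)
# result["output"] contains both the original "topic" field and the
# "summary" the middle step added, without any explicit wiring.
```

The point of the sketch is the `context.update(...)` line: downstream nodes see everything upstream produced, which is why you never declare data passing by hand.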

Automatic Context

In ORCFLO, you don't need to explicitly pass data between nodes. When nodes are connected, data flows automatically. The AI in LLM Steps receives all upstream data as context.

Node Connections

Connections are the edges that link nodes together. They determine both execution order and data flow through your workflow.

Connection Types

  • Standard: Normal flow between any two compatible nodes
  • Conditional (True/False): Branches from Decision Point nodes
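A conditional connection can be pictured as routing flow down one of two edges based on the upstream data. The function and branch names below are hypothetical, not the ORCFLO API:

```python
# Conceptual sketch of a Decision Point: evaluate a condition against
# upstream context and choose the True or False branch.
def decision_point(condition, context, true_branch, false_branch):
    """Return the name of the next node to execute."""
    return true_branch if condition(context) else false_branch


next_node = decision_point(
    lambda ctx: ctx["word_count"] > 500,   # hypothetical condition
    {"word_count": 1200},                  # upstream data
    true_branch="summarize",
    false_branch="pass_through",
)
# next_node == "summarize"
```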

Creating Connections

To create a connection between two nodes:

  1. Hover over the source node to reveal its output handle (right side)
  2. Click and drag from the handle
  3. Drop on the target node's input handle (left side)

Connection Rules

A node's output can connect to multiple targets (branching), and a node can receive input from multiple sources (merging).
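Branching and merging are unrestricted, but the "Acyclic" in DAG implies one hard rule: connections must never form a loop. A sketch of such a validity check, again using the standard-library `graphlib` rather than ORCFLO's actual validator:

```python
# Conceptual sketch: edges may branch and merge freely, but a cycle
# makes the workflow unschedulable.
from graphlib import CycleError, TopologicalSorter


def is_valid(edges):
    """Return True if the edge list forms a directed acyclic graph."""
    deps = {}
    for src, dst in edges:
        deps.setdefault(dst, set()).add(src)
        deps.setdefault(src, set())
    try:
        TopologicalSorter(deps).prepare()  # raises CycleError on a loop
        return True
    except CycleError:
        return False


# Branching (one source, two targets) and merging (two sources, one
# target) are both fine:
assert is_valid([("in", "a"), ("in", "b"), ("a", "out"), ("b", "out")])
# A cycle is not:
assert not is_valid([("a", "b"), ("b", "a")])
```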

Workflow Lifecycle

Workflows go through several stages from creation to execution. Understanding this lifecycle helps you build and maintain workflows effectively.

1. Draft

Workflow is being built. Changes are auto-saved.

2. Ready

Workflow is complete and can be executed.

3. Executing

Workflow is currently running. Real-time updates appear.

4. Completed

Execution finished successfully. Results available.

5. Failed

Execution encountered an error. Error details available.
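The five stages form a simple state machine. The transition table below is inferred from the stage descriptions above, not taken from ORCFLO's code, so treat it as an assumption:

```python
# Conceptual sketch: lifecycle stages and the transitions the stage
# descriptions imply. Completed and Failed are terminal for a single
# execution in this sketch (an assumption, not documented behavior).
TRANSITIONS = {
    "draft": {"ready"},
    "ready": {"executing"},
    "executing": {"completed", "failed"},
    "completed": set(),
    "failed": set(),
}


def can_transition(current, target):
    """Return True if the lifecycle allows moving from current to target."""
    return target in TRANSITIONS[current]
```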

Execution Inspector

The execution inspector is your debugging tool. Click any node during or after execution to see detailed information about that step.

Inspector Panels

  • Inputs: Data received by the node from upstream connections
  • Outputs: Data produced by the node, passed to downstream nodes
  • Metrics: Tokens used, execution time, credit cost
  • Files: Input and output files associated with the step
  • Errors: Error messages and stack traces if the step failed

Key Takeaways

  • Workflows have a consistent structure: Input, Processing, Output
  • Choose Sequential mode for simplicity, DAG mode for parallel execution
  • Data flows automatically between connected nodes
  • Use the execution inspector to debug and understand data flow