Understanding Workflows
Last updated: Jan 2026
Overview
Understanding how workflows are structured and executed is essential for building effective automations. This guide covers the key aspects of workflow architecture in ORCFLO.
We'll explore how data moves between nodes, the differences between execution modes, and how to use the visual canvas effectively.
Workflow Structure
Every workflow has the same basic structure: an entry point, processing steps, and an exit point. This structure is consistent whether your workflow has 3 nodes or 30.
Essential Components
| Component | Description |
|---|---|
| Input Node | Entry point that defines input fields (text, files, numbers) |
| Processing Nodes | LLM Steps and Decision Points that transform data |
| Output Node | Exit point that formats and returns the final result |
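To make the three components concrete, here is one way a minimal workflow could be pictured as data. This is an illustrative sketch only: the node types, field names, and overall shape are assumptions, not ORCFLO's actual schema.

```python
# Hypothetical sketch of a three-component workflow definition.
# Node types and field names are illustrative, not ORCFLO's real schema.
workflow = {
    "nodes": [
        {"id": "input", "type": "input",
         "fields": [{"name": "article_text", "kind": "text"}]},  # entry point
        {"id": "summarize", "type": "llm_step",
         "prompt": "Summarize the article in three bullets."},   # processing
        {"id": "output", "type": "output"},                      # exit point
    ],
    "edges": [("input", "summarize"), ("summarize", "output")],
}

node_types = [node["type"] for node in workflow["nodes"]]
print(node_types)  # ['input', 'llm_step', 'output']
```

Whatever the real storage format, every workflow reduces to this pattern: a set of nodes plus the edges that connect them.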
Minimal Workflow
The simplest possible workflow consists of just Input and Output. However, most workflows include at least one LLM Step to process data with AI.
Execution Modes
ORCFLO supports two execution modes that control how nodes are processed. Choose the mode that best fits your workflow's requirements.
Sequential Mode
Nodes execute one at a time in connection order. Simple and predictable. Use when steps depend on each other's output.
DAG Mode (Directed Acyclic Graph)
Independent nodes run in parallel. Faster execution for workflows with branches that don't depend on each other.
When to Use Each Mode
| Mode | Best For |
|---|---|
| Sequential | Linear pipelines, debugging, when order matters |
| DAG | Independent branches, parallel processing, performance-critical workflows |
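The scheduling difference can be sketched with Python's standard-library `graphlib`. The four node names below are invented; the point is that Sequential mode runs one node per step, while DAG mode runs independent nodes in the same batch.

```python
from graphlib import TopologicalSorter

# Hypothetical 4-node workflow: Input feeds two independent LLM steps,
# which both feed Output. Each entry maps a node to its dependencies.
deps = {
    "input": set(),
    "summarize": {"input"},
    "translate": {"input"},
    "output": {"summarize", "translate"},
}

# Sequential mode: one node at a time, in a valid topological order.
sequential_order = list(TopologicalSorter(deps).static_order())

# DAG mode: all nodes whose dependencies are satisfied run together.
ts = TopologicalSorter(deps)
ts.prepare()
dag_batches = []
while ts.is_active():
    ready = sorted(ts.get_ready())
    dag_batches.append(ready)
    ts.done(*ready)

print(sequential_order)  # 4 steps, one node each
print(dag_batches)       # 3 batches: summarize and translate run together
```

With branches that don't depend on each other, DAG mode finishes in fewer steps; for a strictly linear pipeline the two modes schedule identically.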
Start with DAG
New workflows default to DAG mode. Switch to Sequential only when your steps form a strictly linear pipeline, or when you want predictable one-at-a-time execution for debugging.
Data Flow
Data flows automatically between connected nodes in ORCFLO. When you connect nodes, the output of the source node becomes available as input to the target node.
How Data Flows
- Input Collection: The Input node gathers all defined fields
- Automatic Passing: Connected nodes receive all upstream data automatically
- Processing: Each node transforms or adds to the data
- Output: Final data can be collected in the Output node from multiple sources
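The four steps above can be sketched as a tiny interpreter. This assumes a simplified model in which each node's context is the merge of all upstream outputs; the node names, functions, and data shapes are hypothetical.

```python
# Minimal sketch of automatic data flow: each node receives the merged
# outputs of its upstream nodes as context. Shapes here are hypothetical.
def run_workflow(nodes, upstream, run_fns):
    """nodes: node ids in execution order; upstream: node -> list of
    source ids; run_fns: node -> function taking a context dict."""
    outputs = {}
    for node in nodes:
        context = {}
        for src in upstream.get(node, []):
            context.update(outputs[src])  # upstream data arrives automatically
        outputs[node] = run_fns[node](context)
    return outputs

# Tiny example: Input -> LLM step -> Output.
outputs = run_workflow(
    nodes=["input", "llm", "output"],
    upstream={"llm": ["input"], "output": ["llm"]},
    run_fns={
        "input": lambda ctx: {"topic": "workflows"},
        "llm": lambda ctx: {"summary": f"About {ctx['topic']}"},
        "output": lambda ctx: {"result": ctx["summary"]},
    },
)
print(outputs["output"])  # {'result': 'About workflows'}
```

Note that no node explicitly passes anything: connecting `input` to `llm` is what makes `topic` available to the LLM step.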
Automatic Context
In ORCFLO, you don't need to explicitly pass data between nodes. When nodes are connected, data flows automatically, and the AI in LLM Steps receives all upstream data as context.
Node Connections
Connections are the edges that link nodes together. They determine both execution order and data flow through your workflow.
Connection Types
| Type | Description |
|---|---|
| Standard | Normal flow between any two compatible nodes |
| Conditional (True/False) | Branches from Decision Point nodes |
Creating Connections
To create a connection between two nodes:
- Hover over the source node to reveal its output handle (right side)
- Click and drag from the handle
- Drop on the target node's input handle (left side)
Connection Rules
A node's output can connect to multiple targets (branching), and a node can receive input from multiple sources (merging).
Workflow Lifecycle
Workflows go through several stages from creation to execution. Understanding this lifecycle helps you build and maintain workflows effectively.
Draft
Workflow is being built. Changes are auto-saved.
Ready
Workflow is complete and can be executed.
Executing
Workflow is currently running. Real-time updates appear.
Completed
Execution finished successfully. Results available.
Failed
Execution encountered an error. Error details available.
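The five stages form a simple state machine. The state names below come from this guide; the exact transition map (for example, whether a Ready workflow can return to Draft) is an assumption for illustration.

```python
# Hypothetical lifecycle state machine. State names are from the guide;
# the allowed transitions are an assumption, not documented behavior.
TRANSITIONS = {
    "draft": {"ready"},
    "ready": {"executing", "draft"},
    "executing": {"completed", "failed"},
    "completed": set(),   # terminal: results available
    "failed": set(),      # terminal: error details available
}

def advance(state, next_state):
    """Move to next_state, rejecting transitions the map doesn't allow."""
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"cannot go from {state!r} to {next_state!r}")
    return next_state

state = "draft"
for step in ["ready", "executing", "completed"]:
    state = advance(state, step)
print(state)  # completed
```

Modeling the lifecycle this way makes the terminal states explicit: once a run is Completed or Failed, it never transitions again; you start a new execution instead.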
Execution Inspector
The execution inspector is your debugging tool. Click any node during or after execution to see detailed information about that step.
Inspector Panels
| Panel | Shows |
|---|---|
| Inputs | Data received by the node from upstream connections |
| Outputs | Data produced by the node, passed to downstream nodes |
| Metrics | Tokens used, execution time, credit cost |
| Files | Input and output files associated with the step |
| Errors | Error messages and stack traces if the step failed |
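As an illustration, a single step's inspector data can be thought of as a record whose keys mirror the five panels above. The record shape and values here are invented, not an ORCFLO API.

```python
# Hypothetical step record; the keys mirror the inspector panels above,
# but the exact shape and values are invented for illustration.
step_record = {
    "node": "summarize",
    "inputs": {"article_text": "(text received from the Input node)"},
    "outputs": {"summary": "(text passed downstream)"},
    "metrics": {"tokens": 512, "duration_ms": 1840, "credits": 2},
    "files": [],
    "errors": [],
}

def summarize_step(record):
    """One-line debug summary: did the step fail, and what did it cost?"""
    if record["errors"]:
        return f"{record['node']}: FAILED ({record['errors'][0]})"
    m = record["metrics"]
    return f"{record['node']}: ok, {m['tokens']} tokens in {m['duration_ms']} ms"

print(summarize_step(step_record))  # summarize: ok, 512 tokens in 1840 ms
```

When debugging, compare a node's Inputs panel against the upstream node's Outputs panel: if they don't match what you expect, the problem is in the connections, not the prompt.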
Key Takeaways
- Workflows have a consistent structure: Input, Processing, Output
- Choose Sequential mode for simplicity, DAG mode for parallel execution
- Data flows automatically between connected nodes
- Use the execution inspector to debug and understand data flow