AI Model
Last updated: Mar 2026
Run any AI model in your workflow
The AI Model is the core building block for adding AI to your workflows. Drag it onto the canvas, pick any model from any supported provider, and give it instructions for what you want done. It handles everything from simple text generation to complex multi-step reasoning.
You can mix and match models from different providers within a single workflow. For example, use a fast model for initial classification, then route complex items to a more capable model for deeper analysis. Each AI Model operates independently, so you have full flexibility over which model handles each part of your pipeline.
Supported Providers
Choose from five leading AI providers. Each offers a range of models optimized for different tasks, speeds, and budgets.
OpenAI
GPT-5.2, GPT-4o, o3, o4-mini
Anthropic
Claude Opus 4, Sonnet 4, Haiku 4.5
Google
Gemini 2.5 Pro, 2.5 Flash, 2.0 Flash
Mistral
Mistral Large 3, Magistral, Codestral
xAI
Grok models
Always Up to Date
New models are added regularly. The model selector always shows the latest available options, so you have access to the newest releases from each provider as soon as they ship.
Configuration
After adding an AI Model to your workflow, you can configure it with the following settings.
Model Selection
Click the model badge on the node to open the model picker. Browse models by provider, see their capabilities at a glance, and select the one that fits your task. You can change models at any time without losing your other configuration.
Task Instructions
Write clear, natural-language instructions describing what you want the AI to do. This is where you define the role, behavior, and task. The more specific your instructions, the better the results.
```
You are a customer support specialist. Analyze the incoming message and:
1. Identify the customer's core issue
2. Determine the urgency level (low, medium, high)
3. Draft a helpful, empathetic response
Keep the tone professional but friendly.
```
Tools
Optionally attach tools that the model can call during execution. Tools let the AI interact with external services like APIs, databases, or web searches to gather information or take actions as part of its response.
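The exact tool format is handled by the product, but most providers follow a JSON-schema function-calling convention: a name, a description, and a schema for the parameters the model may fill in. A hypothetical sketch of what a tool definition looks like in that style (field names vary by provider):

```python
# Hypothetical tool definition in the common JSON-schema style.
# The model reads the description and schema, then emits a call
# like {"name": "get_weather", "arguments": {"city": "Paris"}}.
weather_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}
```

Clear descriptions matter as much as the schema: the model decides whether and how to call a tool based on them.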
Temperature
Controls how creative or predictable the output is. Lower values produce consistent, focused answers. Higher values produce more varied and creative responses.
| Value | Behavior | Best For |
|---|---|---|
| 0 | Deterministic | Data extraction, classification, factual tasks |
| 0.5 | Balanced | General-purpose tasks, summaries |
| 1.0 | Creative | Brainstorming, creative writing, ideation |
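Under the hood, temperature rescales the probability distribution the model samples its next token from: dividing the raw scores by a small temperature sharpens the distribution toward the top choice, while a temperature near 1.0 leaves it spread out. A minimal Python sketch of that math (illustrative only, not the product's API):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by temperature, then apply softmax.
    Low temperature -> probability mass concentrates on the top token.
    High temperature -> probabilities flatten, output varies more."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]               # illustrative raw token scores
low = softmax_with_temperature(logits, 0.2)   # near-deterministic
high = softmax_with_temperature(logits, 1.0)  # more varied sampling
```

At temperature 0 most providers skip sampling entirely and pick the top token, which is why 0 behaves deterministically.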
Timeout
Set a maximum time the step is allowed to run. If the model does not respond within this window, the step will fail gracefully so your workflow can handle the error. This is useful for keeping workflows responsive, especially when using tool calls that depend on external services.
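Conceptually, a timeout wraps the model call and gives up once the deadline passes, leaving the workflow's error path to take over. A rough Python sketch of the pattern, using a hypothetical `call_model` placeholder in place of the real request:

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def call_model():
    # Placeholder for a real model call (hypothetical);
    # here it simply takes longer than the allowed window.
    time.sleep(2)
    return "response"

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(call_model)
    try:
        result = future.result(timeout=0.5)  # deadline in seconds
    except TimeoutError:
        result = None  # step fails gracefully; workflow handles it
```

The key property is that the failure is a normal, catchable outcome rather than a hung workflow.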
Use Cases
The AI Model is versatile enough for nearly any AI task. Common examples include:
- Complex reasoning — multi-step analysis, research synthesis, and problem solving
- Content creation — blog posts, marketing copy, emails, and social media content
- Code generation — writing, reviewing, and debugging code in any language
- Data extraction — pulling structured data from unstructured text, PDFs, or web pages
- Multi-turn agents — autonomous workflows where the AI plans and executes steps using tools
Best Practices
- Pick the right model for the job: Use fast, lightweight models (o4-mini, Haiku 4.5, Gemini 2.5 Flash) for simple tasks and high-volume steps. Reserve larger models (GPT-5.2, Claude Opus 4, Gemini 2.5 Pro) for complex reasoning where quality matters most.
- Set temperature appropriately: Use low temperature (0 to 0.3) for factual, structured tasks. Use higher temperature (0.7 to 1.0) only when you want creative or varied output.
- Use tools for external services: Instead of trying to encode data in your prompt, attach tools that let the model fetch live data, call APIs, or perform actions on your behalf.
- Wire inputs from previous steps: Use variable references to pass data from earlier steps into your prompt. This keeps your workflows dynamic and avoids hardcoding values.
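As a rough illustration of that last point, wiring a previous step's output into a prompt amounts to filling a template rather than hardcoding values (the step output and field names here are hypothetical):

```python
# Hypothetical output from an earlier workflow step.
previous_step = {"customer_name": "Ada", "message": "My invoice total is wrong."}

# Prompt template with variable references instead of hardcoded text.
prompt_template = (
    "You are a support specialist. Customer {customer_name} wrote:\n"
    "{message}\n"
    "Draft a helpful, empathetic reply."
)

prompt = prompt_template.format(**previous_step)
```

Because the prompt is built at run time, the same workflow handles every incoming customer without edits.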
Mix and Match
You do not have to commit to one provider. Use a fast model for triage, a capable model for deep analysis, and a different model for final formatting — all in the same workflow.
Key Takeaways
- The AI Model works with models from OpenAI, Anthropic, Google, Mistral, and xAI
- Pick your model based on the task: fast for simple work, capable for complex reasoning
- Write clear, specific instructions and set temperature to match the task
- Attach tools to let the model interact with external services
- Mix different providers and models within a single workflow