AI Integration Nodes

Nodes for connecting workflows to AI language models.

Module Required

AI nodes are provided by the FlowDrop AI Provider module — a separate package that bridges FlowDrop with Drupal's AI module. Install it before using these nodes:

composer require drupal/ai drupal/flowdrop_ai_provider
drush en ai flowdrop_ai_provider

Then configure an AI provider (OpenAI, Mistral, etc.) at Administration > Configuration > AI.

See Installation — AI Integration for full setup steps.


Chat Model

Sends a prompt to an AI language model and returns the response. Supports any provider configured via the Drupal AI module.

Category: AI Integration

Configuration

| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| provider | String | No | (site default) | AI provider to use (e.g., openai, mistral). Leave blank to use the site-wide default. |
| model | String | No | (provider default) | Model name (e.g., gpt-4o, mistral-large-latest). Leave blank to use the provider's default. |
| system_prompt | String | No | | System-level instructions sent before the user prompt. Defines the assistant's role or behavior. |
| temperature | Number | No | 0.7 | Sampling temperature (0.0–2.0). Lower values produce more deterministic output; higher values increase creativity. |
| max_tokens | Number | No | (provider default) | Maximum tokens in the response. Leave blank for the provider default. |
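To illustrate how these parameters fit together, a Chat Model node in a workflow export might look like the sketch below. The key names (nodes, type, config) are illustrative assumptions, not documented FlowDrop schema fields:

```yaml
# Hypothetical workflow export fragment; key names are illustrative,
# not taken from the FlowDrop schema.
nodes:
  summarize:
    type: chat_model
    config:
      provider: openai          # blank = site-wide default
      model: gpt-4o             # blank = provider default
      system_prompt: "You are a concise technical summarizer."
      temperature: 0.2          # low for reproducible summaries
      max_tokens: 300
```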

Input Ports

| Port | Type | Description |
|---|---|---|
| prompt | String | The user prompt to send to the model |
| context | String | Optional additional context to prepend to the prompt |

Output Ports

| Port | Type | Description |
|---|---|---|
| response | String | The model's text response |
| model | String | The model that was used |
| provider | String | The provider that was used |
| tokens_used | Number | Total tokens consumed (prompt + completion) |
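The tokens_used output makes simple cost accounting possible downstream. As a sketch (the per-token rates below are placeholders, not real pricing; check your provider), a Logger-style step could estimate spend per run like this:

```python
# Estimate API spend from the Chat Model node's tokens_used output.
# Rates are illustrative placeholders, not real provider pricing.
RATES_PER_1K = {
    "openai": 0.005,
    "mistral": 0.002,
}

def estimate_cost(provider: str, tokens_used: int) -> float:
    """Return an estimated cost in USD for one workflow run."""
    rate = RATES_PER_1K.get(provider, 0.0)
    return round(tokens_used / 1000 * rate, 6)

print(estimate_cost("openai", 1500))  # → 0.0075
```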

Example: Summarize an Article

  1. Add an Entity Context node and load a node entity.
  2. Add a Data Extractor node and extract the body.value field.
  3. Add a Prompt Template node with the template:
    Summarize the following article in 3 sentences:
    
    {{ text }}
    
  4. Connect Data Extractor → Prompt Template → Chat Model (into the prompt port).
  5. Connect Chat Model response output to a Logger or Entity Save node.
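The Prompt Template step above performs a simple substitution: the extracted body text replaces the {{ text }} placeholder before the result reaches the Chat Model's prompt port. A rough Python equivalent (not FlowDrop's actual implementation) of that substitution:

```python
# Rough stand-in for the Prompt Template node: substitute workflow
# data into a {{ placeholder }} template. Not FlowDrop's actual code.
TEMPLATE = "Summarize the following article in 3 sentences:\n\n{{ text }}"

def render(template: str, **values: str) -> str:
    """Replace each {{ key }} placeholder with its value."""
    for key, value in values.items():
        template = template.replace("{{ " + key + " }}", value)
    return template

prompt = render(TEMPLATE, text="Article body extracted from body.value ...")
print(prompt.splitlines()[0])  # → Summarize the following article in 3 sentences:
```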

Tips

  • Use a Prompt Template node before Chat Model to build dynamic prompts from workflow data.
  • For chat-style interactive workflows, use Chat Model with the Playground (flowdrop_playground) — the Chat Input and Chat Output nodes are designed for this pattern.
  • Set temperature: 0 for deterministic, reproducible outputs (e.g., classification, extraction tasks).
  • Use the system_prompt parameter to constrain the model's behavior (e.g., "Respond only in JSON", "You are a content moderation assistant").
  • Monitor token usage via the tokens_used output port if you need to track API costs across workflow runs.

Supported Providers

Any provider installed via the Drupal AI module is supported. Common options:

| Provider | Composer package |
|---|---|
| OpenAI (GPT-4o, GPT-4, etc.) | drupal/ai_provider_openai |
| Mistral | drupal/ai_provider_mistral |
| Anthropic Claude | drupal/ai_provider_anthropic |
| Ollama (local models) | drupal/ai_provider_ollama |

Configure providers at Administration > Configuration > AI > Providers.