

AI inference inside gaiia workflows

Product Update

March 31, 2026

AI tools like Claude, ChatGPT and Gemini are already part of many teams’ daily work. Support agents summarize tickets with them. Operations teams analyze logs. Marketing teams draft responses. The challenge is that these insights often stay outside the systems where work actually happens.

With the new AI Inference node in gaiia’s Workflow Builder, CSPs can now bring large language models directly into their operational workflows. Instead of copying information into external AI tools, workflows can now send data to a model, interpret the response, and take action automatically. No custom API integrations required.

How it works: AI becomes another building block in your workflows

The AI Inference node works just like any other node in the Workflow Builder. It can receive data from existing gaiia nodes and pass results to the next step in the workflow.

This means AI can now sit inside the same automations that already power provisioning, ticketing, notifications, and customer lifecycle events.

For example, a workflow could:

  1. Trigger when a new customer interaction is logged
  2. Pull the customer’s account notes
  3. Send those notes to an AI model for summarization
  4. Automatically create a ticket with recommended next actions

All of this happens in a single automated flow. Instead of adding more manual review steps, workflows become capable of interpreting information and making smarter decisions.
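The four steps above can be sketched in Python. This is an illustrative sketch only, not gaiia's API: the function names (`summarize_with_model`, `on_customer_interaction`) are hypothetical, and the model call is stubbed out where the AI Inference node would send the notes to the configured provider.

```python
def summarize_with_model(notes: str) -> str:
    """Placeholder for the AI Inference step (step 3).

    A real workflow would send the notes to the configured provider
    (e.g. GPT-4o mini or Gemini Flash) and return its summary; here we
    just echo the first line so the sketch runs without an API key.
    """
    first_line = notes.strip().splitlines()[0]
    return f"Summary: {first_line}"


def on_customer_interaction(account_notes: str) -> dict:
    """Runs when a new customer interaction is logged (step 1).

    The account notes (step 2) are summarized (step 3), then a ticket
    with a recommended next action is created (step 4).
    """
    summary = summarize_with_model(account_notes)
    ticket = {
        "title": summary,
        "body": account_notes,
        "recommended_action": "Review summary and follow up with customer",
    }
    return ticket


ticket = on_customer_interaction(
    "Customer reports slow speeds at night.\nSecond call this month."
)
print(ticket["title"])
```

In a real gaiia workflow, each function above corresponds to a node, and the Workflow Builder wires the data between them.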

Bring your own AI models

The AI Inference node currently supports:

  • OpenAI models such as GPT-4o and GPT-4o mini
  • Google Gemini models such as Gemini Flash
  • Anthropic Claude models such as Opus and Sonnet

Operators simply connect their API key in the integrations module and select the model inside the workflow. This approach gives teams full control over:

  1. Provider and model selection: Operators can choose the AI provider and model that best fits their needs, whether that means prioritizing cost, speed, or output quality. Teams can easily test different models across workflows without changing infrastructure.
  2. Usage costs: AI requests are billed directly by the provider, giving operators clear visibility into usage and the flexibility to control costs by selecting smaller or faster models where appropriate.
  3. Data policies: Each provider has its own security and privacy controls. By connecting directly to their chosen provider, CSPs maintain control over where data is processed and which policies apply. For enterprise environments, this includes options like Claude via Amazon Bedrock or OpenAI models via Azure AI Foundry, giving teams deeper visibility, control, and compliance alignment.

If a team already uses Claude, ChatGPT or Gemini internally, they can now automate those same capabilities inside gaiia.

Example: AI-powered ticket routing

Many ISPs still route tickets manually or through simple keyword rules. This works for basic scenarios but breaks down when customer descriptions vary. With the AI node, workflows can interpret the intent of a support request and route it automatically.

Example Workflow:

  1. Trigger when a new support ticket is created
  2. Send the ticket description and customer context to an AI model
  3. Classify the issue (billing, Wi-Fi, fiber cut, installation, outage)
  4. Route the ticket to the correct team, tag it, and set the priority level

Instead of relying on rigid rules, AI can understand the meaning behind customer messages and route issues more accurately. This reduces internal handoffs and speeds up resolution times.
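A minimal sketch of this routing flow, with invented names: the category-to-team mapping is made up for illustration, and the keyword-based `classify_issue` is a deterministic stand-in for the AI model call, which would interpret intent rather than match words.

```python
# Hypothetical category -> (team, priority) routing table.
ROUTING = {
    "billing": ("billing-team", "medium"),
    "wifi": ("home-network-team", "medium"),
    "fiber cut": ("field-ops", "urgent"),
    "installation": ("install-team", "low"),
    "outage": ("noc", "urgent"),
    "other": ("support-triage", "medium"),
}


def classify_issue(description: str) -> str:
    """Placeholder for the AI Inference step (steps 2-3).

    A real workflow would prompt the model to pick one category from
    the list; this keyword scan just keeps the sketch self-contained.
    """
    text = description.lower().replace(" ", "")
    for category in ("fiber cut", "outage", "billing", "installation", "wifi"):
        if category.replace(" ", "") in text:
            return category
    return "other"


def route_ticket(description: str) -> dict:
    """Classify the ticket and route it (step 4)."""
    category = classify_issue(description)
    team, priority = ROUTING[category]
    return {"category": category, "team": team, "priority": priority}


print(route_ticket("There is a fiber cut near Main St, whole block is down"))
```

The point of the AI node is precisely that the real classifier is a language model, so "my bill looks wrong again" lands in billing even though it never says the word "billing".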

Example: Churn risk detection

Customer churn is often visible in interaction history long before a cancellation happens. Repeated support issues, billing disputes, or negative sentiment in conversations can all signal risk. With AI inside workflows, operators can automatically analyze customer interactions and flag accounts that may need attention.

A workflow could:

  1. Trigger after a customer support interaction
  2. Analyze notes or messages using an AI model
  3. Detect signals of frustration, repeated issues, or cancellation intent
  4. Tag the account as at-risk and notify the retention team

Instead of discovering churn only after a cancellation request, operators can proactively intervene when early signals appear.
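The churn-risk steps above can be sketched the same way. The signal phrases and the single-signal threshold are invented for illustration; in practice the AI model judges sentiment and cancellation intent from the full interaction, not keyword matches.

```python
# Hypothetical phrases a model might be asked to watch for.
CHURN_SIGNALS = ("cancel", "frustrated", "switching provider", "third time")


def detect_churn_signals(interaction: str) -> list:
    """Placeholder for the AI Inference step (steps 2-3).

    A real workflow would ask the model to flag frustration, repeated
    issues, or cancellation intent; this keyword scan keeps the sketch
    deterministic and runnable offline.
    """
    text = interaction.lower()
    return [signal for signal in CHURN_SIGNALS if signal in text]


def after_support_interaction(account_id: str, interaction: str) -> dict:
    """Runs after a support interaction (step 1); tags and notifies (step 4)."""
    signals = detect_churn_signals(interaction)
    at_risk = len(signals) >= 1
    return {
        "account_id": account_id,
        "tags": ["at-risk"] if at_risk else [],
        "notify": ["retention-team"] if at_risk else [],
        "signals": signals,
    }


result = after_support_interaction(
    "A-42", "This is the third time I've called. I'm considering switching provider."
)
print(result["tags"], result["signals"])
```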

From a system of record to a system of action

The real power of this feature comes from how it integrates with gaiia’s workflow engine.

Traditional OSS/BSS platforms tend to act as systems of record. They store data but rarely enable teams to build new operational capabilities on top of it. gaiia is designed differently.

By combining automation, integrations, and now AI inside the workflow builder, gaiia becomes a system of action. Operators are not just managing records. They are building automated processes that improve how the business runs.


