How to use GATE/0 with CrewAI

Introduction

CrewAI is an open-source framework for orchestrating multi-agent systems, enabling collaborative AI workflows with specialized agents and structured task flows.

GATE/0 integrates seamlessly with CrewAI, allowing you to route all LLM requests through GATE/0’s API layer — unlocking cost tracking, custom labeling, and enhanced observability — with no changes to CrewAI’s core logic.

Using GATE/0 with CrewAI

Set up your OpenAI provider in GATE/0

  1. Retrieve your OpenAI API key from OpenAI
  2. Add a new OpenAI provider in GATE/0
  3. Copy your GATE/0 API key
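The agent code below reads the GATE/0 API key from the GATE0_API_KEY environment variable. As a minimal sketch (the variable name is simply the one used in the example code; set it however your environment prefers), you can verify the key is available before running anything:

import os

# GATE0_API_KEY should hold the GATE/0 API key copied in step 3 above
if not os.environ.get("GATE0_API_KEY"):
    raise RuntimeError("GATE0_API_KEY is not set; export it before running the crew.")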

Set up CrewAI

  1. Follow the official installation guide
  2. Build your first agent using the quickstart tutorial

Implement your agent

from crewai import Agent, Crew, Process, Task, LLM
from crewai.project import CrewBase, agent, crew, task
from crewai.agents.agent_builder.base_agent import BaseAgent
from typing import List
import os

llm = LLM(
    model="openai/openai/gpt-4o",  # provider-prefixed model name (see explanation below)
    base_url="https://gateway.gate0.io/v1",  # GATE/0 cloud proxy URL
    api_key=os.environ.get("GATE0_API_KEY"),  # API key provided by GATE/0
    extra_headers={  # custom labels for cost tracking and analytics
        'x-gate0-label-project': 'alpha',
        'x-gate0-label-client': 'frontend',
        'x-gate0-label-env': 'prod'
    }
)


@CrewBase
class CrewaiExamples():
    """CrewaiExamples crew"""

    # @CrewBase wires up self.agents_config and self.tasks_config (loaded by
    # default from config/agents.yaml and config/tasks.yaml) and collects the
    # @agent and @task methods below into self.agents and self.tasks.
    agents: List[BaseAgent]
    tasks: List[Task]

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            verbose=True,
            llm=llm  # route this agent's requests through GATE/0
        )

    @agent
    def reporting_analyst(self) -> Agent:
        return Agent(
            config=self.agents_config['reporting_analyst'],
            verbose=True,
            llm=llm
        )

    @task
    def research_task(self) -> Task:
        return Task(
            config=self.tasks_config['research_task'],
        )

    @task
    def reporting_task(self) -> Task:
        return Task(
            config=self.tasks_config['reporting_task'],
            output_file='report.md'
        )

    @crew
    def crew(self) -> Crew:
        """Creates the CrewaiExamples crew"""

        return Crew(
            agents=self.agents,
            tasks=self.tasks, 
            process=Process.sequential,
            verbose=True,
        )
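With the crew defined, running it works exactly as in a stock CrewAI project; every LLM call the agents make is routed through GATE/0. A minimal usage sketch follows (the topic input is an assumption and must match whatever placeholders your config/agents.yaml and config/tasks.yaml templates expect):

if __name__ == "__main__":
    # kickoff() runs the research and reporting tasks sequentially,
    # with all model requests going through the GATE/0 proxy
    result = CrewaiExamples().crew().kickoff(inputs={"topic": "AI LLMs"})
    print(result)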

Explanation of the implementation

Each CrewAI agent is configured with a custom LLM instance that:

  1. Uses https://gateway.gate0.io/v1 as the base URL
  2. Is authorized using your GATE/0 API key
  3. Uses a provider-prefixed model name (openai/openai/gpt-4o instead of gpt-4o; see the note below)
  4. Includes extra_headers to attach custom labels for detailed cost tracking and analytics

Provider-prefixed model names

When defining a custom LLM in CrewAI, the model name must carry a provider prefix (e.g., openai/gpt-4o instead of gpt-4o). GATE/0 additionally requires the name to be prefixed with the provider slug configured in GATE/0 (e.g., openai for OpenAI, azure for Azure OpenAI). In practice this means using openai/openai/gpt-4o to access gpt-4o from OpenAI and azure/azure/gpt-4o to access gpt-4o from Azure OpenAI.
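For illustration, here is a hedged sketch of the double prefix for two providers; the azure slug is only an example, and the actual slugs are whatever you named your providers in GATE/0:

from crewai import LLM
import os

# gpt-4o served through a GATE/0 provider with slug "openai"
openai_llm = LLM(
    model="openai/openai/gpt-4o",
    base_url="https://gateway.gate0.io/v1",
    api_key=os.environ.get("GATE0_API_KEY"),
)

# gpt-4o served through a GATE/0 provider with slug "azure" (Azure OpenAI)
azure_llm = LLM(
    model="azure/azure/gpt-4o",
    base_url="https://gateway.gate0.io/v1",
    api_key=os.environ.get("GATE0_API_KEY"),
)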

This configuration allows you to plug GATE/0 directly into CrewAI without modifying how agents or tasks are defined.