Documentation

LangChain

How to use GATE/0 with LangChain

LangChain is a powerful framework for building applications with large language models — supporting chains, tools, memory, agents, and more.

GATE/0 integrates seamlessly with LangChain by leveraging its support for custom API endpoints and authentication keys. This allows you to route all LLM calls through GATE/0 with minimal setup — unlocking detailed usage analytics, labeling, and cost tracking — without altering how your chains are structured.

Using GATE/0 with LangChain

Set up your OpenAI provider in GATE/0

  1. Retrieve your OpenAI API key from the OpenAI dashboard
  2. Add a new OpenAI provider in GATE/0
  3. Copy your GATE/0 API key from the GATE/0 dashboard

Install LangChain and its OpenAI integration

pip install langchain langchain-openai

Configure the GATE/0 API key

Set your GATE/0 API key as an environment variable:

export GATE0_API_KEY=your-gate0-api-key
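If the client later fails to authenticate, a quick way to confirm the variable actually reached your Python process (the variable name matches the export above):

```python
import os

# Check that the GATE/0 key from the export above is visible to this
# process; False usually means the export was run in a different
# shell session than the one running your script.
key = os.environ.get("GATE0_API_KEY")
print("GATE0_API_KEY is set:", bool(key))
```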

Implement your agent

from langchain_openai import ChatOpenAI
import os

chat = ChatOpenAI(
    openai_api_base="https://gateway.gate0.io/v1",  # GATE/0 cloud proxy URL
    openai_api_key=os.environ.get("GATE0_API_KEY"),  # API key provided by GATE/0
    model="openai/gpt-4o",  # Model name with provider prefix
    default_headers={  # Custom labels for usage tracking
        "x-gate0-label-project": "alpha-1",
        "x-gate0-label-client": "frontend",
        "x-gate0-label-env": "prod"
    }
)

response = chat.invoke("Hello, how are you?")
print(response.content)

Explanation of the implementation

  1. ChatOpenAI (or other LangChain LLM wrappers) is configured with:
    • openai_api_base set to https://gateway.gate0.io/v1
    • A model name prefixed with the provider slug (e.g., openai/gpt-4o instead of gpt-4o)
    • Your GATE/0 API key, read here from the GATE0_API_KEY environment variable
  2. Optional default_headers can be used to include custom labels for resource tracking and analytics.
  3. All standard LangChain functionality (chains, tools, memory, etc.) works as-is — now powered by GATE/0 under the hood.