# LangChain
LangChain gives you tools for every step of the agent development lifecycle. This guide demonstrates how to integrate Vercel AI Gateway with LangChain to access various AI models and providers.
First, create a new directory for your project and initialize it:
```bash
mkdir langchain-ai-gateway
cd langchain-ai-gateway
pnpm init
```

Install the required LangChain packages along with the `dotenv` and `@types/node` packages:

```bash
pnpm i langchain @langchain/core @langchain/openai dotenv @types/node
```

Create a
`.env` file with your Vercel AI Gateway API key:

```bash
AI_GATEWAY_API_KEY=your-api-key-here
```

If you're using the AI Gateway from within a Vercel deployment, you can also use the `VERCEL_OIDC_TOKEN` environment variable, which is provided automatically. Create a new file called
`index.ts` with the following code:

```typescript
import 'dotenv/config';
import { ChatOpenAI } from '@langchain/openai';
import { HumanMessage } from '@langchain/core/messages';

async function main() {
  console.log('=== LangChain Chat Completion with AI Gateway ===');

  const apiKey = process.env.AI_GATEWAY_API_KEY || process.env.VERCEL_OIDC_TOKEN;

  const chat = new ChatOpenAI({
    apiKey: apiKey,
    modelName: 'openai/gpt-5',
    temperature: 0.7,
    configuration: {
      baseURL: 'https://ai-gateway.vercel.sh/v1',
    },
  });

  try {
    const response = await chat.invoke([
      new HumanMessage('Write a one-sentence bedtime story about a unicorn.'),
    ]);
    console.log('Response:', response.content);
  } catch (error) {
    console.error('Error:', error);
  }
}

main().catch(console.error);
```

This code:

- Initializes a `ChatOpenAI` instance configured to use the AI Gateway
- Sets the model `temperature` to `0.7`
- Makes a chat completion request
- Handles any potential errors
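The API-key fallback used in `index.ts` can be factored into a small helper, which also makes it easy to test in isolation. This is only a sketch; the `resolveGatewayKey` name is hypothetical and not part of LangChain or the AI Gateway:

```typescript
// Hypothetical helper mirroring the fallback in index.ts: prefer an
// explicit AI_GATEWAY_API_KEY, then fall back to the VERCEL_OIDC_TOKEN
// that Vercel deployments provide automatically.
function resolveGatewayKey(
  env: Record<string, string | undefined>,
): string | undefined {
  return env.AI_GATEWAY_API_KEY || env.VERCEL_OIDC_TOKEN;
}

// In index.ts you would call it as: resolveGatewayKey(process.env)
console.log(resolveGatewayKey({ AI_GATEWAY_API_KEY: 'key-a' })); // key-a
console.log(resolveGatewayKey({ VERCEL_OIDC_TOKEN: 'token-b' })); // token-b
```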
Run your application using `tsx`, a TypeScript runner for Node.js:
```bash
pnpm dlx tsx index.ts
```

You should see a response from the AI model in your console.
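Note that `response.content` in LangChain is typed as `MessageContent`, which can be either a plain string or an array of content parts. If you always want plain text, a small normalizer helps; this is a sketch and `contentToText` is a hypothetical name:

```typescript
// response.content can be a string or an array of content parts; this
// hypothetical helper concatenates just the text parts into one string.
type ContentPart = { type: string; text?: string };

function contentToText(content: string | ContentPart[]): string {
  if (typeof content === 'string') return content;
  return content
    .filter((part) => part.type === 'text' && typeof part.text === 'string')
    .map((part) => part.text)
    .join('');
}

console.log(contentToText('hello')); // hello
console.log(
  contentToText([
    { type: 'text', text: 'hi ' },
    { type: 'text', text: 'there' },
  ]),
); // hi there
```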