LangFuse
LangFuse is an LLM engineering platform that helps teams collaboratively develop, monitor, evaluate, and debug AI applications. This guide demonstrates how to integrate Vercel AI Gateway with LangFuse to access various AI models and providers.
First, create a new directory for your project and initialize it:
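For example (the directory name here is illustrative):

```bash
mkdir langfuse-gateway-example
cd langfuse-gateway-example
npm init -y
```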
Install the required LangFuse packages along with the `openai` and `dotenv` packages:
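For example, with npm, assuming the OpenAI SDK and `dotenv` are the companion packages used in this guide:

```bash
npm install langfuse openai dotenv
```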
Create a `.env` file with your Vercel AI Gateway API key and LangFuse API keys:
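A minimal example, assuming the default variable names read by the AI Gateway and the LangFuse SDK (replace the placeholder values with your own keys and region URL):

```bash
AI_GATEWAY_API_KEY=your-vercel-ai-gateway-api-key
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_BASEURL=https://cloud.langfuse.com
```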
If you're using the AI Gateway from within a Vercel deployment, you can also use the `VERCEL_OIDC_TOKEN` environment variable, which is automatically provided.
Create a new file, for example `index.ts`, with the following code:
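A sketch of what the file might contain, assuming the Gateway's OpenAI-compatible base URL (`https://ai-gateway.vercel.sh/v1`) and an illustrative model id; adjust both, and the environment variable names, to match your setup:

```typescript
import "dotenv/config";
import OpenAI from "openai";
import { observeOpenAI } from "langfuse";

// Create an OpenAI client pointed at the Vercel AI Gateway and wrap it with
// LangFuse's observeOpenAI so requests are traced automatically.
const openai = observeOpenAI(
  new OpenAI({
    apiKey: process.env.AI_GATEWAY_API_KEY,
    baseURL: "https://ai-gateway.vercel.sh/v1",
  }),
);

async function main() {
  // Make a chat completion request through the AI Gateway. The wrapper records
  // the request/response payloads, token usage, and latency as a LangFuse trace.
  const completion = await openai.chat.completions.create({
    model: "openai/gpt-4o-mini", // example Gateway model id (provider/model)
    messages: [{ role: "user", content: "Write a haiku about observability." }],
  });

  console.log(completion.choices[0].message.content);

  // Flush buffered trace events before the process exits (important for short-lived scripts).
  await openai.flushAsync();
}

main().catch(console.error);
```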
This code:
- Creates an OpenAI client configured to use the Vercel AI Gateway
- Uses LangFuse's `observeOpenAI` to wrap the client for automatic tracing and logging
- Makes a chat completion request through the AI Gateway
- Automatically captures request/response data, token usage, and metrics
Run your application using Node.js:
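For example, if the entry file is `index.ts`, it can be run with `tsx` (an assumption; any TypeScript-capable Node.js runner works):

```bash
npx tsx index.ts
```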
You should see a response from the AI model in your console.