Getting Started
Building AI apps on Vercel: an overview
Learn the key AI concepts and tools for building and scaling AI apps.
Streaming responses from LLMs
Learn how to use the AI SDK to stream LLM responses.
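As a rough illustration of the pattern that guide walks through, here is a minimal sketch of streaming a response with the AI SDK's `streamText` helper, assuming a recent AI SDK version; the model id and prompt are placeholders:

```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Minimal sketch: stream tokens from a model as they are generated.
// The model id and prompt are illustrative placeholders.
const result = streamText({
  model: openai('gpt-4o-mini'),
  prompt: 'Explain what streaming means for LLM responses.',
});

// Consume the stream incrementally instead of waiting for the full response.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```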
Course: Builder’s Guide to the AI SDK
Build production-ready AI features with the AI SDK and Next.js. Learn LLMs, prompting, extraction, streaming, and more.
Agents
How to build AI Agents with Vercel and the AI SDK
Learn how to build, deploy, and scale AI agents on Vercel using the AI SDK. This guide covers calling LLMs, defining tools, creating multi-step agents, and monitoring performance.
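For orientation, the tool-calling step mentioned above could look roughly like this minimal sketch using the AI SDK's `generateText` with a v4-style `tool` definition; the weather tool, its stubbed result, and the model id are illustrative placeholders, not part of the guide itself:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

// Minimal sketch: let the model call a tool, then continue for a few steps.
const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  tools: {
    getWeather: tool({
      description: 'Get the current weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, temperatureC: 21 }), // stubbed result
    }),
  },
  maxSteps: 3, // allow the model to call the tool and then produce an answer
  prompt: 'What is the weather in Berlin right now?',
});

console.log(text);
```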
Building stateful Slack bots with Vercel Workflow
Learn how to build Slack bots that maintain state and handle long-running processes without managing queues, databases, or background job infrastructure.
Safely running AI-generated code in your application
How to execute untrusted, AI-generated code inside Vercel Sandbox, an isolated, ephemeral environment.
Building durable AI Agents
Learn how to convert an AI chat app into a durable AI agent using Workflow DevKit.
Human-in-the-Loop
Learn how Workflow DevKit's webhook and hook primitives enable "human-in-the-loop" patterns, where workflows pause until a human takes action.
Slack Agents on Vercel with the AI SDK
Step-by-step course to build, deploy, and run a real Slack bot on Vercel using the AI SDK, with logging, safeguards, and an ops runbook for your team’s workspace.
All AI Guides
- Streaming responses from LLMs
- Building AI apps on Vercel: an overview
- Build an MCP Server with Weather tools using Express and Vercel
- What is Retrieval Augmented Generation (RAG)?
- Building stateful Slack bots with Vercel Workflow
- Build an AI Chat Agent with Weather API Tool Calling
- Understanding vector databases for AI apps
- How to build AI Agents with Vercel and the AI SDK
- An Introduction to Evals
- Build a ChatGPT Connector (MCP server)