AI Agent Quickstart

This guide takes you through building your first AI agent with Restate and popular AI SDKs. We will run a simple weather agent that can answer questions about the weather, using durable execution to ensure reliability.

Select your AI SDK:
  • TypeScript + Vercel AI (this guide)
  • Python + OpenAI
Prerequisites:
1. Install Restate Server & CLI

Restate is a single self-contained binary. No external dependencies needed.
Install via Homebrew (downloadable binaries, npm, and Docker are also available):
brew install restatedev/tap/restate-server restatedev/tap/restate
Start the server:
restate-server
Once the server is running, the Restate UI is available on port 9070 (http://localhost:9070).
2. Get the AI Agent template

Get the weather agent template for the Vercel AI SDK and Restate:
git clone https://github.com/restatedev/ai-examples.git &&
cd ai-examples/vercel-ai/template &&
npm install
3. Run the AI Agent service

Export your OpenAI key and run the agent:
export OPENAI_API_KEY=your_openai_api_key_here
npm run dev
The weather agent is now listening on port 9080.
4. Register the service

Tell Restate where the service is running (http://localhost:9080), so Restate can discover and register the services and handlers behind this endpoint. You can do this via the UI (http://localhost:9070) or via:
restate deployments register http://localhost:9080
If you run Restate with Docker, register http://host.docker.internal:9080 instead of http://localhost:9080.
When using Restate Cloud, your service must be accessible over the public internet so Restate can invoke it. If you want to develop with a local service, you can expose it using our tunnel feature.
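The CLI command above talks to the Restate Server's admin interface on port 9070. As a rough sketch of what registration looks like over HTTP (the endpoint path and payload shape here are assumptions; the CLI or UI is the supported path), a registration request could be built like this:

```typescript
// Hedged sketch: build a deployment-registration request for Restate's
// admin interface (port 9070). Endpoint and payload shape are assumptions.
function registrationRequest(adminUrl: string, serviceUri: string) {
  return {
    url: `${adminUrl}/deployments`,
    init: {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ uri: serviceUri }),
    },
  };
}

// Usage (requires a running Restate Server):
// const { url, init } = registrationRequest("http://localhost:9070", "http://localhost:9080");
// await fetch(url, init);
```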
5. Send weather requests to the AI Agent

Invoke the agent via the Restate UI playground: go to http://localhost:9070, click your service, and open the playground.

[Screenshot: Restate UI playground]
Or invoke via curl:
curl localhost:8080/agent/run --json '"What is the weather in Detroit?"'
Output: The weather in Detroit is currently 17°C with misty conditions.
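The same invocation can be made from code. A minimal sketch, assuming the Restate Server and the agent service from this guide are running on their default ports:

```typescript
// Build the ingress URL for a Restate service handler: <base>/<service>/<handler>
function ingressUrl(base: string, service: string, handler: string): string {
  return `${base}/${service}/${handler}`;
}

// Invoke the agent through Restate's ingress (port 8080).
// The handler takes a single JSON argument, so the prompt is sent as a JSON string.
async function askAgent(prompt: string): Promise<string> {
  const res = await fetch(ingressUrl("http://localhost:8080", "agent", "run"), {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(prompt),
  });
  return res.json();
}
```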
6. Congratulations, you just ran a Durable AI Agent!

The agent you just invoked uses Durable Execution to make agents resilient to failures. Restate persisted all LLM calls and tool execution steps, so if anything fails, the agent can resume exactly where it left off. We did this by using Restate's durableCalls middleware to persist LLM responses, and Restate Context actions (e.g. ctx.run) to make the tool executions resilient:
import * as restate from "@restatedev/restate-sdk";
import { openai } from "@ai-sdk/openai";
import { generateText, stepCountIs, tool, wrapLanguageModel } from "ai";
import { z } from "zod";
import { fetchWeather } from "./utils";
// plus durableCalls from Restate's Vercel AI middleware
// (see the template's imports for the exact package)

async function weatherAgent(ctx: restate.Context, prompt: string) {
  // The durableCalls middleware persists each LLM response in Restate,
  // so it can be restored on retries without re-calling the LLM
  const model = wrapLanguageModel({
    model: openai("gpt-4o"),
    middleware: durableCalls(ctx, { maxRetryAttempts: 3 }),
  });

  const { text } = await generateText({
    model,
    system: "You are a helpful agent that provides weather updates.",
    prompt,
    tools: {
      getWeather: tool({
        description: "Get the current weather for a given city.",
        inputSchema: z.object({ city: z.string() }),
        execute: async ({ city }) => {
          // Wrap the tool call as a Restate durable step
          return await ctx.run("get weather", () => fetchWeather(city));
        },
      }),
    },
    stopWhen: [stepCountIs(5)],
    providerOptions: { openai: { parallelToolCalls: false } },
  });

  return text;
}

// Create a Restate Service as the callable entry point
// for our durable agent function
const agent = restate.service({
  name: "agent",
  handlers: {
    run: async (ctx: restate.Context, prompt: string) => weatherAgent(ctx, prompt),
  },
});

// Serve the entry point via an HTTP/2 server
restate.serve({
  services: [agent],
});
The Invocations tab of the Restate UI shows us how Restate captured each LLM call and tool step in a journal:
[Screenshot: Restate UI journal entries]
Ask about the weather in Denver:
curl localhost:8080/agent/run --json '"What is the weather in Denver?"'
The service logs and the Restate UI show how each LLM call and tool step is durably executed. Notice that the weather tool is currently stuck, because the weather API is down.
[Screenshot: Restate UI durable execution]
This failure was simulated. To fix it, remove the failOnDenver(city) line from the fetchWeather function in the utils.ts file:
export async function fetchWeather(city: string) {
  failOnDenver(city);
  const output = await fetchWeatherFromAPI(city);
  return parseWeatherResponse(output);
}
Once you restart the service, the agent resumes at the weather tool call and successfully completes the request.
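To build intuition for why the agent can resume mid-run, here is a toy model of journal-based replay (an illustration only, not Restate's actual implementation): each named step records its result in a journal, and on a retry, already-completed steps return their recorded result instead of re-executing.

```typescript
// Toy journal: maps step names to recorded results.
type Journal = Map<string, unknown>;

// Run a step "durably": replay from the journal if already completed,
// otherwise execute it and record the result.
async function durableStep<T>(
  journal: Journal,
  name: string,
  fn: () => Promise<T>,
): Promise<T> {
  if (journal.has(name)) return journal.get(name) as T;
  const result = await fn();
  journal.set(name, result);
  return result;
}

// Simulate the Denver scenario: the LLM call succeeds, the weather tool
// fails, and on retry only the failed step re-executes.
async function demo() {
  const journal: Journal = new Map();
  let llmCalls = 0;
  let weatherWorks = false; // flips to true, like removing failOnDenver

  const run = async () => {
    await durableStep(journal, "llm call", async () => {
      llmCalls++;
      return "call getWeather(Denver)";
    });
    return durableStep(journal, "get weather", async () => {
      if (!weatherWorks) throw new Error("weather API down");
      return "3°C and snowing";
    });
  };

  await run().catch(() => {}); // first attempt: weather step fails
  weatherWorks = true; // the "fix" is deployed
  const text = await run(); // retry resumes at the failed step
  return { llmCalls, text };
}
```

Running demo() completes with a single LLM call: the retry replays the journaled LLM step and only re-executes the weather step that failed.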
Next step: Follow the Tour of Agents to learn how to build agents with Restate and the Vercel AI SDK, the OpenAI Agents SDK, and more.