Build resilient workflows that include human approval steps or external signals. Sometimes you need to include a human evaluator, approval step, or another external signal in an agentic workflow. With Restate, you can model this by creating a Durable Promise and awaiting its completion (via a callback), without worrying about failures or interruptions.

How does Restate help?

The benefits of using Restate here are:
  • Durable promises: Create promises that survive process restarts and failures.
  • Automatic recovery: Resume exactly where you left off after any interruption.
  • Flexibility: Works with any LLM SDK (Vercel AI, LangChain, LiteLLM, etc.) and any programming language supported by Restate (TypeScript, Python, Go, etc.).
  • Cost-efficient waiting: Suspend execution during approval wait times; pay for active work, not idle time.

Example

Use ctx.awakeable() to create durable promises that can be resolved externally. The workflow suspends during the wait and resumes automatically when the promise is resolved.
// Imports assumed for this excerpt: the Vercel AI SDK for tool definitions,
// zod for schemas, and the Restate TypeScript SDK for the handler context.
// llmCall and notifyModerator are helpers defined in the full example on GitHub.
import { tool } from "ai";
import { z } from "zod";
import type { Context } from "@restatedev/restate-sdk";

const tools = {
  getHumanReview: tool({
    description: "Request human review if policy violation is uncertain.",
    inputSchema: z.object({}),
  }),
};

async function moderate(ctx: Context, { message }: { message: string }) {
  const prompt = `You are a content moderation agent. Decide if the content violates policy: ${message}`;
  const { text, toolCalls } = await ctx.run(
    "LLM call",
    // Use your preferred LLM SDK here
    async () => llmCall(prompt, tools),
    { maxRetryAttempts: 3 },
  );

  if (toolCalls?.[0]?.toolName === "getHumanReview") {
    // Create a recoverable approval promise
    const approval = ctx.awakeable<string>();
    await ctx.run("Ask review", () => notifyModerator(message, approval.id));

    // Suspend until moderator resolves the approval
    // Check the service logs to see how to resolve it over HTTP, e.g.:
    // curl http://localhost:8080/restate/awakeables/sign_.../resolve --json '"approved"'
    return approval.promise;
  }

  return text;
}
View on GitHub: TS / Python. Check out the SDK documentation for more details on the awakeables and durable promises API (TS / Python).
This pattern can be implemented with any of our SDKs and any AI SDK. If you need help with a specific SDK, please reach out to us via Discord or Slack.
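Awakeables can also be resolved programmatically from another Restate handler instead of over HTTP. The following is a minimal sketch and not part of the example above: the ReviewService name, the submitReview handler, and the { awakeableId, verdict } payload shape are all assumptions made for illustration.
import * as restate from "@restatedev/restate-sdk";

// Hypothetical moderator-facing service: a review UI could call this handler
// once a human has made a decision.
const reviewService = restate.service({
  name: "ReviewService",
  handlers: {
    submitReview: async (
      ctx: restate.Context,
      review: { awakeableId: string; verdict: string }, // assumed payload shape
    ) => {
      // Resolving the awakeable completes the durable promise and resumes
      // the suspended `moderate` invocation with the verdict.
      ctx.resolveAwakeable(review.awakeableId, review.verdict);
    },
  },
});

restate.endpoint().bind(reviewService).listen(9081); // any free port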

1. Requirements

  • AI SDK of your choice (e.g., OpenAI, LangChain, Pydantic AI, LiteLLM, etc.) to make LLM calls.
  • API key for your model provider.

2. Download the example

git clone https://github.com/restatedev/ai-examples.git &&
cd ai-examples/typescript-patterns &&
npm install

3. Start the Restate Server

restate-server

4. Start the Service

Export the API key of your model provider as an environment variable and then start the agent. For example, for OpenAI:
export OPENAI_API_KEY=your_openai_api_key
npm run dev

5. Register the services

Register the service deployment with the Restate Server, either via the UI at http://localhost:9070 or via the CLI, as sketched below.
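A minimal CLI sketch, assuming the service started in the previous step listens on the SDK's default port 9080:
restate deployments register http://localhost:9080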

6. Send a request

In the UI (http://localhost:9070), click on the moderate handler of the CallChainingService to open the playground and send a default request. Alternatively, send the request with curl, as sketched below.
This will block on the human approval step, so you will not see a response yet. In the Invocations Tab of the UI, you can see how the workflow suspends after waiting for a minute.
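A minimal curl sketch, assuming the service is registered under the name shown in the UI, the Restate ingress runs on its default port 8080, and the message is just a placeholder:
curl http://localhost:8080/CallChainingService/moderate --json '{"message": "Is this post okay?"}'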

7. Resolve the approval

Check the service logs to find the curl command to resolve the approval. It will look like this:
curl http://localhost:8080/restate/awakeables/{awakeable_id}/resolve --json '"approved"'
Replace {awakeable_id} with the actual ID from the logs.

8. Check the Restate UI

You can see in the Invocations Tab of the UI how the workflow suspends during the human approval step and resumes once the promise is resolved.