LiteLLM is a library that provides a unified interface to many model providers. You can combine it with Restate to build resilient agents tailored to your use case: wrap model calls in ctx.run so their results are durably recorded, and use Restate context actions to execute tools.

Learn more about LiteLLM

Have a look at the following resources to get started with Restate and LiteLLM:
  • Examples: The examples that implement the agent logic with Restate alone use LiteLLM to make the model calls.
  • Guides: Check out the Restate Python examples to learn how to use Restate with LiteLLM.

Learn more

  • Blog: AI Agents should be serverless