
Installation

LiteLLM Proxy Server is an open-source, self-hosted proxy server that exposes an OpenAI-compatible API, enabling seamless interaction with multiple LLM providers such as OpenAI, Azure, and IBM WatsonX. It simplifies model integration by providing a unified interface for diverse AI backends.

This guide will walk you through integrating LiteLLM Proxy Server with AG2, ensuring efficient AI agent orchestration with minimal setup.
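
The phrase "OpenAI-compatible API" is the key to this integration: once the proxy is running (installation is covered below), any OpenAI-style client can call it without provider-specific code. The following is a minimal sketch, assuming the proxy listens on http://0.0.0.0:4000 and exposes a hypothetical model alias named gpt-4o; both values are placeholders, not part of this guide's setup.

from openai import OpenAI

# Assumed: a LiteLLM proxy already running at http://0.0.0.0:4000
# exposing a model alias "gpt-4o" (placeholder names).
client = OpenAI(base_url="http://0.0.0.0:4000", api_key="not-needed")  # provider keys live in the proxy

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)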

Prerequisites

Before proceeding, ensure the following:

- A Python environment with pip available (used to install AG2).
- Docker installed and running (used to run the LiteLLM proxy server).
- An API key for at least one LLM provider you intend to route through LiteLLM (for example OpenAI, Azure, or IBM WatsonX).

Installation

Install AG2

AG2 is a powerful framework designed to simplify AI agent orchestration.

To install AG2, simply run the following command:

pip install ag2[openai]

Tip

If you have been using autogen or pyautogen, all you need to do is upgrade it using:

pip install -U autogen[openai]
or
pip install -U pyautogen[openai]
as pyautogen, autogen, and ag2 are aliases for the same PyPI package.
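
Before moving on, you can optionally confirm that the installation worked. Regardless of which alias you installed, the package is imported under the autogen name; the quick check below is a sketch, not part of the official guide.

import autogen

# The package imports as "autogen" whether installed via ag2, autogen, or pyautogen.
print(autogen.__version__)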

Install LiteLLM

LiteLLM runs as a lightweight proxy server, making it easier to integrate different LLM providers.

To install LiteLLM, download the latest Docker image:

docker pull ghcr.io/berriai/litellm:main-latest
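
With both pieces installed, connecting them is a matter of pointing AG2's OpenAI-style configuration at the proxy. The sketch below is illustrative only: it assumes the proxy has already been started (starting and configuring it is not covered in this section) on http://0.0.0.0:4000 with a model alias gpt-4o, and it uses a placeholder API key because real provider credentials are managed by the proxy itself.

from autogen import ConversableAgent

# Assumed: LiteLLM proxy running at http://0.0.0.0:4000 with a "gpt-4o" alias (placeholders).
litellm_config = {
    "config_list": [
        {
            "model": "gpt-4o",                  # alias defined in the proxy's own config
            "base_url": "http://0.0.0.0:4000",  # the proxy's OpenAI-compatible endpoint
            "api_key": "not-needed",            # provider keys are handled by the proxy
        }
    ],
}

assistant = ConversableAgent(
    name="assistant",
    llm_config=litellm_config,
    human_input_mode="NEVER",
)

reply = assistant.generate_reply(
    messages=[{"role": "user", "content": "Say hello in one sentence."}]
)
print(reply)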