Installation
LiteLLM Proxy Server is an open-source, self-hosted proxy server that exposes an OpenAI-compatible API, enabling seamless interaction with multiple LLM providers such as OpenAI, Azure, and IBM WatsonX. It simplifies model integration by providing a unified interface for diverse AI backends.
This guide will walk you through integrating LiteLLM Proxy Server with AG2, ensuring efficient AI agent orchestration with minimal setup.
Prerequisites
Before proceeding, ensure the following:
- Docker is installed. Refer to the Docker installation guide.
- (Optional) Install Postman for easier API request testing.
Installation
Install AG2
AG2 is a powerful framework designed to simplify AI agent orchestration.
To install AG2, run the following command:
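```bash
# Install AG2 from PyPI. The [openai] extra is an assumption here: it covers the
# OpenAI-compatible client used to talk to the LiteLLM proxy. Adjust extras to your setup.
pip install ag2[openai]
```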
Tip

If you have been using autogen or pyautogen, all you need to do is upgrade it using:
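```bash
# Upgrade an existing installation; either package name works,
# since they resolve to the same distribution.
pip install -U autogen
```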
pyautogen, autogen, and ag2 are aliases for the same PyPI package.

Install LiteLLM
LiteLLM runs as a lightweight proxy server, making it easier to integrate different LLM providers.
To install LiteLLM, download the latest Docker image:
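```bash
# Pull the LiteLLM proxy image from GitHub Container Registry.
# The tag shown here is the commonly documented one; check the LiteLLM docs for current releases.
docker pull ghcr.io/berriai/litellm:main-latest
```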