Note: For a standalone deep research implementation without Google Drive, see the Deep Research Agent reference project. This project extends it with Google Drive integration.
DataRoom Research is a multi-agent research system that combines the capabilities of DeepResearchAgent with GoogleDriveToolkit, built on the AG2 framework. It enhances OpenAI’s deep research agent concept by adding document handling and Google Drive integration.
This project combines two powerful agent systems: DeepResearchAgent for web research and GoogleDriveToolkit for document access.
The system also features a professional report writer agent that creates well-structured markdown reports with proper formatting.
TAGS: deep-research, data-retrieval, google-drive, document-analysis, multi-agent, report-writer, automation, research-assistant, web-scraping
DeepResearchAgent requires Python 3.11 or higher.
Install dependencies using uv:
uv sync
Install Playwright (required for web data retrieval):
uv run playwright install
For a detailed tutorial on the Google Drive Toolkit functionality, see the AG2 Google Drive Documentation.
Note: to run that notebook successfully, complete the following steps first.
To enable Google Drive features, you need OAuth credentials. Following Google’s Quickstart:

1. Download `credentials.json`
2. Place `credentials.json` in the project root directory
3. Generate the `token.json` file by running `gdrive_signin.py`

The system will automatically handle OAuth authentication on first run, and the files in your Google Drive account will be listed.
uv run python gdrive_signin.py
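What `gdrive_signin.py` presumably does can be sketched with the `google-auth-oauthlib` flow. The scope, function name, and file paths here are assumptions based on Google’s Quickstart, not this project’s actual code:

```python
import os.path

# Scope is an assumption; adjust to match the permissions the toolkit needs.
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

def get_credentials(token_path: str = "token.json",
                    secrets_path: str = "credentials.json"):
    """Load cached credentials, or run the browser OAuth flow once."""
    # Imports are deferred so this sketch can be read without the
    # google-auth libraries installed.
    from google.oauth2.credentials import Credentials
    from google_auth_oauthlib.flow import InstalledAppFlow

    if os.path.exists(token_path):
        # Reuse the token generated on a previous run.
        return Credentials.from_authorized_user_file(token_path, SCOPES)
    # First run: open a local browser window for the OAuth consent screen.
    flow = InstalledAppFlow.from_client_secrets_file(secrets_path, SCOPES)
    creds = flow.run_local_server(port=0)
    with open(token_path, "w") as f:
        f.write(creds.to_json())  # cache for later runs
    return creds
```

On subsequent runs the cached `token.json` is used directly, so the browser consent step happens only once.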
You can also copy the credentials.json and token.json files over to run the AG2 Google Drive Colab Notebook.
Run the main application with standard research capabilities:
uv run python main.py
With Google Drive capabilities enabled:
uv run python main.py --use-gdrive
For testing with simulated research responses:
uv run python main.py --use-fake
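The three invocations above differ only in their flags. A minimal sketch of how such a CLI could be wired with argparse; the flag names come from the commands above, while the parser structure is an assumption about `main.py`:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Flags mirror the invocations shown above; no flags runs standard research.
    parser = argparse.ArgumentParser(description="DataRoom Research entry point")
    parser.add_argument("--use-gdrive", action="store_true",
                        help="enable Google Drive document retrieval")
    parser.add_argument("--use-fake", action="store_true",
                        help="use simulated research responses for testing")
    return parser

args = build_parser().parse_args(["--use-gdrive"])
# args.use_gdrive is True; args.use_fake stays False
```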
This system supports using local language models through Ollama as an alternative to OpenAI’s API. This can provide privacy benefits, reduce costs, and enable offline usage.
Install Ollama by following the instructions at Ollama’s official website
Pull a compatible model (we recommend models with at least 7B parameters):
ollama pull deepseek-r1
ollama pull llama3:8b
Start the Ollama server:
ollama serve
Ollama runs a REST API on http://localhost:11434 by default. The Ollama configuration is included in OAI_CONFIG_LIST to use the local LLM with your agent.
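An OAI_CONFIG_LIST entry for a local Ollama model might look like the fragment below. The exact fields used in this project are an assumption; `api_type` and `client_host` follow AG2’s Ollama integration:

```json
[
  {
    "model": "deepseek-r1",
    "api_type": "ollama",
    "client_host": "http://localhost:11434"
  }
]
```

With the server running, agents configured from this entry route completions to the local model instead of OpenAI’s API.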
For more information or any questions, please refer to the documentation or reach out to us!
This project is licensed under the Apache License 2.0. See the LICENSE for details.