A multi-agent AI chat application that simulates Grok Heavy: several agents work on a query in parallel, and a finishing agent synthesizes their results into the final response.
- Python 3.8+
- pip package manager
- Access to an OpenAI-compatible API (NVIDIA API configured by default)
- SearXNG search service running on `localhost:8888` (for web search functionality)
- Clone and navigate to the project:

  ```bash
  git clone https://github.com/valerka1292/OpenHeavy.git
  cd OpenHeavy
  ```
- Install dependencies:

  ```bash
  make install
  # or manually: pip install -r requirements.txt
  ```
- Configure environment:

  ```bash
  cp .env.example .env
  # Edit .env with your API keys and settings
  ```

  Required settings in `.env`:

  ```bash
  # LLM API Configuration
  BASE_URL=https://integrate.api.nvidia.com/v1
  MODEL=qwen/qwen3-235b-a22b

  # Required API Keys
  API_KEY=your_actual_api_key_here
  FLASK_SECRET_KEY=your_secret_key_here
  ```
Optional settings:

```bash
# Search API (for web search functionality)
SEARCH_API_URL=http://localhost:8888/search

# Flask settings
FLASK_PORT=5000
FLASK_DEBUG=false

# Logging
LOG_LEVEL=INFO
LOG_FORMAT=json
```
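To see how these settings are typically consumed at startup, here is a minimal sketch of environment validation. The `load_config` helper is hypothetical (not part of the project); it only assumes the variable names and defaults listed above:

```python
import os

# Variables that must be present for the app to start (per the .env docs above)
REQUIRED = ["BASE_URL", "MODEL", "API_KEY", "FLASK_SECRET_KEY"]

def load_config(env=os.environ):
    """Hypothetical helper: validate required settings and apply documented defaults."""
    missing = [key for key in REQUIRED if not env.get(key)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    return {
        "base_url": env["BASE_URL"],
        "model": env["MODEL"],
        "api_key": env["API_KEY"],
        "flask_secret_key": env["FLASK_SECRET_KEY"],
        # Optional settings fall back to the defaults shown above
        "search_api_url": env.get("SEARCH_API_URL", "http://localhost:8888/search"),
        "flask_port": int(env.get("FLASK_PORT", "5000")),
        "flask_debug": env.get("FLASK_DEBUG", "false").lower() == "true",
    }
```

Failing fast like this at startup gives a clearer error than a mid-request crash when a key is missing.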
- Run the application:

  ```bash
  make run
  # or manually: python src/main.py
  ```
- Open your browser: navigate to http://localhost:5000
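If you want to verify from a script that the server came up, a small stdlib-only check like the following works (the `server_is_up` helper is an illustration, not part of the project; the URL assumes the default `FLASK_PORT=5000`):

```python
import urllib.request

def server_is_up(url="http://localhost:5000", timeout=3):
    """Return True if the given URL responds with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout all mean "not up"
        return False
```

This is handy in deployment scripts or CI where you need to wait for the app before running further checks.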