The Wren MCP Server is a Model Context Protocol (MCP) server that provides tools for interacting with Wren Engine to facilitate AI agent integration.
Before setting up the Wren MCP Server, ensure you have the following dependency installed:
- uv - A fast and efficient Python package manager.
The server requires the following environment variables to be set:
| Variable | Description |
|---|---|
| `WREN_URL` | The URL of the Wren Ibis server. |
| `CONNECTION_INFO_FILE` | The path to the required connection info file. |
| `MDL_PATH` | The path to the MDL file. |
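For example, the three variables can be exported in a shell session before starting the server (the values below are placeholders, not real paths):

```shell
# Placeholder values -- replace with your actual server URL and file paths.
export WREN_URL="localhost:8000"
export CONNECTION_INFO_FILE="/path/to/connection.json"
export MDL_PATH="/path/to/mdl.json"

# Confirm they are set before launching the server.
echo "WREN_URL=$WREN_URL"
```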
The following JSON is an example connection info file for a PostgreSQL data source. You can find the required fields for each data source in the source code.
```json
{
  "host": "localhost",
  "port": "5432",
  "user": "test",
  "password": "test",
  "database": "test"
}
```

In the MDL, the `dataSource` field is required to indicate which data source should be connected.
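As an illustration, a minimal MDL file carrying the `dataSource` field might look like the following. The surrounding fields here are a hypothetical sketch; consult the Wren Engine documentation for the full MDL schema:

```json
{
  "catalog": "wren",
  "schema": "public",
  "dataSource": "POSTGRES",
  "models": []
}
```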
Wren MCP Server supports an .env file for easier environment configuration. You can define all the required environment variables in this file.
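For instance, a `.env` file in the server's root directory might contain (values are placeholders):

```
WREN_URL=localhost:8000
CONNECTION_INFO_FILE=etc/connection.json
MDL_PATH=etc/mdl.json
```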
Use the `uv` command to create a virtual environment and activate it:
```shell
> uv venv
Using CPython 3.11.11
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
> source .venv/bin/activate
> uv run app/wren.py
Loaded MDL etc/mdl.json
Loaded connection info etc/pg_conneciton.json
```
You should see that the MDL and connection info are loaded. Then press Ctrl + C to terminate the process.
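Before wiring the server into an agent, you can sanity-check your environment first. The following is a hypothetical helper script (not part of the Wren codebase) that verifies the three required variables are set and that the referenced files exist:

```python
import os

# The three variables the Wren MCP Server reads at startup.
REQUIRED_VARS = ("WREN_URL", "CONNECTION_INFO_FILE", "MDL_PATH")

def check_env(env=None):
    """Return a list of problems; an empty list means the environment looks OK."""
    if env is None:
        env = os.environ
    problems = []
    for name in REQUIRED_VARS:
        value = env.get(name)
        if not value:
            problems.append(f"{name} is not set")
        elif name != "WREN_URL" and not os.path.isfile(value):
            # The two file variables must point at existing files.
            problems.append(f"{name} points to a missing file: {value}")
    return problems

if __name__ == "__main__":
    for problem in check_env():
        print("WARN:", problem)
```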
- If you already have a running Wren Engine, ensure that `WREN_URL` is correctly set to point to your server.
- If you don't have a running engine, you can start one using Docker:

```shell
cd docker
docker compose up
```
There are two ways to set the required environment variables:
- Set up a `.env` file in the root directory of the MCP server and make sure all required environment variables are properly configured in it.
- Set system environment variables in the MCP configuration. See the next step.
Create a configuration file with the following structure:
```json
{
  "mcpServers": {
    "wren": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/wren-engine/mcp-server",
        "run",
        "app/wren.py"
      ],
      "env": {
        "WREN_URL": "localhost:8000",
        "CONNECTION_INFO_FILE": "/path-to-connection-info/connection.json",
        "MDL_PATH": "/path-to-mdl/mdl.json"
      },
      "autoApprove": [],
      "disabled": false
    }
  }
}
```

- You may need to provide the full path to the `uv` executable in the `"command"` field. You can find it using:
  - macOS/Linux: `which uv`
  - Windows: `where uv`
- Ensure that the absolute path to the MCP server directory is used in the configuration.
- For more details, refer to the MCP Server Guide.
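To catch common mistakes such as relative paths or missing env entries, a quick validation script can help. The sketch below is a hypothetical checker, written under the assumption that your config follows the `mcpServers` structure shown above:

```python
import json
import os

REQUIRED_ENV = ("WREN_URL", "CONNECTION_INFO_FILE", "MDL_PATH")

def validate_config(config):
    """Check the mcpServers.wren entry of a parsed MCP config; return a list of issues."""
    issues = []
    server = config.get("mcpServers", {}).get("wren")
    if server is None:
        return ["no mcpServers.wren entry found"]
    args = server.get("args", [])
    # The path passed via --directory must be absolute.
    if "--directory" in args:
        i = args.index("--directory")
        if i + 1 >= len(args) or not os.path.isabs(args[i + 1]):
            issues.append("--directory is missing or not an absolute path")
    env = server.get("env", {})
    for name in REQUIRED_ENV:
        if name not in env:
            issues.append(f"missing env entry: {name}")
    return issues
```

For example, `validate_config(json.loads(config_text))` returns an empty list when the configuration looks well-formed.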
The following AI agents are compatible with the Wren MCP Server and can use the MCP configuration above:
You can ask the AI agent to perform a health check for Wren Engine.
Now, you can start asking questions through your AI agent and interact with Wren Engine. Tip: prime your agent with a short instruction so it knows how to use the Wren MCP tools.
Recommended prompt:
Use the get_wren_guide() tool to learn how to use Wren Engine and discover available tools and examples.
Optional follow-ups:
- "Open the Wren guide."
- "What Wren MCP tools are available?"
- "Show me the available tables in Wren Engine."
- "Query Wren Engine to get ... (your question here)."
- Wren Engine Documentation: Wren AI
- MCP Protocol Guide: Model Context Protocol