Draft

44 commits
ef2bd10
Changes for azure-ai-projects release v2.0.1
dargilco Mar 6, 2026
38e2bfe
x
dargilco Mar 6, 2026
95df964
Set beta version
dargilco Mar 6, 2026
d04d08b
Re-emit
dargilco Mar 9, 2026
06c4c26
fix
dargilco Mar 9, 2026
86d53c5
Rename env varrs name
howieleung Mar 10, 2026
67a8d79
Revert "Rename env varrs name"
howieleung Mar 10, 2026
c357ed8
Better Exception messages when you try to use preview features, and "…
dargilco Mar 10, 2026
7602ddb
marking finetuning pause and resume operations as live extended tests…
jayesh-tanna Mar 10, 2026
c2e7e57
Rename env vars (#45599)
howieleung Mar 10, 2026
c497c29
Add CSV and synthetic data generation evaluation samples (#45603)
aprilk-ms Mar 11, 2026
db3ea9d
Fix azure-ai-projects linting errors with pylint version 4.0.5 (#45628)
dargilco Mar 11, 2026
cb4ef7b
Re-emit from latest TypeSpec in branch `feature/foundry-staging` (#45…
dargilco Mar 12, 2026
c15af4f
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 12, 2026
090b923
Remove
dargilco Mar 12, 2026
d27b152
Update project status to Production/Stable
dargilco Mar 13, 2026
b70d349
Classes `UpdateMemoriesLROPollingMethod` and `AsyncUpdateMemoriesLROP…
dargilco Mar 13, 2026
cb6748f
Unit-tests to make sure "Foundry-Features" HTTP request header is add…
dargilco Mar 16, 2026
902ac13
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 17, 2026
bf7b0ad
Re-emit from TypeSpec, with support for api-key auth (#45748)
dargilco Mar 18, 2026
98b8f73
New samples (#45775)
howieleung Mar 18, 2026
8cb669a
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 19, 2026
ed5cba5
Fix missing IntelliSense from returned OpenAI client (#45800)
dargilco Mar 19, 2026
822c347
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 19, 2026
3b48acd
LLM validation use 5.2 and chat completion (#45816)
howieleung Mar 20, 2026
aa8f213
update .env.template
howieleung Mar 20, 2026
2dcb2c1
Merge branch 'feature/azure-ai-projects/2.0.2' of https://github.com/…
howieleung Mar 20, 2026
54b2815
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 23, 2026
e6c5933
making trace context propagation enabled by default (#45830)
M-Hietala Mar 23, 2026
8dc5bb8
Custom Eval - Upload (#45678)
w-javed Mar 23, 2026
2ad4c44
added toolset samples and more (#45832)
howieleung Mar 23, 2026
efd8570
Fix failing unit-tests "test_foundry_features_header" and failing qua…
dargilco Mar 23, 2026
b035800
Fix pyright and pylint errors in evaluator upload operations (#45867)
w-javed Mar 24, 2026
d948b5f
A better way to handle removing "foundry_features" input argument fro…
dargilco Mar 24, 2026
89c796b
Another minor round of updates, after using new Python emitter (dev b…
dargilco Mar 25, 2026
07c3321
Merge remote-tracking branch 'origin/main' into feature/azure-ai-proj…
dargilco Mar 25, 2026
ba32dd5
Re-emit from latest TypeSpec in branch feature/foundry-staging (#45924)
dargilco Mar 26, 2026
002b4f1
Revert some api-key related changes that were not supposed to be comm…
dargilco Mar 26, 2026
25a0a7a
disabling test that fail with pydantic beta version (#45912)
M-Hietala Mar 26, 2026
1e49fd5
Add underscore to FoundryFeaturesOptInKeys and AgentDefinitionOptInKe…
dargilco Mar 26, 2026
39fb228
Configurations: 'specification/ai-foundry/data-plane/Foundry/tspconf…
azure-sdk Mar 26, 2026
af45442
Revert "Configurations: 'specification/ai-foundry/data-plane/Foundry…
dargilco Mar 26, 2026
072a40c
Merge branch 'main' into feature/azure-ai-projects/2.0.2
dargilco Mar 26, 2026
d55d005
Configurations: 'specification/ai-foundry/data-plane/Foundry/tspconf…
azure-sdk Mar 26, 2026
10 changes: 7 additions & 3 deletions sdk/ai/azure-ai-projects/.env.template
@@ -20,9 +20,10 @@ AZURE_AI_PROJECTS_CONSOLE_LOGGING=

# Project endpoint has the format:
# `https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name>`
-AZURE_AI_PROJECT_ENDPOINT=
-AZURE_AI_MODEL_DEPLOYMENT_NAME=
-AZURE_AI_AGENT_NAME=
+FOUNDRY_PROJECT_ENDPOINT=
+FOUNDRY_PROJECT_API_KEY=
+FOUNDRY_MODEL_NAME=
+FOUNDRY_AGENT_NAME=
CONVERSATION_ID=
CONNECTION_NAME=
MEMORY_STORE_CHAT_MODEL_DEPLOYMENT_NAME=
@@ -65,6 +66,9 @@ PAUSED_FINE_TUNING_JOB_ID=
AZURE_SUBSCRIPTION_ID=
AZURE_RESOURCE_GROUP=

# Used in all samples
LLM_VALIDATION_PROJECT_ENDPOINT=

# Used in Fine-tuning samples
AZURE_AI_PROJECTS_AZURE_SUBSCRIPTION_ID=
AZURE_AI_PROJECTS_AZURE_RESOURCE_GROUP=
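The `.env.template` hunks above rename the sample variables from the `AZURE_AI_*` prefix to `FOUNDRY_*`. A small helper along these lines (hypothetical, not part of the SDK) can fail fast when a stale `.env` still exports the old names:

```python
import os

# Hypothetical helper (not part of azure-ai-projects): validate that the
# renamed sample environment variables are all set before running a sample.
REQUIRED_VARS = [
    "FOUNDRY_PROJECT_ENDPOINT",
    "FOUNDRY_PROJECT_API_KEY",
    "FOUNDRY_MODEL_NAME",
    "FOUNDRY_AGENT_NAME",
]


def load_sample_env(required=REQUIRED_VARS):
    """Return a dict of the required variables, raising if any are unset."""
    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise EnvironmentError(
            "Missing environment variables: " + ", ".join(missing)
        )
    return {name: os.environ[name] for name in required}
```

A sample script would call `load_sample_env()` once at startup and get a clear error message instead of a `KeyError` deep inside client construction.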
35 changes: 35 additions & 0 deletions sdk/ai/azure-ai-projects/CHANGELOG.md
@@ -1,5 +1,40 @@
# Release History

## 2.1.0 (2026-03-26)

skip changelog generation for data-plane package and please add changelog manually.

## 2.0.2 (Unreleased)

### Features Added

* Placeholder

### Breaking Changes

* Tracing: trace context propagation is enabled by default when tracing is enabled.

### Bugs Fixed

* Placeholder

### Sample updates

* Renamed environment variable `AZURE_AI_PROJECT_ENDPOINT` to `FOUNDRY_PROJECT_ENDPOINT` in all samples.
* Renamed environment variable `AZURE_AI_MODEL_DEPLOYMENT_NAME` to `FOUNDRY_MODEL_NAME` in all samples.
* Renamed environment variable `AZURE_AI_AGENT_NAME` to `FOUNDRY_AGENT_NAME` in all samples.
* Added structured inputs + file upload sample (`sample_agent_structured_inputs_file_upload.py`) demonstrating passing an uploaded file ID to an agent at runtime.
* Added structured inputs + File Search sample (`sample_agent_file_search_structured_inputs.py`) demonstrating configuring File Search tool resources via structured inputs.
* Added structured inputs + Code Interpreter sample (`sample_agent_code_interpreter_structured_inputs.py`) demonstrating passing an uploaded file ID to Code Interpreter via structured inputs.
* Added CSV evaluation sample (`sample_evaluations_builtin_with_csv.py`) demonstrating evaluation with an uploaded CSV dataset.
* Added synthetic data evaluation samples (`sample_synthetic_data_agent_evaluation.py`) and (`sample_synthetic_data_model_evaluation.py`).
* Added Chat Completions basic samples (`sample_chat_completions_basic.py`, `sample_chat_completions_basic_async.py`) demonstrating chat completions calls using `AIProjectClient` + the OpenAI-compatible client.
* Added Toolsets CRUD samples (`sample_toolsets_crud.py`, `sample_toolsets_crud_async.py`) demonstrating `project_client.beta.toolsets` create/get/update/list/delete.

### Other Changes

* Placeholder

## 2.0.1 (2026-03-12)

### Bugs Fixed
101 changes: 72 additions & 29 deletions sdk/ai/azure-ai-projects/README.md
@@ -54,8 +54,9 @@ To report an issue with the client library, or request additional features, plea
* Python 3.9 or later.
* An [Azure subscription][azure_sub].
* A [project in Microsoft Foundry](https://learn.microsoft.com/azure/foundry/how-to/create-projects).
-* A Foundry project endpoint URL of the form `https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name`. It can be found in your Microsoft Foundry Project home page. Below we will assume the environment variable `AZURE_AI_PROJECT_ENDPOINT` was defined to hold this value.
-* An Entra ID token for authentication. Your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need:
+* A Foundry project endpoint URL of the form `https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name`. It can be found in your Microsoft Foundry Project home page. Below we will assume the environment variable `FOUNDRY_PROJECT_ENDPOINT` was defined to hold this value.
+* To authenticate using API key, you will need the "Project API key" as shown in your Microsoft Foundry Project home page.
+* To authenticate using Entra ID, your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need:
* An appropriate role assignment. See [Role-based access control in Microsoft Foundry portal](https://learn.microsoft.com/azure/foundry/concepts/rbac-foundry). Role assignment can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal.
* [Azure CLI](https://learn.microsoft.com/cli/azure/install-azure-cli) installed.
* You are logged into your Azure account by running `az login`.
@@ -74,9 +75,46 @@ pip show azure-ai-projects

## Key concepts

-### Create and authenticate the client with Entra ID
+### Create and authenticate the client with API key

To construct a synchronous client using a context manager:

```python
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.projects import AIProjectClient

endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"]
api_key = os.environ["FOUNDRY_PROJECT_API_KEY"]

with (
    AIProjectClient(endpoint=endpoint, credential=AzureKeyCredential(api_key)) as project_client
):
```

To construct an asynchronous client, install the additional package [aiohttp](https://pypi.org/project/aiohttp/):

```bash
pip install aiohttp
```

and run:

-Entra ID is the only authentication method supported at the moment by the client.
```python
import os
import asyncio
from azure.core.credentials import AzureKeyCredential
from azure.ai.projects.aio import AIProjectClient

endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"]
api_key = os.environ["FOUNDRY_PROJECT_API_KEY"]

async with (
    AIProjectClient(endpoint=endpoint, credential=AzureKeyCredential(api_key)) as project_client
):
```

### Create and authenticate the client with Entra ID

To construct a synchronous client using a context manager:

@@ -85,9 +123,11 @@

```python
import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

+endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"]
+
with (
    DefaultAzureCredential() as credential,
-    AIProjectClient(endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"], credential=credential) as project_client,
+    AIProjectClient(endpoint=endpoint, credential=credential) as project_client,
):
```

@@ -105,9 +145,11 @@

```python
import asyncio
from azure.ai.projects.aio import AIProjectClient
from azure.identity.aio import DefaultAzureCredential

+endpoint = os.environ["FOUNDRY_PROJECT_ENDPOINT"]
+
async with (
    DefaultAzureCredential() as credential,
-    AIProjectClient(endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"], credential=credential) as project_client,
+    AIProjectClient(endpoint=endpoint, credential=credential) as project_client,
):
```

@@ -117,20 +159,20 @@

Your Microsoft Foundry project may have one or more AI models deployed. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an authenticated [OpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#usage) client from the [openai](https://pypi.org/project/openai/) package, and execute example multi-turn "Responses" calls.

-The code below assumes the environment variable `AZURE_AI_MODEL_DEPLOYMENT_NAME` is defined. It's the deployment name of an AI model in your Foundry Project. See "Build" menu, under "Models" (First column of the "Deployments" table).
+The code below assumes the environment variable `FOUNDRY_MODEL_NAME` is defined. It's the deployment name of an AI model in your Foundry Project. See "Build" menu, under "Models" (First column of the "Deployments" table).

<!-- SNIPPET:sample_responses_basic.responses -->

```python
with project_client.get_openai_client() as openai_client:
    response = openai_client.responses.create(
-        model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
+        model=os.environ["FOUNDRY_MODEL_NAME"],
        input="What is the size of France in square miles?",
    )
    print(f"Response output: {response.output_text}")

    response = openai_client.responses.create(
-        model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
+        model=os.environ["FOUNDRY_MODEL_NAME"],
        input="And what is the capital city?",
        previous_response_id=response.id,
    )
```
@@ -145,7 +187,7 @@ See the "responses" folder in the [package samples][samples] for additional samp

The `.agents` property on the `AIProjectClient` gives you access to all Agent operations. Agents use an extension of the OpenAI Responses protocol, so you will need to get an `OpenAI` client to do Agent operations, as shown in the example below.

-The code below assumes environment variable `AZURE_AI_MODEL_DEPLOYMENT_NAME` is defined. It's the deployment name of an AI model in your Foundry Project. See "Build" menu, under "Models" (First column of the "Deployments" table).
+The code below assumes environment variable `FOUNDRY_MODEL_NAME` is defined. It's the deployment name of an AI model in your Foundry Project. See "Build" menu, under "Models" (First column of the "Deployments" table).

See the "agents" folder in the [package samples][samples] for an extensive set of samples, including streaming, tool usage and memory store usage.

@@ -156,7 +198,7 @@

```python
with project_client.get_openai_client() as openai_client:
    agent = project_client.agents.create_version(
        agent_name="MyAgent",
        definition=PromptAgentDefinition(
-            model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
+            model=os.environ["FOUNDRY_MODEL_NAME"],
            instructions="You are a helpful assistant that answers general questions",
        ),
    )
```
@@ -177,7 +219,7 @@ with project_client.get_openai_client() as openai_client:
conversation_id=conversation.id,
items=[{"type": "message", "role": "user", "content": "And what is the capital city?"}],
)
-print(f"Added a second user message to the conversation")
+print("Added a second user message to the conversation")

response = openai_client.responses.create(
conversation=conversation.id,
@@ -229,7 +271,7 @@ the `code_interpreter_call` output item:

```python
code = next((output.code for output in response.output if output.type == "code_interpreter_call"), "")
-print(f"Code Interpreter code:")
+print("Code Interpreter code:")
print(code)
```

@@ -246,7 +288,9 @@

```python
asset_file_path = os.path.abspath(
)

# Upload the CSV file for the code interpreter
-file = openai_client.files.create(purpose="assistants", file=open(asset_file_path, "rb"))
+with open(asset_file_path, "rb") as f:
+    file = openai_client.files.create(purpose="assistants", file=f)

tool = CodeInterpreterTool(container=AutoCodeInterpreterToolParam(file_ids=[file.id]))
```

@@ -273,9 +317,8 @@

```python
print(f"Vector store created (id: {vector_store.id})")
asset_file_path = os.path.abspath(os.path.join(os.path.dirname(__file__), "../assets/product_info.md"))

# Upload file to vector store
-file = openai_client.vector_stores.files.upload_and_poll(
-    vector_store_id=vector_store.id, file=open(asset_file_path, "rb")
-)
+with open(asset_file_path, "rb") as f:
+    file = openai_client.vector_stores.files.upload_and_poll(vector_store_id=vector_store.id, file=f)
print(f"File uploaded to vector store (id: {file.id})")

tool = FileSearchTool(vector_store_ids=[vector_store.id])
```
@@ -415,7 +458,7 @@ Call external APIs defined by OpenAPI specifications without additional client-s
<!-- SNIPPET:sample_agent_openapi.tool_declaration-->

```python
-with open(weather_asset_file_path, "r") as f:
+with open(weather_asset_file_path, "r", encoding="utf-8") as f:
    openapi_weather = cast(dict[str, Any], jsonref.loads(f.read()))

tool = OpenApiTool(
```

@@ -765,7 +808,7 @@

```python
with (
    project_client.get_openai_client() as openai_client,
):
    agent = project_client.agents.create_version(
-        agent_name=os.environ["AZURE_AI_AGENT_NAME"],
+        agent_name=os.environ["FOUNDRY_AGENT_NAME"],
        definition=PromptAgentDefinition(
            model=model_deployment_name,
            instructions="You are a helpful assistant that answers general questions",
```
@@ -1175,24 +1218,24 @@ Trace context propagation allows client-side spans generated by the Projects SDK

This feature ensures that all operations within a distributed trace share the same trace ID, providing end-to-end visibility across your application and Azure services in your observability backend (such as Azure Monitor).

-To enable trace context propagation, set the `AZURE_TRACING_GEN_AI_ENABLE_TRACE_CONTEXT_PROPAGATION` environment variable to `true`:
-
-If no value is provided for the `enable_trace_context_propagation` parameter with the `AIProjectInstrumentor.instrument()` call and the environment variable is not set, trace context propagation defaults to `false` (opt-in).
+Trace context propagation is **enabled by default** when tracing is enabled (for example through `configure_azure_monitor` or the `AIProjectInstrumentor().instrument()` call). To disable it, set the `AZURE_TRACING_GEN_AI_ENABLE_TRACE_CONTEXT_PROPAGATION` environment variable to `false`, or pass `enable_trace_context_propagation=False` to the `AIProjectInstrumentor().instrument()` call.

-**Important Security and Privacy Considerations:**
-
-- **Trace IDs**: When trace context propagation is enabled, trace IDs are sent to Azure OpenAI and other external services.
-- **Request Correlation**: Trace IDs allow Azure services to correlate requests from the same session or user across multiple API calls, which may have privacy implications depending on your use case.
-- **Opt-in by Design**: This feature is disabled by default to give you explicit control over when trace context is propagated to external services.
-
-Only enable trace context propagation after carefully reviewing your observability, privacy and security requirements.
+**When does the change take effect?**
+
+- Changes to `enable_trace_context_propagation` (whether via `instrument()` or the environment variable) only affect OpenAI clients obtained via `get_openai_client()` **after** the change is applied. Previously acquired clients are unaffected.
+- To apply the new setting to all clients, call `AIProjectInstrumentor().instrument(enable_trace_context_propagation=<value>)` before acquiring your OpenAI clients, or re-acquire the clients after making the change.
+
+**Security and Privacy Considerations:**
+
+- **Trace IDs are sent to external services**: The `traceparent` and `tracestate` headers from your client-side originating spans are injected into requests sent to the service. This enables end-to-end distributed tracing, but note that the trace identifier may be shared beyond the initial API call.
+- **Enabled by Default**: If you have privacy or compliance requirements that prohibit sharing trace identifiers with services, disable trace context propagation by setting `enable_trace_context_propagation=False` or the environment variable to `false`.
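The acquisition-time behavior described in this section can be illustrated with a small stand-alone sketch (toy code, not the SDK's implementation): each client copies the propagation flag at creation time, so flipping the flag afterwards does not affect clients that already exist.

```python
# Toy model of snapshot-at-acquisition: the instrumentor holds a mutable
# setting, and each "client" captures the value current at creation time.
class ToyInstrumentor:
    def __init__(self):
        self.enable_trace_context_propagation = True  # enabled by default

    def instrument(self, enable_trace_context_propagation=True):
        self.enable_trace_context_propagation = enable_trace_context_propagation


class ToyClient:
    def __init__(self, instrumentor):
        # Snapshot taken here, mirroring how a client acquired via
        # get_openai_client() is configured at acquisition time.
        self.propagates = instrumentor.enable_trace_context_propagation


instrumentor = ToyInstrumentor()
client_a = ToyClient(instrumentor)  # acquired while propagation is enabled
instrumentor.instrument(enable_trace_context_propagation=False)
client_b = ToyClient(instrumentor)  # acquired after disabling

print(client_a.propagates)  # True: previously acquired clients are unaffected
print(client_b.propagates)  # False: new clients pick up the change
```

This is why the text above recommends calling `instrument()` before acquiring your OpenAI clients, or re-acquiring them after changing the setting.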

#### Controlling baggage propagation

When trace context propagation is enabled, you can separately control whether the baggage header is included. By default, only `traceparent` and `tracestate` headers are propagated. To also include the `baggage` header, set the `AZURE_TRACING_GEN_AI_TRACE_CONTEXT_PROPAGATION_INCLUDE_BAGGAGE` environment variable to `true`:

If no value is provided for the `enable_baggage_propagation` parameter with the `AIProjectInstrumentor.instrument()` call and the environment variable is not set, the value defaults to `false` and baggage is not included.

**Note:** The `enable_baggage_propagation` flag is evaluated dynamically on each request, so changes take effect **immediately** for all clients that have the trace context propagation hook registered. However, the hook is only registered on clients acquired via `get_openai_client()` **while trace context propagation was enabled**. Clients acquired when trace context propagation was disabled will never propagate baggage, regardless of the `enable_baggage_propagation` value.

**Why is baggage propagation separate?**

The baggage header can contain arbitrary key-value pairs added anywhere in your application's trace context. Unlike trace IDs (which are randomly generated identifiers), baggage may contain:
@@ -1357,7 +1400,7 @@ By default logs redact the values of URL query strings, the values of some HTTP
```python
project_client = AIProjectClient(
    credential=DefaultAzureCredential(),
-    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
+    endpoint=os.environ["FOUNDRY_PROJECT_ENDPOINT"],
    logging_enable=True
)
```
6 changes: 5 additions & 1 deletion sdk/ai/azure-ai-projects/_metadata.json
@@ -2,5 +2,9 @@
  "apiVersion": "v1",
  "apiVersions": {
    "Azure.AI.Projects": "v1"
-  }
+  },
+  "commit": "bd70f1fbb6f1691a3107d791c7aa5d1dede46f01",
+  "repository_url": "https://github.com/Azure/azure-rest-api-specs",
+  "typespec_src": "specification/ai-foundry/data-plane/Foundry",
+  "emitterVersion": "0.61.1"
  }