Develop Agents using the Azure AI Foundry platform, leveraging an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and other LLM providers.
The client library targets a single service version, v1, of the AI Foundry data plane REST APIs.
Documentation
Various documentation is available to help you get started.
Getting started
Prerequisites
- Java Development Kit (JDK) with version 8 or above
- Azure Subscription
Adding the package to your product
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-agents</artifactId>
    <version>2.0.0-beta.1</version>
</dependency>
Authentication
Azure Identity package provides the default implementation for authenticating the client.
Key concepts
Create an AgentsClient
To interact with the Azure Agents service, you'll need to create an instance of the AgentsClient class.
AgentsClient agentsClient = new AgentsClientBuilder()
.credential(new DefaultAzureCredentialBuilder().build())
.endpoint(endpoint)
.buildClient();
Alternatively, you can create an asynchronous client using the AgentsAsyncClient class.
AgentsAsyncClient agentsAsyncClient = new AgentsClientBuilder()
.credential(new DefaultAzureCredentialBuilder().build())
.endpoint(endpoint)
.buildAsyncClient();
The Agents client library has four sub-client groups covering the different operations that can be performed:
- AgentsClient/AgentsAsyncClient: Perform operations related to agents, such as creating, retrieving, updating, and deleting agents.
- ConversationsClient/ConversationsAsyncClient: Handle conversation operations. See OpenAI's Conversations API documentation for more information.
- ResponsesClient/ResponsesAsyncClient: Handle responses operations. See OpenAI's Responses API documentation for more information.
- MemoryStoresClient/MemoryStoresAsyncClient (preview): Manage memory stores for agents. This operation group requires the MemoryStores=V1Preview feature opt-in flag, which is automatically set by the SDK on every request.
Each sub-client is built from the same AgentsClientBuilder. The Agents client library takes the official OpenAI SDK as a dependency, which is used for all operations except those corresponding to direct agent management.
AgentsClientBuilder builder = new AgentsClientBuilder()
.credential(new DefaultAzureCredentialBuilder().build())
.endpoint(endpoint);
// Agents sub-clients
AgentsClient agentsClient = builder.buildClient();
AgentsAsyncClient agentsAsyncClient = builder.buildAsyncClient();
// Conversations sub-clients.
ConversationsClient conversationsClient = builder.buildConversationsClient();
ConversationsAsyncClient conversationsAsyncClient = builder.buildConversationsAsyncClient();
// Responses sub-clients.
ResponsesClient responsesClient = builder.buildResponsesClient();
ResponsesAsyncClient responsesAsyncClient = builder.buildResponsesAsyncClient();
The OpenAI Official Java SDK is imported transitively and can be accessed from either the ResponsesClient or the ConversationsClient using the getOpenAIClient() method. Alternatively, you can build an OpenAIClient or OpenAIClientAsync directly from the AgentsClientBuilder:
OpenAIClient openAIClient = builder.buildOpenAIClient();
OpenAIClientAsync openAIAsyncClient = builder.buildOpenAIAsyncClient();
Agent tools
The SDK supports a variety of tools that can be attached to agent definitions. Some tools are generally available, while others are in preview and may change in future releases.
Generally available tools:
| Tool class | Description |
|---|---|
| AzureAISearchTool | Azure AI Search |
| AzureFunctionTool | Azure Functions |
| BingGroundingTool | Bing grounding |
| CodeInterpreterTool | Code interpreter |
| FileSearchTool | File search |
| FunctionTool | Custom function calling |
| ImageGenTool | Image generation |
| OpenApiTool | OpenAPI spec-based tools |
Preview tools:
| Tool class | Description |
|---|---|
| A2APreviewTool | Agent-to-agent communication |
| BingCustomSearchPreviewTool | Bing custom search |
| BrowserAutomationPreviewTool | Browser automation |
| ComputerUsePreviewTool | Computer use |
| McpTool | Model Context Protocol (MCP) |
| MemorySearchPreviewTool | Memory search |
| MicrosoftFabricPreviewTool | Microsoft Fabric |
| SharepointPreviewTool | SharePoint grounding |
| WebSearchPreviewTool | Web search |
Experimental features and opt-in flags
Some features require an opt-in via the Foundry-Features HTTP header. The SDK provides two enums for these flags:
- AgentDefinitionFeatureKeys — Used when creating or updating agents. Passed as a parameter to createAgent, updateAgent, createAgentVersion, and related methods. Available keys: HOSTED_AGENTS_V1_PREVIEW, WORKFLOW_AGENTS_V1_PREVIEW.
- FoundryFeaturesOptInKeys — Defines all known opt-in keys, including: HOSTED_AGENTS_V1_PREVIEW, WORKFLOW_AGENTS_V1_PREVIEW, EVALUATIONS_V1_PREVIEW, SCHEDULES_V1_PREVIEW, RED_TEAMS_V1_PREVIEW, INSIGHTS_V1_PREVIEW, MEMORY_STORES_V1_PREVIEW.
Note: The MemoryStoresClient automatically sets the MemoryStores=V1Preview opt-in flag on every request.
// OpenAI SDK ResponsesService accessed from ResponsesClient
ResponsesClient responsesClient = builder.buildResponsesClient();
ResponsesService responsesService = responsesClient.getOpenAIClient();
// OpenAI SDK ConversationService accessed from ConversationsClient
ConversationsClient conversationsClient = builder.buildConversationsClient();
ConversationService conversationService = conversationsClient.getOpenAIClient();
Using OpenAI's official library
If you prefer to use OpenAI's official Java client library directly, include that dependency in your project and follow the instructions in the linked repository. Additionally, you will have to set up your OpenAIClient as shown below:
OpenAIClient client = OpenAIOkHttpClient.builder()
.baseUrl(endpoint.endsWith("/") ? endpoint + "openai/v1" : endpoint + "/openai/v1")
.credential(BearerTokenCredential.create(AuthenticationUtil.getBearerTokenSupplier(
new DefaultAzureCredentialBuilder().build(), "https://ai.azure.com/.default")))
.build();
ResponseCreateParams responseRequest = new ResponseCreateParams.Builder()
.input("Hello, how can you help me?")
.model(model)
.build();
Response result = client.responses().create(responseRequest);
Remember to adjust your base URL so that your AI Foundry project endpoint's path ends with openai/v1, as shown in the code snippet above.
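The endpoint adjustment can be factored into a small helper. This is a sketch only; the class and method names (`EndpointUtil`, `toOpenAIBaseUrl`) are illustrative, not part of the SDK:

```java
public final class EndpointUtil {
    // Hypothetical helper (not an SDK API): appends the openai/v1 path segment
    // expected by the OpenAI-compatible surface, tolerating a trailing slash
    // on the project endpoint.
    public static String toOpenAIBaseUrl(String endpoint) {
        return endpoint.endsWith("/") ? endpoint + "openai/v1" : endpoint + "/openai/v1";
    }

    public static void main(String[] args) {
        // prints: https://my-project.services.ai.azure.com/openai/v1
        System.out.println(toOpenAIBaseUrl("https://my-project.services.ai.azure.com"));
    }
}
```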
Examples
Prompt Agent
This example shows how to create the context necessary for a PromptAgent to work. Note that the way context is handled in this scenario lets you share it across multiple agents.
Create an Agent
An agent can be created as shown in the following code snippet:
PromptAgentDefinition promptAgentDefinition = new PromptAgentDefinition("gpt-4o");
AgentVersionDetails agent = agentsClient.createAgentVersion("my-agent", promptAgentDefinition);
This returns an AgentVersionDetails object, which contains the information necessary to create an AgentReference. But first it's necessary to set up the Conversation and its messages so that Responses can be obtained with a centralized context.
Create conversation
First we need to create our Conversation object so we can attach items to it:
Conversation conversation = conversationsClient.create();
The conversation.id() value is the reference we will use to append messages to this Conversation. Conversation objects can be used by multiple agents and serve as a centralized source of context. To add items:
conversationsClient.items().create(
ItemCreateParams.builder()
.conversationId(conversation.id())
.addItem(EasyInputMessage.builder()
.role(EasyInputMessage.Role.SYSTEM)
.content("You are a helpful assistant that speaks like a pirate.")
.build()
).addItem(EasyInputMessage.builder()
.role(EasyInputMessage.Role.USER)
.content("Hello, agent!")
.build()
).build()
);
Text generation with Responses
The final step ties everything together: we pass the AgentReference and the conversation.id() as parameters for the Response creation:
AgentReference agentReference = new AgentReference(agent.getName()).setVersion(agent.getVersion());
Response response = responsesClient.createAzureResponse(
new AzureCreateResponseOptions().setAgentReference(agentReference),
ResponseCreateParams.builder().conversation(conversation.id()));
// To extract Azure-specific response details:
AzureCreateResponseDetails azureResults = ResponsesUtils.getAzureFields(response);
Using Agent tools
Agents can be enhanced with specialized tools for various capabilities. For complete working examples, see the tools/ folder under samples.
In the sections below, tools are organized by their Foundry connection requirements: built-in tools, which do not require a Foundry connection, and connection-based tools, which do.
Built-in Tools
These tools work immediately without requiring external connections.
Code Interpreter (documentation)
Write and run Python code in a sandboxed environment, process files, and work with diverse data formats:
// Create a CodeInterpreterTool with default auto container configuration
CodeInterpreterTool tool = new CodeInterpreterTool();
See the full sample in CodeInterpreterSync.java.
File Search (documentation)
Search through files in a vector store for knowledge retrieval:
// Create a FileSearchTool with the vector store ID
FileSearchTool tool = new FileSearchTool(Collections.singletonList(vectorStore.id()));
See the full sample in FileSearchSync.java.
Image Generation (documentation)
Generate images from text descriptions:
// Create image generation tool with model, quality, and size
ImageGenTool imageGenTool = new ImageGenTool()
.setModel(ImageGenToolModel.fromString(imageModel))
.setQuality(ImageGenToolQuality.LOW)
.setSize(ImageGenToolSize.fromString("1024x1024"));
See the full sample in ImageGenerationSync.java.
Web Search (Preview) (documentation)
Search the web for current information:
// Create a WebSearchPreviewTool
WebSearchPreviewTool tool = new WebSearchPreviewTool();
See the full sample in WebSearchSync.java.
Computer Use (Preview) (documentation)
Interact with computer interfaces through simulated actions and screenshots:
ComputerUsePreviewTool tool = new ComputerUsePreviewTool(
ComputerEnvironment.WINDOWS,
1026,
769
);
See the full sample in ComputerUseSync.java.
Model Context Protocol (MCP) (documentation)
Connect agents to external MCP servers:
// Uses gitmcp.io to expose a GitHub repository as an MCP-compatible server
McpTool tool = new McpTool("api-specs")
.setServerUrl("https://gitmcp.io/Azure/azure-rest-api-specs")
.setRequireApproval("always");
See the full sample in McpSync.java.
OpenAPI (documentation)
Call external APIs defined by OpenAPI specifications without additional client-side code:
// Load the OpenAPI spec from a JSON file
Map<String, BinaryData> spec = OpenApiFunctionDefinition.readSpecFromFile(
SampleUtils.getResourcePath("assets/httpbin_openapi.json"));
OpenApiTool tool = new OpenApiTool(
new OpenApiFunctionDefinition(
"httpbin_get",
spec,
new OpenApiAnonymousAuthDetails())
.setDescription("Get request metadata from an OpenAPI endpoint."));
See the full sample in OpenApiSync.java.
Function Tool (documentation)
Define custom functions that allow agents to interact with external APIs, databases, or application logic:
Map<String, Object> locationProp = new LinkedHashMap<String, Object>();
locationProp.put("type", "string");
locationProp.put("description", "The city and state, e.g. Seattle, WA");
Map<String, Object> unitProp = new LinkedHashMap<String, Object>();
unitProp.put("type", "string");
unitProp.put("enum", Arrays.asList("celsius", "fahrenheit"));
Map<String, Object> properties = new LinkedHashMap<String, Object>();
properties.put("location", locationProp);
properties.put("unit", unitProp);
Map<String, BinaryData> parameters = new HashMap<String, BinaryData>();
parameters.put("type", BinaryData.fromObject("object"));
parameters.put("properties", BinaryData.fromObject(properties));
parameters.put("required", BinaryData.fromObject(Arrays.asList("location", "unit")));
parameters.put("additionalProperties", BinaryData.fromObject(false));
FunctionTool tool = new FunctionTool("get_weather", parameters, true)
.setDescription("Get the current weather in a given location");
See the full sample in FunctionCallSync.java.
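The parameters map above encodes a standard JSON Schema object; the BinaryData wrapper only carries each fragment over the wire. As an SDK-independent sketch, the schema the model receives is equivalent to the following plain-map structure (class name `WeatherSchemaSketch` is illustrative only):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public final class WeatherSchemaSketch {
    // Illustration only: builds the same JSON Schema shape as the FunctionTool
    // example, using plain maps instead of the SDK's BinaryData wrapper.
    public static Map<String, Object> weatherSchema() {
        Map<String, Object> location = new LinkedHashMap<>();
        location.put("type", "string");
        location.put("description", "The city and state, e.g. Seattle, WA");

        Map<String, Object> unit = new LinkedHashMap<>();
        unit.put("type", "string");
        unit.put("enum", Arrays.asList("celsius", "fahrenheit"));

        Map<String, Object> properties = new LinkedHashMap<>();
        properties.put("location", location);
        properties.put("unit", unit);

        Map<String, Object> schema = new LinkedHashMap<>();
        schema.put("type", "object");
        schema.put("properties", properties);
        schema.put("required", Arrays.asList("location", "unit"));
        schema.put("additionalProperties", false);
        return schema;
    }

    public static void main(String[] args) {
        System.out.println(weatherSchema());
    }
}
```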
Azure Functions
Integrate Azure Functions with agents to extend capabilities via serverless compute. Functions are invoked through Azure Storage Queue triggers, allowing asynchronous execution of custom logic:
// Create Azure Function tool with Storage Queue bindings
AzureFunctionTool azureFunctionTool = new AzureFunctionTool(
new AzureFunctionDefinition(
new AzureFunctionDefinitionDetails("queue_trigger", parameters)
.setDescription("Get weather for a given location"),
new AzureFunctionBinding(
new AzureFunctionStorageQueue(queueServiceEndpoint, inputQueueName)),
new AzureFunctionBinding(
new AzureFunctionStorageQueue(queueServiceEndpoint, outputQueueName))
)
);
After calling responsesClient.createAzureResponse(), the agent enqueues function arguments to the input queue. Your Azure Function processes the request and returns results via the output queue.
See the full sample in AzureFunctionSync.java.
Memory Search (Preview) (documentation)
The Memory Search tool adds memory to an agent, allowing the agent's AI model to search for past information related to the current user prompt:
// Create memory search tool
MemorySearchPreviewTool tool = new MemorySearchPreviewTool(memoryStore.getName(), scope)
.setUpdateDelaySeconds(1);
See the full sample in MemorySearchSync.java showing how to create an agent with a memory store and use it across multiple conversations.
Connection-Based Tools
These tools require configuring connections in your Microsoft Foundry project and use a projectConnectionId.
Azure AI Search (documentation)
Integrate with Azure AI Search indexes for powerful knowledge retrieval and semantic search capabilities:
// Create Azure AI Search tool with index configuration
AzureAISearchTool aiSearchTool = new AzureAISearchTool(
new AzureAISearchToolResource(Arrays.asList(
new AISearchIndexResource()
.setProjectConnectionId(connectionId)
.setIndexName(indexName)
.setQueryType(AzureAISearchQueryType.SIMPLE)
))
);
See the full sample in AzureAISearchSync.java.
Bing Grounding (documentation)
Ground agent responses with real-time web search results from Bing to provide up-to-date information:
// Create Bing grounding tool with connection configuration
BingGroundingTool bingTool = new BingGroundingTool(
new BingGroundingSearchToolParameters(Arrays.asList(
new BingGroundingSearchConfiguration(bingConnectionId)
))
);
See the full sample in BingGroundingSync.java.
Bing Custom Search (Preview) (documentation)
Warning: Grounding with Bing Custom Search uses Grounding with Bing, which has additional costs and terms: terms of use and privacy statement. Customer data will flow outside the Azure compliance boundary.
Use custom-configured Bing search instances for domain-specific or filtered web search results:
// Create Bing Custom Search tool with connection and instance configuration
BingCustomSearchPreviewTool bingCustomSearchTool = new BingCustomSearchPreviewTool(
new BingCustomSearchToolParameters(Arrays.asList(
new BingCustomSearchConfiguration(connectionId, instanceName)
))
);
See the full sample in BingCustomSearchSync.java.
Microsoft Fabric (Preview) (documentation)
Query data from Microsoft Fabric data sources:
// Create Microsoft Fabric tool with connection configuration
MicrosoftFabricPreviewTool fabricTool = new MicrosoftFabricPreviewTool(
new FabricDataAgentToolParameters()
.setProjectConnections(Arrays.asList(
new ToolProjectConnection(fabricConnectionId)
))
);
See the full sample in FabricSync.java.
Microsoft SharePoint (Preview) (documentation)
Search through SharePoint documents for grounding:
// Create SharePoint grounding tool with connection configuration
SharepointPreviewTool sharepointTool = new SharepointPreviewTool(
new SharepointGroundingToolParameters()
.setProjectConnections(Arrays.asList(
new ToolProjectConnection(sharepointConnectionId)
))
);
See the full sample in SharePointGroundingSync.java.
Browser Automation (Preview) (documentation)
Interact with web pages through browser automation:
// Create browser automation tool with connection configuration
BrowserAutomationPreviewTool browserTool = new BrowserAutomationPreviewTool(
new BrowserAutomationToolParameters(
new BrowserAutomationToolConnectionParameters(connectionId)
)
);
See the full sample in BrowserAutomationSync.java.
Agent-to-Agent (A2A) (Preview) (documentation)
Enable agent-to-agent communication with remote A2A endpoints:
// Create agent-to-agent tool with connection ID
A2APreviewTool a2aTool = new A2APreviewTool()
.setProjectConnectionId(a2aConnectionId);
See the full sample in AgentToAgentSync.java.
MCP with Project Connection (documentation)
MCP integration using project-specific connections for accessing connected MCP servers:
// Create MCP tool with project connection authentication
McpTool mcpTool = new McpTool("api-specs")
.setServerUrl("https://api.githubcopilot.com/mcp")
.setProjectConnectionId(mcpConnectionId)
.setRequireApproval("always");
See the full sample in McpWithConnectionSync.java.
OpenAPI with Project Connection (documentation)
Call external APIs defined by OpenAPI specifications using project connection authentication:
// Create OpenAPI tool with project connection authentication
OpenApiTool openApiTool = new OpenApiTool(
new OpenApiFunctionDefinition(
"httpbin_get",
spec,
new OpenApiProjectConnectionAuthDetails(
new OpenApiProjectConnectionSecurityScheme(connectionId)))
.setDescription("Get request metadata from an OpenAPI endpoint."));
See the full sample in OpenApiWithConnectionSync.java.
Streaming responses
The ResponsesClient and ResponsesAsyncClient support streaming, which allows you to process response events as they arrive rather than waiting for the full response. This is useful for displaying text to users in real time and observing tool execution progress.
Synchronous streaming
The synchronous streaming methods return IterableStream<ResponseStreamEvent>, which can be consumed with a standard for-each loop. Use the ResponseAccumulator from the OpenAI SDK to collect events into a final Response:
// Use ResponseAccumulator to collect streamed events into a final Response
ResponseAccumulator responseAccumulator = ResponseAccumulator.create();
// Stream response - text is printed as it arrives
IterableStream<ResponseStreamEvent> events =
responsesClient.createStreamingAzureResponse(
new AzureCreateResponseOptions().setAgentReference(agentReference),
ResponseCreateParams.builder()
.input("Tell me a short story about a brave explorer."));
for (ResponseStreamEvent event : events) {
responseAccumulator.accumulate(event);
event.outputTextDelta()
.ifPresent(textEvent -> System.out.print(textEvent.delta()));
}
System.out.println(); // newline after streamed text
// Access the complete accumulated response
Response response = responseAccumulator.response();
System.out.println("\nResponse ID: " + response.id());
See the full samples in SimpleStreamingSync.java, FunctionCallStreamingSync.java, and CodeInterpreterStreamingSync.java.
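Conceptually, each outputTextDelta event carries one fragment of the final text, and concatenating the deltas in arrival order reproduces the complete output — which is what ResponseAccumulator does for you. A minimal, SDK-independent sketch of that semantics (class and method names are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public final class DeltaSketch {
    // Illustration only: joining streamed text deltas in arrival order
    // yields the same text as the final accumulated response.
    public static String joinDeltas(List<String> deltas) {
        StringBuilder sb = new StringBuilder();
        for (String delta : deltas) {
            sb.append(delta); // in a real stream, print delta here for live display
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        List<String> deltas = Arrays.asList("Once upon ", "a time, ", "a brave explorer...");
        // prints: Once upon a time, a brave explorer...
        System.out.println(joinDeltas(deltas));
    }
}
```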
Asynchronous streaming
The asynchronous streaming methods return Flux<ResponseStreamEvent>, integrating naturally with Reactor pipelines:
// Use ResponseAccumulator to collect streamed events into a final Response
ResponseAccumulator responseAccumulator = ResponseAccumulator.create();
// Stream response asynchronously - text is printed as each chunk arrives
return responsesAsyncClient.createStreamingAzureResponse(
new AzureCreateResponseOptions().setAgentReference(agentReference),
ResponseCreateParams.builder()
.input("Tell me a short story about a brave explorer."))
.doOnNext(event -> {
responseAccumulator.accumulate(event);
event.outputTextDelta()
.ifPresent(textEvent -> System.out.print(textEvent.delta()));
})
.then(Mono.fromCallable(() -> {
    System.out.println(); // newline after streamed text
    // Access the complete accumulated response
    Response response = responseAccumulator.response();
    System.out.println("\nResponse ID: " + response.id());
    return response;
}));
See the full samples in SimpleStreamingAsync.java, FunctionCallStreamingAsync.java, and CodeInterpreterStreamingAsync.java.
Structured inputs
Structured inputs allow you to define named parameters on an agent that get substituted into its prompt template at runtime. This is useful when you want the same agent definition to handle different users or contexts by simply changing the input values.
Define structured inputs on an agent
When creating the agent, declare each structured input with a description and whether it is required. Use {{inputName}} placeholders in the instructions to reference them:
// Create an agent with structured input definitions
Map<String, StructuredInputDefinition> structuredInputDefinitions = new LinkedHashMap<>();
structuredInputDefinitions.put("userName",
new StructuredInputDefinition().setDescription("User's name").setRequired(true));
structuredInputDefinitions.put("userRole",
new StructuredInputDefinition().setDescription("User's role").setRequired(true));
AgentVersionDetails agent = agentsClient.createAgentVersion("structured-input-agent",
new PromptAgentDefinition(model)
.setInstructions("You are a helpful assistant. "
+ "The user's name is {{userName}} and their role is {{userRole}}. "
+ "Greet them and confirm their details.")
.setStructuredInputs(structuredInputDefinitions));
Create a response with structured input values
When creating a response, pass a Map<String, BinaryData> whose keys match the structured input names declared on the agent. The values are substituted into the prompt template before the model processes the request:
// Build the structured input values that match the agent's definitions
Map<String, BinaryData> structuredInputValues = new LinkedHashMap<>();
structuredInputValues.put("userName", BinaryData.fromObject("Alice Smith"));
structuredInputValues.put("userRole", BinaryData.fromObject("Senior Developer"));
// Create a response using AzureCreateResponse, which flattens agent_reference
// and structured_inputs as top-level properties in the request body
Response response = responsesClient.createAzureResponse(
new AzureCreateResponseOptions()
.setAgentReference(new AgentReference(agent.getName()).setVersion(agent.getVersion()))
.setStructuredInputs(structuredInputValues),
ResponseCreateParams.builder().input("Hello! Can you confirm my details?")
);
Streaming is also supported via createStreamingAzureResponse, which returns an IterableStream<ResponseStreamEvent> (sync) or Flux<ResponseStreamEvent> (async).
See the full sample in CreateResponseWithStructuredInput.java.
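The substitution is performed server-side by the service; the following sketch only illustrates the template semantics (`renderTemplate` is a hypothetical helper, not an SDK method):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public final class TemplateSketch {
    // Illustration only: the service performs this substitution server-side.
    // Replaces each {{name}} placeholder with the matching structured input value.
    public static String renderTemplate(String template, Map<String, String> inputs) {
        String result = template;
        for (Map.Entry<String, String> e : inputs.entrySet()) {
            result = result.replace("{{" + e.getKey() + "}}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> inputs = new LinkedHashMap<>();
        inputs.put("userName", "Alice Smith");
        inputs.put("userRole", "Senior Developer");
        // prints: The user's name is Alice Smith and their role is Senior Developer.
        System.out.println(renderTemplate(
            "The user's name is {{userName}} and their role is {{userRole}}.", inputs));
    }
}
```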
Service API versions
The client library targets the latest service API version by default. The service client builder accepts an optional service API version parameter to specify which API version to communicate with.
Select a service API version
You have the flexibility to explicitly select a supported service API version when initializing a service client via the service client builder. This ensures that the client can communicate with services using the specified API version.
When selecting an API version, it is important to verify that there are no breaking changes compared to the latest API version. If there are significant differences, API calls may fail due to incompatibility.
Always ensure that the chosen API version is fully supported and operational for your specific use case and that it aligns with the service's versioning policy.
Troubleshooting
Next steps
Contributing
For details on contributing to this repository, see the contributing guide.
- Fork it
- Create your feature branch (git checkout -b my-new-feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin my-new-feature)
- Create new Pull Request