
Commit 0856861

Merge pull request #5809 from MicrosoftDocs/main
Merged by Learn.Build PR Management system
2 parents: 5b7339b + 83e5f1f

497 files changed (+1338 additions, -1431 deletions)


articles/ai-foundry/concepts/prompt-flow.md

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@ ms.custom:
 - build-2024
 - ignite-2024
 ms.topic: concept-article
-ms.date: 03/18/2025
+ms.date: 06/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/foundry-models/how-to/monitor-models.md

Lines changed: 9 additions & 5 deletions

@@ -5,7 +5,7 @@ author: ssalgadodev
 ms.author: ssalgado
 ms.service: azure-ai-model-inference
 ms.topic: how-to
-ms.date: 4/30/2025
+ms.date: 06/30/2025
 manager: scottpolly
 ms.reviewer: fasantia
 reviewer: santiagxf

@@ -26,7 +26,7 @@ To use monitoring capabilities for model deployments in Foundry Models, you need
 * An Azure AI services resource. For more information, see [Create an Azure AI Services resource](../../model-inference/how-to/quickstart-create-resources.md).

 > [!TIP]
-> If you are using Serverless API Endpoints and you want to take advantage of monitoring capabilities explained in this document, [migrate your Serverless API Endpoints to Foundry Models](../../model-inference/how-to/quickstart-ai-project.md).
+> If you're using Serverless API Endpoints and you want to take advantage of monitoring capabilities explained in this document, [migrate your Serverless API Endpoints to Foundry Models](../../model-inference/how-to/quickstart-ai-project.md).

 * At least one model deployment.

@@ -54,11 +54,15 @@ You can view metrics within Azure AI Foundry portal. To view them, follow these

 1. Select the tab **Metrics**.

-1. You can see a high level view of the most common metrics you may be interested about.
+1. You can access an overview of common metrics that might be of interest. For cost-related metrics, use the Azure Cost Management deep link, which provides access to detailed post-consumption cost metrics in the Cost analysis section located in the Azure portal. Cost data in Azure portal displays actual post-consumption charges for model consumption, including other AI resources within Azure AI Foundry. Follow this link for a full list of [AI resources](https://azure.microsoft.com/products/ai-services#tabs-pill-bar-oc14f0_tab0). There's approximately a five hour delay from the billing event to when it can be viewed in Azure portal cost analysis.

 :::image type="content" source="../media/monitor-models/deployment-metrics.png" alt-text="Screenshot showing the metrics displayed for model deployments in Azure AI Foundry portal." lightbox="../media/monitor-models/deployment-metrics.png":::

-1. To slice, filter, or view model details about the metrics you can **Open in Azure Monitor**, where you have more advanced options.
+> [!IMPORTANT]
+> The **Azure Cost Management deep link** provides a direct link within the Azure portal, allowing users to access detailed cost metrics for deployed AI models. This deep link integrates with the Azure Cost Analysis service view, offering transparent and actionable insights into model-level costs.
+> The deep link directs users to the Cost Analysis view in the Azure portal, providing a one-click experience to view deployments per resource, including input/output token cost/consumption. To view cost data, you need at least read access for an Azure account. For information about assigning access to Microsoft Cost Management data, see [Assign access to data](/azure/cost-management-billing/costs/assign-access-acm-data).
+
+1. You can **view and analyze metrics with Azure Monitor metrics explorer** to further slice and filter your model deployment metrics.

 :::image type="content" source="../media/monitor-models/deployment-metrics-azmonitor.png" alt-text="Screenshot showing the option to open model deployment metrics in Azure Monitor." lightbox="../media/monitor-models/deployment-metrics-azmonitor.png":::

@@ -91,7 +95,7 @@ To use Azure Monitor, follow these steps:

 :::image type="content" source="../media/monitor-models/azmon-add-filter.png" alt-text="Screenshot showing how to apply a filter to a metric." lightbox="../media/monitor-models/azmon-add-filter.png":::

-1. It is useful to break down specific metrics by some of the dimensions. The following example shows how to break down the number of requests made to the resource by model by using the option **Add splitting**:
+1. It's useful to break down specific metrics by some of the dimensions. The following example shows how to break down the number of requests made to the resource by model by using the option **Add splitting**:

 :::image type="content" source="../media/monitor-models/azmon-add-splitting.png" alt-text="Screenshot showing how to split the metric by a given dimension." lightbox="../media/monitor-models/azmon-add-splitting.png":::
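For context on the **Add splitting** step above: the same requests-by-model breakdown can also be pulled from the command line with the Azure Monitor CLI. A minimal sketch, assuming the resource ID points at your Azure AI services resource and that the metric and dimension names (`Requests`, `Model`) match what the resource actually emits; list the metric definitions first if you're unsure.

```bash
# Sketch only: split a model-deployment metric by dimension from the CLI.
# The metric name "Requests" and the dimension "Model" are assumptions --
# confirm them with list-definitions before relying on the output.
RESOURCE_ID="/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<account>"

# Discover which metrics and dimensions the resource exposes.
az monitor metrics list-definitions --resource "$RESOURCE_ID" --output table

# Request counts over the last day, split by model, in one-hour buckets.
az monitor metrics list \
  --resource "$RESOURCE_ID" \
  --metric "Requests" \
  --dimension "Model" \
  --aggregation Count \
  --interval 1h \
  --offset 1d
```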

articles/ai-foundry/how-to/develop/simulator-interaction-data.md

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@ ms.custom:
 - build-2024
 - references_regions
 ms.topic: how-to
-ms.date: 10/24/2024
+ms.date: 06/30/2025
 ms.reviewer: minthigpen
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/flow-process-image.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ ms.service: azure-ai-foundry
 ms.custom:
 - build-2024
 ms.topic: how-to
-ms.date: 02/14/2025
+ms.date: 06/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ ms.service: azure-ai-foundry
 ms.custom:
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 06/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/content-safety-tool.md

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ ms.custom:
 - ignite-2023
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 06/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/embedding-tool.md

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ ms.custom:
 - ignite-2023
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/index-lookup-tool.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ ms.service: azure-ai-foundry
 ms.custom:
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/llm-tool.md

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ ms.custom:
 - ignite-2023
 - build-2024
 ms.topic: reference
-ms.date: 1/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/prompt-flow-tools-overview.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ ms.service: azure-ai-foundry
 ms.custom:
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/prompt-tool.md

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ ms.custom:
 - ignite-2023
 - build-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/python-tool.md

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ manager: scottpolly
 ms.service: azure-ai-foundry
 ms.custom: ignite-2023, devx-track-python, build-2024, ignite-2024
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/rerank-tool.md

Lines changed: 1 addition & 1 deletion

@@ -5,7 +5,7 @@ description: This article introduces you to the Rerank tool for flows in Azure A
 manager: scottpolly
 ms.service: azure-ai-foundry
 ms.topic: reference
-ms.date: 01/29/2025
+ms.date: 6/30/2025
 ms.reviewer: jingyizhu
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/prompt-flow-tools/serp-api-tool.md

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ ms.custom:
 - ignite-2023
 - build-2024
 ms.topic: reference
-ms.date: 01/31/2025
+ms.date: 6/30/2025
 ms.reviewer: none
 ms.author: lagayhar
 author: lgayhardt
3 binary files changed (contents not shown)

articles/ai-services/autoscale.md

Lines changed: 4 additions & 4 deletions

@@ -7,16 +7,16 @@ ms.service: azure-ai-services
 ms.custom:
 - ignite-2023
 ms.topic: how-to
-ms.date: 01/10/2025
+ms.date: 06/30/2025
 ---

 # Autoscale AI services limits

-This article provides guidance for how customers can access higher rate limits on certain Azure AI services resources.
+This article provides guidance on how customers can access higher rate limits on certain Azure AI services resources.

 ## Overview

-Each Azure AI services resource has a pre-configured static call rate (transactions per second) which limits the number of concurrent calls that customers can make to the backend service in a given time frame. The autoscale feature will automatically increase/decrease a customer's resource's rate limits based on near-real-time resource usage metrics and backend service capacity metrics.
+Each Azure AI services resource has a pre-configured static call rate (transactions per second) which limits the number of concurrent calls that customers can make to the service in a given time frame. The autoscale feature will automatically increase/decrease a customer's resource's rate limits based on near-real-time resource usage metrics and backend service capacity metrics.

 ## Get started with the autoscale feature

@@ -84,7 +84,7 @@ Be aware of potential errors and their consequences. If a bug in your client app
 Yes, you can disable the autoscale feature through Azure portal or CLI and return to your default call rate limit setting. If your resource was previously approved for a higher default TPS, it goes back to that rate. It can take up to five minutes for the changes to go into effect.


-## Next steps
+## Related content

 * [Plan and Manage costs for Azure AI services](../ai-foundry/how-to/costs-plan-manage.md).
 * [Optimize your cloud investment with Microsoft Cost Management](/azure/cost-management-billing/costs/cost-mgt-best-practices?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn).

articles/ai-services/computer-vision/computer-vision-how-to-install-containers.md

Lines changed: 9 additions & 9 deletions

@@ -6,7 +6,7 @@ author: PatrickFarley
 manager: nitinme
 ms.service: azure-ai-vision
 ms.topic: how-to
-ms.date: 06/26/2024
+ms.date: 06/30/2025
 ms.collection: "ce-skilling-fresh-tier2, ce-skilling-ai-copilot"
 ms.update-cycle: 365-days
 ms.author: pafarley

@@ -15,7 +15,7 @@ keywords: on-premises, OCR, Docker, container

 # Install Azure AI Vision 3.2 GA Read OCR container

-Containers let you run the Azure AI Vision APIs in your own environment and can help you meet specific security and data governance requirements. In this article you'll learn how to download, install, and run the Azure AI Vision Read (OCR) container.
+Containers let you run the Azure AI Vision APIs in your own environment and can help you meet specific security and data governance requirements. In this article you learn how to download, install, and run the Azure AI Vision Read (OCR) container.

 The Read container allows you to extract printed and handwritten text from images and documents in JPEG, PNG, BMP, PDF, and TIFF file formats. For more information on the Read service, see the [Read API how-to guide](how-to/call-read-api.md).

@@ -30,7 +30,7 @@ The Read 3.2 OCR container is the latest GA model and provides:
 * Support for larger documents and images.
 * Confidence scores.
 * Support for documents with both print and handwritten text.
-* Ability to extract text from only selected page(s) in a document.
+* Ability to extract text from only selected pages in a document.
 * Choose text line output order from default to a more natural reading order for Latin languages only.
 * Text line classification as handwritten style or not for Latin languages only.

@@ -133,7 +133,7 @@ mcr.microsoft.com/azure-cognitive-services/vision/read:3.2-model-2022-04-30
 More [examples](./computer-vision-resource-container-config.md#example-docker-commands) of the `docker run` command are available.

 > [!IMPORTANT]
-> The `Eula`, `Billing`, and `ApiKey` options must be specified to run the container; otherwise, the container won't start. For more information, see [Billing](#billing).
+> The `Eula`, `Billing`, and `ApiKey` options must be specified to run the container; otherwise, the container won't start. For more information, see [Billing](#billing).

 <!--If you need higher throughput (for example, when processing multi-page files), consider deploying multiple containers [on a Kubernetes cluster](deploy-computer-vision-on-premises.md), using [Azure Storage](/azure/storage/common/storage-account-create) and [Azure Queue](/azure/storage/queues/storage-queues-introduction).-->
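For readers landing on this hunk without the surrounding article: a minimal sketch of the `docker run` call the note above refers to, using the image tag quoted in the hunk header. The port, memory, and CPU values are illustrative assumptions; `{ENDPOINT_URI}` and `{API_KEY}` come from your own Azure resource.

```bash
# Sketch only: start the Read 3.2 OCR container locally.
# Eula, Billing, and ApiKey are required; without them the container won't start.
# Resource sizing below is an assumption -- size it for your workload.
docker run --rm -it -p 5000:5000 --memory 16g --cpus 8 \
  mcr.microsoft.com/azure-cognitive-services/vision/read:3.2-model-2022-04-30 \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```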

@@ -143,7 +143,7 @@ To find your connection string:

 1. Navigate to **Storage accounts** on the Azure portal, and find your account.
 2. Select on **Access keys** in the left pane.
-3. Your connection string will be located below **Connection string**
+3. Your connection string is located below **Connection string**

 [!INCLUDE [Running multiple containers on the same host](../includes/cognitive-services-containers-run-multiple-same-host.md)]
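The connection string from that portal walkthrough can also be fetched without the portal; a sketch with the Azure CLI, where the account and resource group names are placeholders.

```bash
# Sketch: print a storage account's connection string from the CLI.
az storage account show-connection-string \
  --name <storage-account-name> \
  --resource-group <resource-group> \
  --query connectionString \
  --output tsv
```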

@@ -159,9 +159,9 @@ Use the host, `http://localhost:5000`, for container APIs. You can view the Swag

 ### Asynchronous Read

-You can use the `POST /vision/v3.2/read/analyze` and `GET /vision/v3.2/read/operations/{operationId}` operations in concert to asynchronously read an image, similar to how the Azure AI Vision service uses those corresponding REST operations. The asynchronous POST method will return an `operationId` that is used as the identifier to the HTTP GET request.
+You can use the `POST /vision/v3.2/read/analyze` and `GET /vision/v3.2/read/operations/{operationId}` operations in concert to asynchronously read an image, similar to how the Azure AI Vision service uses those corresponding REST operations. The asynchronous POST method returns an `operationId` that is used as the identifier to the HTTP GET request.

-From the swagger UI, select the `Analyze` to expand it in the browser. Then select **Try it out** > **Choose file**. In this example, we'll use the following image:
+From the swagger UI, select the `Analyze` to expand it in the browser. Then select **Try it out** > **Choose file**. In this example, we use the following image:

 ![tabs vs spaces](media/tabs-vs-spaces.png)
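Outside the Swagger UI, the asynchronous flow described in this hunk can be exercised with plain `curl` against the container host named earlier (`http://localhost:5000`). A sketch; the image URL and operation ID are placeholders.

```bash
# Sketch: submit an image for asynchronous Read, then poll for the result.
# The routes are the ones named in the article; the input URL is an example.
curl -i -X POST "http://localhost:5000/vision/v3.2/read/analyze" \
  -H "Content-Type: application/json" \
  --data '{"url": "<publicly-accessible-image-url>"}'

# The 202 response carries an operation-location header; GET it (or build the
# URL from the returned operationId) until the status reports "succeeded".
curl "http://localhost:5000/vision/v3.2/read/operations/<operationId>"
```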

@@ -174,7 +174,7 @@ When the asynchronous POST has run successfully, it returns an **HTTP 202** stat
 server: Kestrel
 ```

-The `operation-location` is the fully qualified URL and is accessed via an HTTP GET. Here is the JSON response from executing the `operation-location` URL from the preceding image:
+The `operation-location` is the fully qualified URL and is accessed via an HTTP GET. Here's the JSON response from executing the `operation-location` URL from the preceding image:

 ```json
 {
@@ -284,7 +284,7 @@ The `operation-location` is the fully qualified URL and is accessed via an HTTP


 > [!IMPORTANT]
-> If you deploy multiple Read OCR containers behind a load balancer, for example, under Docker Compose or Kubernetes, you must have an external cache. Because the processing container and the GET request container might not be the same, an external cache stores the results and shares them across containers. For details about cache settings, see [Configure Azure AI Vision Docker containers](./computer-vision-resource-container-config.md).
+> If you deploy multiple Read OCR containers behind a load balancer, for example, under Docker Compose or Kubernetes, you must have an external cache. Because the processing container and the `GET` request container might not be the same, an external cache stores the results and shares them across containers. For details about cache settings, see [Configure Azure AI Vision Docker containers](./computer-vision-resource-container-config.md).

 ### Synchronous read

articles/ai-services/computer-vision/concept-image-retrieval.md

Lines changed: 4 additions & 4 deletions

@@ -61,12 +61,12 @@ The following are the main steps of the image retrieval process using Multimodal

 ### Relevance score

-The image and video retrieval services return a field called "relevance." The term "relevance" denotes a measure of similarity between a query and image or video frame embeddings. The relevance score is composed of two parts:
-1. The cosine similarity (that falls in the range of [0,1]) between the query and image or video frame embeddings.
-1. A metadata score, which reflects the similarity between the query and the metadata associated with the image or video frame.
+The image retrieval service returns a field called "relevance." The term "relevance" denotes a measure of similarity between a query and image embeddings. The relevance score is composed of two parts:
+1. The cosine similarity (that falls in the range of [0,1]) between the query and image embeddings.
+1. A metadata score, which reflects the similarity between the query and the metadata associated with the image.

 > [!IMPORTANT]
-> The relevance score is a good measure to rank results such as images or video frames with respect to a single query. However, the relevance score cannot be accurately compared across queries. Therefore, it's not possible to easily map the relevance score to a confidence level. It's also not possible to trivially create a threshold algorithm to eliminate irrelevant results based solely on the relevance score.
+> The relevance score is a good measure to rank results such as images with respect to a single query. However, the relevance score cannot be accurately compared across queries. Therefore, it's not possible to easily map the relevance score to a confidence level. It's also not possible to trivially create a threshold algorithm to eliminate irrelevant results based solely on the relevance score.

 ## Input requirements
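For reference, the cosine-similarity component mentioned in that hunk is the standard normalized dot product between the query embedding and the image embedding (reported in the [0, 1] range here, per the text); how the metadata score is combined with it isn't specified in this diff.

```latex
\mathrm{cosine\_similarity}(q, v) = \frac{q \cdot v}{\lVert q \rVert \, \lVert v \rVert}
```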
7272

articles/ai-services/computer-vision/how-to/call-analyze-image-40.md

Lines changed: 2 additions & 2 deletions

@@ -9,7 +9,7 @@ ms.update-cycle: 365-days
 ms.author: pafarley
 ms.service: azure-ai-vision
 ms.topic: how-to
-ms.date: 06/01/2024
+ms.date: 06/30/2025
 ms.custom: devx-track-python, devx-track-extended-java, devx-track-js
 zone_pivot_groups: programming-languages-computer-vision
 ---

@@ -49,7 +49,7 @@ This article demonstrates how to call the Image Analysis 4.0 API to return infor

 ::: zone-end

-## Next steps
+## Related content

 * Explore the [concept articles](../concept-describe-images-40.md) to learn more about each feature.
 * Explore the SDK code samples on GitHub:

articles/ai-services/computer-vision/includes/how-to-guides/analyze-image-40-rest.md

Lines changed: 6 additions & 6 deletions

@@ -4,18 +4,18 @@ manager: nitinme
 ms.service: ai-services
 ms.subservice: computer-vision
 ms.topic: include
-ms.date: 08/01/2023
+ms.date: 06/30/2025
 ms.collection: "ce-skilling-fresh-tier2, ce-skilling-ai-copilot"
 ms.update-cycle: 365-days
 ms.author: pafarley
 ---

 ## Prerequisites

-This guide assumes you have successfully followed the steps mentioned in the [quickstart](/azure/ai-services/computer-vision/quickstarts-sdk/image-analysis-client-library-40) page. This means:
+This guide assumes you've successfully followed the steps mentioned in the [quickstart](/azure/ai-services/computer-vision/quickstarts-sdk/image-analysis-client-library-40) page. This means:

-* You have <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="created a Computer Vision resource" target="_blank">created a Computer Vision resource </a> and obtained a key and endpoint URL.
-* You have successfully made a `curl.exe` call to the service (or used an alternative tool). You modify the `curl.exe` call based on the examples here.
+* You've <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="created a Computer Vision resource" target="_blank">created a Computer Vision resource </a> and obtained a key and endpoint URL.
+* You've successfully made a `curl.exe` call to the service (or used an alternative tool). You modify the `curl.exe` call based on the examples here.

 ## Authenticate against the service

@@ -69,7 +69,7 @@ A populated URL might look like this:

 ### Set model name when using a custom model

-You can also do image analysis with a custom trained model. To create and train a model, see [Create a custom Image Analysis model](/azure/ai-services/computer-vision/how-to/model-customization). Once your model is trained, all you need is the model's name. You do not need to specify visual features if you use a custom model.
+You can also do image analysis with a custom trained model. To create and train a model, see [Create a custom Image Analysis model](/azure/ai-services/computer-vision/how-to/model-customization). Once your model is trained, all you need is the model's name. You don't need to specify visual features if you use a custom model.


 To use a custom model, don't use the features query parameter. Instead, set the `model-name` parameter to the name of your model as shown here. Replace `MyCustomModelName` with your custom model name.
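A hedged sketch of what that `model-name` call can look like with `curl` (the article itself uses `curl.exe`; the endpoint shape and every placeholder below are assumptions to check against the populated URL shown earlier in the include).

```bash
# Sketch only: Image Analysis 4.0 call with a custom model.
# No 'features' query parameter is used when a custom model is specified.
# <api-version>, <endpoint>, <subscription-key>, and <image-url> are placeholders.
curl -X POST "https://<endpoint>/computervision/imageanalysis:analyze?api-version=<api-version>&model-name=MyCustomModelName" \
  -H "Ocp-Apim-Subscription-Key: <subscription-key>" \
  -H "Content-Type: application/json" \
  --data '{"url": "<image-url>"}'
```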
@@ -272,7 +272,7 @@ The service returns a `200` HTTP response, and the body contains the returned da

 ## Error codes

-On error, the Image Analysis service response contains a JSON payload that includes an error code and error message. It may also include other details in the form of and inner error code and message. For example:
+On error, the Image Analysis service response contains a JSON payload that includes an error code and error message. It might also include other details in the form of and inner error code and message. For example:

 ```json
 {
