3 changes: 1 addition & 2 deletions data-explorer/cluster-encryption-disk.md
Original file line number Diff line number Diff line change
@@ -28,10 +28,9 @@ Your cluster security settings allow you to enable disk encryption on your cluster
## Considerations

The following considerations apply to encryption using Azure Disk Encryption:
The following consideration applies to encryption using Azure Disk Encryption:

* Performance impact of up to a single-digit percentage
* Can't be used with sandboxes

## Related content

56 changes: 28 additions & 28 deletions data-explorer/data-factory-command-activity.md
@@ -1,17 +1,17 @@
---
title: 'Use Azure Data Explorer management commands in Azure Data Factory'
description: 'In this topic, use Azure Data Explorer management commands in Azure Data Factory'
title: Use Azure Data Explorer Management Commands in Azure Data Factory
description: In this topic, use Azure Data Explorer management commands in Azure Data Factory
ms.reviewer: tzgitlin
ms.topic: how-to
ms.date: 09/13/2023
ms.date: 02/23/2026
ms.custom: sfi-image-nochange

#Customer intent: I want to use Azure Data Explorer management commands in Azure Data Factory.
---

# Use Azure Data Factory command activity to run Azure Data Explorer management commands

[Azure Data Factory](/azure/data-factory/) (ADF) is a cloud-based data integration service that allows you to perform a combination of activities on the data. Use ADF to create data-driven workflows for orchestrating and automating data movement and data transformation. The **Azure Data Explorer Command** activity in Azure Data Factory enables you to run [Azure Data Explorer management commands](/kusto/query/index?view=azure-data-explorer&preserve-view=true#management-commands) within an ADF workflow. This article teaches you how to create a pipeline with a lookup activity and ForEach activity containing an Azure Data Explorer command activity.
[Azure Data Factory](/azure/data-factory/) (ADF) is a cloud-based data integration service that you can use to perform a combination of activities on the data. Use ADF to create data-driven workflows for orchestrating and automating data movement and data transformation. The **Azure Data Explorer Command** activity in Azure Data Factory enables you to run [Azure Data Explorer management commands](/kusto/query/index?view=azure-data-explorer&preserve-view=true#management-commands) within an ADF workflow. This article shows you how to create a pipeline with a lookup activity and ForEach activity containing an Azure Data Explorer command activity.

## Prerequisites

@@ -29,18 +29,18 @@ ms.custom: sfi-image-nochange

## Create a Lookup activity

A [lookup activity](/azure/data-factory/control-flow-lookup-activity) can retrieve a dataset from any Azure Data Factory-supported data sources. The output from Lookup activity can be used in a ForEach or other activity.
A [lookup activity](/azure/data-factory/control-flow-lookup-activity) can retrieve a dataset from any Azure Data Factory-supported data source. You can use the output from the Lookup activity in a ForEach or other activity.

1. In the **Activities** pane, under **General**, select the **Lookup** activity. Drag and drop it into the main canvas on the right.

![select lookup activity.](media/data-factory-command-activity/select-activity.png)

1. The canvas now contains the Lookup activity you created. Use the tabs below the canvas to change any relevant parameters. In **General**, rename the activity.
1. The canvas now contains the Lookup activity you created. Use the tabs under the canvas to change any relevant parameters. In **General**, rename the activity.

![edit lookup activity.](media/data-factory-command-activity/edit-lookup-activity.png)

> [!TIP]
> Click on the empty canvas area to view the pipeline properties. Use the **General** tab to rename the pipeline. Our pipeline is named *pipeline-4-docs*.
> Select the empty canvas area to view the pipeline properties. Use the **General** tab to rename the pipeline. The pipeline is named *pipeline-4-docs*.

### Create an Azure Data Explorer dataset in lookup activity

@@ -70,44 +70,44 @@ A [lookup activity](/azure/data-factory/control-flow-lookup-activity) can retrie
* Select **Name** for Azure Data Explorer linked service. Add **Description** if needed.
* In **Connect via integration runtime**, change current settings, if needed.
* In **Account selection method** select your cluster using one of two methods:
* Select the **From Azure subscription** radio button and select your **Azure subscription** account. Then, select your **Cluster**. Note the dropdown will only list clusters that belong to the user.
* Select the **From Azure subscription** radio button and select your **Azure subscription** account. Then, select your **Cluster**. The dropdown only lists clusters that belong to you.
* Alternatively, select the **Enter manually** radio button and enter your **Endpoint** (cluster URL).
* Specify the **Tenant**.
* Enter **Service principal ID**. This value can be found in the [Azure portal](https://ms.portal.azure.com/) under **App Registrations** > **Overview** > **Application (client) ID**. The principal must have the adequate permissions, according to the permission level required by the command being used.
* Enter **Service principal ID**. Find this value in the [Azure portal](https://ms.portal.azure.com/) under **App Registrations** > **Overview** > **Application (client) ID**. The principal must have adequate permissions for the permission level that the command requires.
* Select the **Service principal key** button and enter the **Service Principal Key**.
* Select your **Database** from the dropdown menu. Alternatively, select the **Edit** checkbox and enter your database name.
* Select **Test Connection** to test the linked service connection you created. If you can connect to your setup, a green checkmark **Connection successful** will appear.
* Select **Test Connection** to test the linked service connection you created. If you can connect to your setup, a green checkmark **Connection successful** appears.
* Select **Finish** to complete linked service creation.

1. Once you've set up a linked service, In **AzureDataExplorerTable** > **Connection**, add **Table** name. Select **Preview data**, to make sure that the data is presented properly.
1. After you set up a linked service, in **AzureDataExplorerTable** > **Connection**, add the **Table** name. Select **Preview data** to make sure that the data is presented properly.

Your dataset is now ready, and you can continue editing your pipeline.
Your dataset is ready, and you can continue editing your pipeline.
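
For reference, the UI steps above correspond to a linked service definition similar to the following JSON sketch. All values are placeholders, and the property names are assumed to follow the Azure Data Explorer linked service schema:

```json
{
    "name": "AzureDataExplorerLinkedService",
    "properties": {
        "type": "AzureDataExplorer",
        "typeProperties": {
            "endpoint": "https://<cluster-name>.<region>.kusto.windows.net",
            "tenant": "<tenant-id>",
            "servicePrincipalId": "<application-client-id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service-principal-key>"
            },
            "database": "<database-name>"
        }
    }
}
```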

### Add a query to your lookup activity

1. In **pipeline-4-docs** > **Settings** add a query in **Query** text box, for example:
1. In **pipeline-4-docs** > **Settings**, add a query in the **Query** text box, for example:

```kusto
ClusterQueries
| where Database !in ("KustoMonitoringPersistentDatabase", "$systemdb")
| summarize count() by Database
```

1. Change the **Query timeout** or **No truncation** and **First row only** properties, as needed. In this flow, we keep the default **Query timeout** and uncheck the checkboxes.
1. Change the **Query timeout** or **No truncation** and **First row only** properties, as needed. In this flow, keep the default **Query timeout** and uncheck the checkboxes.

![Final settings of lookup activity.](media/data-factory-command-activity/lookup-activity-final-settings.png)
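
Given the example query above, the Lookup activity output follows the `count`/`value` structure described later in this article. A hypothetical result (database names and counts are illustrative) might resemble:

```json
{
    "count": 2,
    "value": [
        { "Database": "Database1", "count_": 1024 },
        { "Database": "Database2", "count_": 256 }
    ]
}
```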

## Create a For-Each activity

The [For-Each](/azure/data-factory/control-flow-for-each-activity) activity is used to iterate over a collection and execute specified activities in a loop.
Use the [For-Each](/azure/data-factory/control-flow-for-each-activity) activity to iterate over a collection and execute specified activities in a loop.

1. Now you add a For-Each activity to the pipeline. This activity will process the data returned from the Lookup activity.
* In the **Activities** pane, under **Iteration & Conditionals**, select the **ForEach** activity and drag and drop it into the canvas.
1. Add a For-Each activity to the pipeline. This activity processes the data returned from the Lookup activity.
* In the **Activities** pane, under **Iteration & Conditionals**, select the **ForEach** activity. Drag and drop it into the canvas.
* Draw a line between the output of the Lookup activity and the input of the ForEach activity in the canvas to connect them.

![ForEach activity.](media/data-factory-command-activity/for-each-activity.png)

1. Select the ForEach activity in the canvas. In the **Settings** tab below:
1. Select the ForEach activity in the canvas. In the **Settings** tab:
* Check the **Sequential** checkbox for sequential processing of the Lookup results, or leave it unchecked for parallel processing.
* Set **Batch count**.
* In **Items**, provide the following reference to the output value:
@@ -117,12 +117,12 @@ The [For-Each](/azure/data-factory/control-flow-for-each-activity) activity is u

## Create an Azure Data Explorer Command activity within the ForEach activity

1. Double-click the ForEach activity in the canvas to open it in a new canvas to specify the activities within ForEach.
1. Double-click the ForEach activity in the canvas to open it in a new canvas, where you specify the activities within ForEach.
1. In the **Activities** pane, under **Azure Data Explorer**, select the **Azure Data Explorer Command** activity and drag and drop it into the canvas.

![Azure Data Explorer command activity.](media/data-factory-command-activity/adx-command-activity.png)

1. In the **Connection** tab, select the same Linked Service previously created.
1. In the **Connection** tab, select the same Linked Service you previously created.

![azure data explorer command activity connection tab.](media/data-factory-command-activity/adx-command-activity-connection-tab.png)

@@ -139,33 +139,33 @@ The [For-Each](/azure/data-factory/control-flow-for-each-activity) activity is u
```

The **Command** instructs Azure Data Explorer to export the results of a given query to blob storage in a compressed format. It runs asynchronously (using the `async` modifier).
The query addresses the database column of each row in the Lookup activity result. The **Command timeout** can be left unchanged.
The query addresses the database column of each row in the Lookup activity result. You can leave the **Command timeout** unchanged.

![command activity.](media/data-factory-command-activity/command.png)
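
For illustration only (this isn't the exact command from this walkthrough, which is elided above), an asynchronous export command of the kind described might look like the following, with a placeholder storage connection string and database name:

```kusto
.export async compressed to csv (
    h@"https://<storage-account>.blob.core.windows.net/<container>;<storage-key>"
) <|
ClusterQueries
| where Database == "<database-name>"
```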

> [!NOTE]
> The command activity has the following limits:
> * Size limit: 1 MB response size.
> * Time limit: 20 minutes (default), 1 hour (maximum).
> * If needed, you can append a query to the result using [AdminThenQuery](/kusto/management/index?view=azure-data-explorer&preserve-view=true#combining-queries-and-management-commands), to reduce resulting size/time.
> * If needed, you can append a query to the result using [AdminThenQuery](/kusto/management/index?view=azure-data-explorer&preserve-view=true#combining-queries-and-management-commands), to reduce resulting size or time.

1. Now the pipeline is ready. You can go back to the main pipeline view by clicking the pipeline name.
1. Now the pipeline is ready. You can go back to the main pipeline view by selecting the pipeline name.

![Azure Data Explorer command pipeline.](media/data-factory-command-activity/adx-command-pipeline.png)

1. Select **Debug** before publishing the pipeline. The pipeline progress can be monitored in the **Output** tab.
1. Select **Debug** before publishing the pipeline. You can monitor the pipeline progress in the **Output** tab.

![azure data explorer command activity output.](media/data-factory-command-activity/command-activity-output.png)

1. You can **Publish All** and then **Add trigger** to run the pipeline.
1. Select **Publish All** and then **Add trigger** to run the pipeline.

## Management command outputs

The structure of the command activity output is detailed below. This output can be used by the next activity in the pipeline.
The following section describes the structure of the command activity output. The next activity in the pipeline can use this output.

### Returned value of a non-async management command

In a non-async management command, the structure of the returned value is similar to the structure of the Lookup activity result. The `count` field indicates the number of returned records. A fixed array field `value` contains a list of records.
In a non-async management command, the structure of the returned value is similar to the structure of the Lookup activity result. The `count` field shows the number of returned records. A fixed array field `value` contains a list of records.

```json
{
@@ -187,7 +187,7 @@

### Returned value of an async management command

In an async management command, the activity polls the operations table behind the scenes, until the async operation is completed or times-out. Therefore, the returned value will contain the result of `.show operations OperationId` for that given **OperationId** property. Check the values of **State** and **Status** properties, to verify successful completion of the operation.
In an async management command, the activity polls the operations table behind the scenes until the async operation is completed or times out. Therefore, the returned value contains the result of `.show operations OperationId` for the given **OperationId** property. Check the values of the **State** and **Status** properties to verify that the operation completed successfully.

```json
{
10 changes: 5 additions & 5 deletions data-explorer/excel.md
@@ -1,9 +1,9 @@
---
title: 'Visualize a Query in Excel'
description: 'In this article, you learn how to use a query from the web UI into Excel, by exporting it directly or by using the native connector in Excel.'
title: Visualize a Query in Excel
description: In this article, you learn how to bring a query from the web UI into Excel, by exporting it directly or by using the native connector in Excel.
ms.reviewer: orspodek
ms.topic: how-to
ms.date: 02/08/2026
ms.date: 02/23/2026

# Customer intent: As a data analyst, I want to understand how to visualize my Azure Data Explorer data in Excel.
---
@@ -30,7 +30,7 @@ Export the query directly from the web UI.

:::image type="content" source="media/excel/web-ui-query-to-excel.png" alt-text="Screenshot that shows Azure Data Explorer web UI query to Open in Excel." lightbox="media/excel/web-ui-query-to-excel.png":::

The query is saved as an Excel workbook in the Downloads folder.
The query is saved as an Excel workbook in the **Downloads** folder.

1. Open the downloaded workbook to view your data. Select **Enable editing** and **Enable content** if requested in the top ribbon.

@@ -69,7 +69,7 @@ Get data from Azure Data Explorer datasource into Excel.

:::image type="content" source="media/excel/complete-sign-in.png" alt-text="Screenshot that shows the sign-in pop-up window.":::

1. In the **Navigator** pane, navigate to the correct table. In the table preview pane, select **Transform Data** to open the **Power Query Editor** and make changes to your data, or select **Load** to load it straight to Excel.
1. In the **Navigator** pane, go to the correct table. In the table preview pane, select **Transform Data** to open the **Power Query Editor** and make changes to your data, or select **Load** to load it straight to Excel.

:::image type="content" source="media/excel/navigate-table-preview-window.png" alt-text="Screenshot of the Table preview window.":::

14 changes: 7 additions & 7 deletions data-explorer/flow-usage.md
@@ -3,7 +3,7 @@ title: Usage examples for Azure Data Explorer connector to Power Automate
description: Learn some common usage examples for Azure Data Explorer connector to Power Automate.
ms.reviewer: miwalia
ms.topic: how-to
ms.date: 05/04/2022
ms.date: 02/23/2026
no-loc: [Power Automate]
ms.custom: sfi-image-nochange
---
@@ -18,14 +18,14 @@ For more information, see [Azure Data Explorer Power Automate connector](flow.md

Use the Power Automate connector to query your data and aggregate it in an SQL database.

> [!Note]
> [!NOTE]
> Only use the Power Automate connector for small amounts of output data. The SQL insert operation is done separately for each row.

:::image type="content" source="media/flow-usage/flow-sql-example.png" alt-text="Screenshot of SQL connector, showing querying data by using the Power Automate connector.":::
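
As a sketch, an aggregation query suited to this flow keeps the output small before the row-by-row SQL insertion. The table and column names here are illustrative, not part of the original walkthrough:

```kusto
StormEvents
| summarize EventCount = count() by State
| top 10 by EventCount
```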

## Push data to a Microsoft Power BI dataset

You can use the Power Automate connector with the Power BI connector to push data from Kusto queries to Power BI streaming datasets.
Use the Power Automate connector with the Power BI connector to push data from Kusto queries to Power BI streaming datasets.

1. Create a new **Run query and list results** action.
1. Select **New step**.
@@ -34,7 +34,7 @@ You can use the Power Automate connector with the Power BI connector to push dat

:::image type="content" source="media/flow-usage/flow-power-bi-connector.png" alt-text="Screenshot of Power BI connector, showing add row to a dataset action.":::

1. Enter the **Workspace**, **Dataset**, and **Table** to which data will be pushed.
1. Enter the **Workspace**, **Dataset**, and **Table** to which you want to push data.
1. From the dynamic content dialog box, add a **Payload** that contains your dataset schema and the relevant Kusto query results.

:::image type="content" source="media/flow-usage/flow-power-bi-fields.png" alt-text="Screenshot of Power BI action, showing action fields.":::
@@ -47,8 +47,8 @@ The flow automatically applies the Power BI action for each row of the Kusto que

You can use the results of Kusto queries as input or conditions for the next Power Automate actions.

In the following example, we query Kusto for incidents that occurred during the last day. For each resolved incident, a Slack message is posted and a push notification is created.
For each incident that is still active, we query Kusto for more information about similar incidents. It sends that information as an email, and opens a related task in Azure DevOps Server.
In the following example, you query Kusto for incidents that occurred during the last day. For each resolved incident, the flow posts a Slack message and creates a push notification.
For each incident that is still active, the flow queries Kusto for more information about similar incidents. It sends that information in an email and opens a related task in Azure DevOps Server.
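
A hypothetical query for the incident scenario described above might look like the following. The `Incidents` table and its columns are assumptions for illustration, not part of the original walkthrough:

```kusto
Incidents
| where Timestamp > ago(1d)
| summarize arg_max(Timestamp, Status) by IncidentId
| extend IsResolved = (Status == "Resolved")
```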

Follow these instructions to create a similar flow:

@@ -65,7 +65,7 @@ Follow these instructions to create a similar flow:
:::image type="content" source="media/flow-usage/flow-condition-actions-inline.png" alt-text="Screenshot showing adding actions for when a condition is true or false, flow conditions based on Kusto query results." lightbox="media/flow-usage/flow-condition-actions.png":::

You can use the result values from the Kusto query as input for the next actions. Select the result values from the dynamic content window.
In the following example, we add a **Slack - Post Message** action and a **Visual Studio - Create a new work item** action, containing data from the Kusto query.
In the following example, you add a **Slack - Post Message** action and a **Visual Studio - Create a new work item** action, containing data from the Kusto query.

:::image type="content" source="media/flow-usage/flow-slack.png" alt-text="Screenshot of Slack - Post Message action.":::
