Azure Storage Explorer is a free tool from Microsoft that allows you to work with Azure Storage data on Windows, macOS, and Linux. This topic describes how to use it to upload and download data to and from Azure Blob Storage. The tool can be downloaded from Microsoft Azure Storage Explorer.

Note

If you are using a VM that was set up with the scripts provided by the Data Science Virtual Machine in Azure, then Azure Storage Explorer is already installed on the VM.

Note

For a complete introduction to Azure Blob Storage, refer to Azure Blob Basics and Azure Blob Service.

Prerequisites

This document assumes that you have an Azure subscription, a storage account, and the corresponding storage key for that account. Before uploading/downloading data, you must know your Azure Storage account name and account key.

  • To set up an Azure subscription, see Free one-month trial.
  • For instructions on creating a storage account and for getting account and key information, see About Azure Storage accounts. Make a note of the access key for your storage account; you need this key to connect to the account with the Azure Storage Explorer tool.
  • The Azure Storage Explorer tool can be downloaded from Microsoft Azure Storage Explorer. Accept the defaults during install.

Use Azure Storage Explorer

The following steps document how to upload/download data using Azure Storage Explorer.

  1. Launch Microsoft Azure Storage Explorer.
  2. To bring up the Sign in to your account... wizard, select the Azure account settings icon, select Add an account, and enter your credentials.
  3. To bring up the Connect to Azure Storage wizard, select the Connect to Azure Storage icon.
  4. On the Connect to Azure Storage wizard, enter the access key from your Azure Storage account and then select Next.
  5. Enter your storage account name in the Account name box and then select Next.
  6. The storage account added should now be displayed. To create a blob container in a storage account, right-click the Blob Containers node in that account, select Create Blob Container, and enter a name.
  7. To upload data to a container, select the target container and click the Upload button.
  8. Click on the ... to the right of the Files box, select one or multiple files to upload from the file system and click Upload to begin uploading the files.
  9. To download data, select the blob to download in the corresponding container and click Download.

Ingest blobs into Azure Data Explorer by subscribing to Event Grid notifications

Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. Azure Data Explorer offers ingestion (data loading) from Event Hubs, IoT Hubs, and blobs written to blob containers.

In this article, you learn how to ingest blobs from your storage account into Azure Data Explorer using an Event Grid data connection. You'll create an Event Grid data connection that sets up an Azure Event Grid subscription. The Event Grid subscription routes events from your storage account to Azure Data Explorer via an Azure Event Hub. Then you'll see an example of the data flow throughout the system.

For general information about ingesting into Azure Data Explorer from Event Grid, see Connect to Event Grid. To create resources manually in the Azure portal, see Manually create resources for Event Grid ingestion.

Prerequisites

  • An Azure subscription. Create a free Azure account.
  • A cluster and database.
  • A storage account.
    • An Event Grid notification subscription can be set on Azure Storage accounts for BlobStorage, StorageV2, or Data Lake Storage Gen2.

Create a target table in Azure Data Explorer

Create a table in Azure Data Explorer where Event Hubs will send data. Create the table in the cluster and database prepared in the prerequisites.

  1. In the Azure portal, under your cluster, select Query.

  2. Copy the first command shown after these steps into the window and select Run to create the table (TestTable) that will receive the ingested data.

  3. Copy the second command shown below into the window and select Run to map the incoming JSON data to the column names and data types of the table (TestTable).
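
A minimal sketch of both commands follows, assuming a simple three-column JSON payload; the TimeStamp, Value, and Source columns are illustrative assumptions and should match your actual data:

    // First command: create the target table with an assumed three-column schema.
    .create table TestTable (TimeStamp: datetime, Value: string, Source: string)

    // Second command: map the incoming JSON fields to the table columns.
    .create table TestTable ingestion json mapping 'TestMapping'
        '[{"column":"TimeStamp","Properties":{"Path":"$.TimeStamp"}},{"column":"Value","Properties":{"Path":"$.Value"}},{"column":"Source","Properties":{"Path":"$.Source"}}]'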

Create an Event Grid data connection in Azure Data Explorer

Now connect the storage account to Azure Data Explorer, so that data flowing into the storage is streamed to the test table.

  1. Under the cluster you created, select Databases > TestDatabase.

  2. Select Data ingestion > Add data connection.

Data connection - Basics tab

  1. Select the connection type: Blob storage.

  2. Fill out the form with the following information:

    Setting                      | Suggested value              | Field description
    Data connection name         | test-grid-connection         | The name of the connection that you want to create in Azure Data Explorer.
    Storage account subscription | Your subscription ID         | The subscription ID where your storage account is.
    Storage account              | gridteststorage1             | The name of the storage account that you created previously.
    Event type                   | Blob created or Blob renamed | The type of event that triggers ingestion. Blob renamed is supported only for ADLSv2 storage. Supported types are Microsoft.Storage.BlobCreated and Microsoft.Storage.BlobRenamed.
    Resources creation           | Automatic                    | Defines whether you want Azure Data Explorer to create an Event Grid subscription, an Event Hub namespace, and an Event Hub for you. To create resources manually, see Manually create resources for Event Grid ingestion.
  3. Select Filter settings if you want to track specific subjects. Set the filters for the notifications as follows:

    • Prefix field is the literal prefix of the subject. Because the pattern applied is startswith, it can span multiple containers, folders, or blobs. No wildcards are allowed.
      • To define a filter on the blob container, set the field as follows: /blobServices/default/containers/[container prefix].
      • To define a filter on a blob prefix (or a folder in Azure Data Lake Gen2), set the field as follows: /blobServices/default/containers/[container name]/blobs/[folder/blob prefix]. For example, /blobServices/default/containers/test-container/blobs/data/ (using a hypothetical container named test-container) matches only blobs under its data folder.
    • Suffix field is the literal suffix of the blob, such as .json. No wildcards are allowed.
    • Case-Sensitive field indicates whether the prefix and suffix filters are case-sensitive.
    • For more information about filtering events, see Blob storage events.
  4. Select Next: Ingest properties.

Data connection - Ingest properties tab

  1. Fill out the form with the following information. Table and mapping names are case-sensitive:

    Ingest properties:

    Setting           | Suggested value     | Field description
    Table name        | TestTable           | The table you created in TestDatabase.
    Data format       | JSON                | Supported formats are Avro, CSV, JSON, MULTILINE JSON, ORC, PARQUET, PSV, SCSV, SOHSV, TSV, TXT, TSVE, APACHEAVRO, RAW, and W3CLOG. Supported compression options are Zip and GZip.
    Mapping           | TestMapping        | The mapping you created in TestDatabase, which maps incoming JSON data to the column names and data types of TestTable.
    Advanced settings | My data has headers | Ignores headers. Supported for *SV type files.

    Note

    You don't have to specify all Default routing settings. Partial settings are also accepted.

  2. Select Next: Review + Create

Data connection - Review + Create tab

  1. Review the resources that were auto-created for you and select Create.

Deployment

Wait until the deployment is completed. If your deployment fails, select Operation details next to the failed stage to get more information about the failure. Select Redeploy to try to deploy the resources again. You can alter the parameters before redeploying.

Generate sample data

Now that Azure Data Explorer and the storage account are connected, you can create sample data.

Upload blob to the storage container

We'll work with a small shell script that issues a few basic Azure CLI commands to interact with Azure Storage resources. This script does the following actions:

  1. Creates a new container in your storage account.
  2. Uploads an existing file (as a blob) to that container.
  3. Lists the blobs in the container.

You can use Azure Cloud Shell to execute the script directly in the portal.

Save the data into a file and upload it with this script:
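
A minimal sketch of such a script follows. The account name, key, container name, and file name are placeholders to replace with your own values, and rawSizeBytes should be set to your file's uncompressed size in bytes (see the note below):

    #!/bin/bash
    # Minimal sketch: create a container, upload a file as a blob, and list blobs.
    # Replace the placeholder values with your own account details.
    export AZURE_STORAGE_ACCOUNT="<storage-account-name>"
    export AZURE_STORAGE_KEY="<storage-account-key>"

    container_name="test-container"   # hypothetical container name
    file_to_upload="testfile.json"    # hypothetical local data file

    echo "Creating the container..."
    az storage container create --name "$container_name"

    echo "Uploading the file..."
    # rawSizeBytes tells Azure Data Explorer the uncompressed data size.
    az storage blob upload --container-name "$container_name" \
        --file "$file_to_upload" --name "$file_to_upload" \
        --metadata "rawSizeBytes=1024"

    echo "Listing the blobs in the container..."
    az storage blob list --container-name "$container_name" --output table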

Note

To achieve the best ingestion performance, Azure Data Explorer needs to know the uncompressed size of compressed blobs submitted for ingestion. Because Event Grid notifications contain only basic details, the size information must be communicated explicitly: set the rawSizeBytes property in the blob metadata to the uncompressed data size in bytes.

Rename blob

If you are ingesting data from ADLSv2 storage and have defined Blob renamed as the event type for the data connection, the trigger for ingestion is the renaming of a blob. To rename a blob, navigate to the blob in the Azure portal, right-click the blob, and select Rename.

Ingestion properties

You can specify the ingestion properties of the blob ingestion via the blob metadata.
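
For example, the target table, data format, and mapping for a given blob can be attached as metadata when the blob is uploaded. The sketch below assumes the kustoTable, kustoDataFormat, and kustoIngestionMappingReference metadata keys; verify these key names against the current Azure Data Explorer documentation:

    # Sketch: attach ingestion properties as blob metadata at upload time.
    # The kusto* key names are assumptions; confirm them before relying on this.
    az storage blob upload --container-name "test-container" \
        --file "testfile.json" --name "testfile.json" \
        --metadata "kustoTable=TestTable" "kustoDataFormat=json" \
        "kustoIngestionMappingReference=TestMapping"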

Note

Azure Data Explorer won't delete the blobs post ingestion. Retain the blobs for three to five days. Use Azure Blob storage lifecycle management to manage blob deletion.
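
As a sketch of the lifecycle approach, a rule like the following (saved as policy.json, with a hypothetical test-container prefix) deletes blobs five days after their last modification:

    {
      "rules": [
        {
          "enabled": true,
          "name": "delete-ingested-blobs",
          "type": "Lifecycle",
          "definition": {
            "actions": {
              "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 5 } }
            },
            "filters": {
              "blobTypes": [ "blockBlob" ],
              "prefixMatch": [ "test-container/" ]
            }
          }
        }
      ]
    }

It can then be applied with placeholder account and resource group names:

    # Sketch: apply the lifecycle rule to the storage account.
    az storage account management-policy create \
        --account-name "<storage-account-name>" \
        --resource-group "<resource-group>" \
        --policy @policy.json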

Review the data flow

Note

Azure Data Explorer has an aggregation (batching) policy for data ingestion designed to optimize the ingestion process. By default, the policy is configured to 5 minutes. You'll be able to alter the policy at a later time if needed. In this article you can expect a latency of a few minutes.

  1. In the Azure portal, under your event grid, you see the spike in activity while the app is running.

  2. To check how many messages have made it to the database so far, run the first query shown after these steps in your test database.

  3. To see the content of the messages, run the second query shown below in your test database.

    The result set displays the content of the ingested messages.
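
Minimal versions of these queries, using the TestTable created earlier, might look like this:

    // First query: count the rows ingested so far.
    TestTable
    | count

    // Second query: return the ingested records themselves.
    TestTable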

Clean up resources

If you don't plan to use your event grid again, clean up the Event Grid Subscription, Event Hub namespace, and Event Hub that were auto-created for you, to avoid incurring costs.

  1. In the Azure portal, go to the left menu and select All resources.

  2. Search for your Event Hub Namespace and select Delete to delete it.

  3. In the Delete resources form, confirm the deletion to delete the Event Hub Namespace and Event Hub resources.

  4. Go to your storage account. In the left menu, select Events.

  5. Below the graph, select your Event Grid subscription and then select Delete to delete it.

  6. To delete your Event Grid data connection, go to your Azure Data Explorer cluster. On the left menu, select Databases.

  7. Select your database TestDatabase.

  8. On the left menu, select Data ingestion.

  9. Select your data connection test-grid-connection and then select Delete to delete it.

Next steps