Get data from Azure storage - Microsoft Fabric

In this article, you learn how to get data from Azure storage (ADLS Gen2 container, blob container, or individual blobs) into either a new or existing table.

Prerequisites

  • A workspace with a Microsoft Fabric-enabled capacity
  • A KQL database with editing permissions
  • A storage account

Source

  1. On the lower ribbon of your KQL database, select Get Data.

    In the Get data window, the Source tab is selected.

  2. Select the data source from the available list. In this example, you're ingesting data from Azure storage.

Configure

  1. Select a target table. If you want to ingest data into a new table, select + New table and enter a table name.

    Note

    Table names can be up to 1024 characters and can include alphanumeric characters, spaces, hyphens, and underscores. Other special characters aren't supported.

  2. To add your data source, paste your storage connection string in the URI field, and then select +. Example URIs are shown after these steps. The following table lists the supported authentication methods and the permissions needed to ingest data from Azure storage.

    Authentication method | Individual blob | Blob container | Azure Data Lake Storage Gen2
    Shared Access (SAS) token | Read and Write | Read and List | Read and List
    Storage account access key | ✓ | ✓ | ✓

    Note

    • You can either add up to 10 individual blobs, or ingest up to 5000 blobs from a single container. You can't ingest both at the same time.
    • Each blob can be a max of 1 GB uncompressed.
    1. If you pasted a connection string for a blob container or an Azure Data Lake Storage Gen2, you can then add the following optional filters:

      File filters (optional):

      Setting | Field description
      Folder path | Filters data to ingest files with a specific folder path.
      File extension | Filters data to ingest files with a specific file extension only.
  3. Select Next.
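
For reference, the URI you paste in the Configure step typically has one of the following shapes, depending on whether you connect to an individual blob, a blob container, or an ADLS Gen2 file system. The placeholder names are illustrative; substitute your own account, container, path, and SAS token or account key.

    Individual blob: https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/<BlobName>?<SASToken>
    Blob container:  https://<StorageAccountName>.blob.core.windows.net/<ContainerName>?<SASToken>
    ADLS Gen2:       https://<StorageAccountName>.dfs.core.windows.net/<FileSystemName>?<SASToken>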

Inspect

The Inspect tab opens with a preview of the data.

To complete the ingestion process, select Finish.

Optionally:

  • Select Command viewer to view and copy the automatic commands generated from your inputs (a representative example follows this list).
  • Use the Schema definition file dropdown to change the file that the schema is inferred from.
  • Change the automatically inferred data format by selecting the desired format from the dropdown. For more information, see Data formats supported by Real-Time Intelligence.
  • Edit columns.
  • Explore Advanced options based on data type.
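
The commands shown in the Command viewer are standard Kusto management commands, and their exact content depends on your table, data format, and column mappings. Purely as an illustration, for a new table ingesting CSV data they look roughly like the following sketch; the table name, columns, and mapping name here are hypothetical.

    .create table MyLogs (Timestamp: datetime, Level: string, Message: string)

    .create table MyLogs ingestion csv mapping "MyLogs_mapping" '[{"Column":"Timestamp","DataType":"datetime","Properties":{"Ordinal":"0"}},{"Column":"Level","DataType":"string","Properties":{"Ordinal":"1"}},{"Column":"Message","DataType":"string","Properties":{"Ordinal":"2"}}]'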

Edit columns

Note

  • For tabular formats (CSV, TSV, PSV), you can't map a column twice. To map to an existing column, first delete the new column.
  • You can't change an existing column type. If you try to map to a column having a different format, you may end up with empty columns.

The changes you can make in a table depend on the following parameters:

  • Table type is new or existing
  • Mapping type is new or existing

Table type | Mapping type | Available adjustments
New table | New mapping | Rename column, change data type, change data source, mapping transformation, add column, delete column
Existing table | New mapping | Add column (on which you can then change data type, rename, and update)
Existing table | Existing mapping | None
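
The adjustments above are made in the UI, but adding a column to an existing table ultimately amounts to a schema change. A minimal, hypothetical sketch of the equivalent Kusto command (table and column names are illustrative):

    .alter-merge table MyLogs (SourceIp: string)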

Mapping transformations

Some data format mappings (Parquet, JSON, and Avro) support simple ingest-time transformations. To apply mapping transformations, create or update a column in the Edit columns window.

Mapping transformations can be applied to a target column of type string or datetime when the source field has data type int or long. The supported mapping transformations are:

  • DateTimeFromUnixSeconds
  • DateTimeFromUnixMilliseconds
  • DateTimeFromUnixMicroseconds
  • DateTimeFromUnixNanoseconds
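
For example, if a source JSON field carries a Unix epoch value in seconds as a long, a JSON ingestion mapping can convert it into a datetime column at ingest time. The following is a minimal sketch, with hypothetical table, column, and path names; the get-data experience generates the equivalent mapping for you when you configure the transformation in the Edit columns window.

    .create table Events ingestion json mapping "Events_mapping" '[{"Column":"EventTime","DataType":"datetime","Properties":{"Path":"$.epoch_seconds","Transform":"DateTimeFromUnixSeconds"}},{"Column":"Message","DataType":"string","Properties":{"Path":"$.message"}}]'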

Advanced options based on data type

Tabular (CSV, TSV, PSV):

  • If you're ingesting tabular formats into an existing table, you can select Advanced > Keep table schema. Tabular data doesn't necessarily include the column names that are used to map source data to the existing columns. When this option is checked, mapping is done by column order (see the sketch after this list), and the table schema remains the same. If this option is unchecked, new columns are created for the incoming data, regardless of its structure.

  • To use the first row as column names, select Advanced > First row is column header.

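By-order mapping means the first source column lands in the table's first column, the second in the second, and so on. Conceptually, it's equivalent to an ordinal-based CSV mapping along the following lines; the table, column, and mapping names are hypothetical, and the wizard creates any required mapping for you.

    .create table MyLogs ingestion csv mapping "MyLogs_by_order" '[{"Column":"Timestamp","Properties":{"Ordinal":"0"}},{"Column":"Level","Properties":{"Ordinal":"1"}},{"Column":"Message","Properties":{"Ordinal":"2"}}]'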

JSON:

  • To determine how JSON data is divided into columns, select Advanced > Nested levels, and choose a value from 1 to 100.

  • If you select Advanced > Skip JSON lines with errors, the data is ingested in json format (line-delimited) and lines with errors are skipped. If you leave this check box unselected, the data is ingested in multijson format (see the sketch that follows).

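The json/multijson choice surfaces as the format ingestion property in the generated commands. Purely as an illustration (the table, mapping, and URI placeholders are hypothetical, and the get-data experience runs the ingestion for you), a one-off ingestion command using the multijson format looks roughly like this:

    .ingest into table Events (h'https://<StorageAccountName>.blob.core.windows.net/<ContainerName>/<BlobName>?<SASToken>') with (format='multijson', ingestionMappingReference='Events_mapping')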

Summary

In the Data preparation window, all three steps are marked with green check marks when data ingestion finishes successfully. You can select a card to query the ingested data, to drop it, or to see a dashboard of your ingestion summary.

Related content

  • To manage your database, see Manage data
  • To create, store, and export queries, see Query data in a KQL queryset