Tutorial: Migrate on-premises data to Azure Storage with AzCopy

AzCopy is a command-line tool for copying data to or from Azure Blob Storage, Azure Files, and Azure Table Storage by using simple commands. The commands are designed for optimal performance. You can use AzCopy to copy data between a file system and a storage account, or between storage accounts. In particular, you can use AzCopy to copy on-premises (local) data to a storage account.

In this tutorial, you learn how to:

  • Create a storage account.
  • Use AzCopy to upload all your data.
  • Modify the data for test purposes.
  • Create a scheduled task or cron job to identify new files to upload.

If you don't have an Azure subscription, create a free account before you begin.

Prerequisites

To complete this tutorial, download the latest version of AzCopy. See Get started with AzCopy.

This tutorial uses Schtasks to schedule the upload task on Windows. Linux users use the crontab command instead.

Create a storage account

To create a general-purpose v2 storage account in the Azure portal, follow these steps:

  1. Under Azure services, select Storage accounts.
  2. On the Storage Accounts page, choose + Create.
  3. On the Basics blade, select the subscription in which to create the storage account.
  4. Under the Resource group field, select your desired resource group, or create a new resource group. For more information on Azure resource groups, see Azure Resource Manager overview.
  5. Next, enter a name for your storage account. The name you choose must be unique across Azure. The name also must be between 3 and 24 characters in length, and may include only numbers and lowercase letters.
  6. Select a region for your storage account, or use the default region.
  7. Select a performance tier. The default tier is Standard.
  8. Specify how the storage account will be replicated. The default redundancy option is Geo-redundant storage (GRS). For more information about available replication options, see Azure Storage redundancy.
  9. Additional options are available on the Advanced, Networking, Data protection, and Tags blades. To use Azure Data Lake Storage, choose the Advanced blade, and then set Hierarchical namespace to Enabled. For more information, see Azure Data Lake Storage Gen2 Introduction.
  10. Select Review + Create to review your storage account settings and create the account.
  11. Select Create.
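
If you prefer the command line over the portal, the Azure CLI can create an equivalent account. The following is only a sketch; it assumes the Azure CLI is installed, that you've signed in with az login, and that the account name, resource group, and region shown are sample values you replace with your own:

az storage account create --name mystorageaccount --resource-group myResourceGroup --location eastus --sku Standard_GRS --kind StorageV2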

Create a container

The first step is to create a container, because blobs must always be uploaded into a container. A container organizes a group of blobs, much like a folder organizes files on your computer.

Follow these steps to create a container:

  1. Select the Storage accounts button from the main page, and select the storage account that you created.

  2. Select Blobs under Services, and then select Container.

Container names must start with a letter or number. They can contain only letters, numbers, and the hyphen character (-). For more rules about naming blobs and containers, see Naming and referencing containers, blobs, and metadata.
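
As an alternative to the portal, AzCopy itself can create the container. The following is a sketch that assumes sample names (mystorageaccount, mycontainer) and that AzCopy is already authorized, for example with azcopy login as described later in this tutorial:

azcopy make "https://mystorageaccount.blob.core.windows.net/mycontainer"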

Download AzCopy

Download the AzCopy V10 executable file.

Place the AzCopy file anywhere on your computer. Add the location of the file to your system path variable so that you can refer to this executable file from any folder on your computer.
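
For example, on Linux one way to make the executable available from any folder is to copy it into a directory that's already on the path. This is a sketch; the download location and target directory are assumptions:

# Copy the azcopy binary to a folder that's already on the system path
sudo cp ./azcopy /usr/local/bin/azcopy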

Authenticate with Microsoft Entra ID

First, assign the Storage Blob Data Contributor role to your identity. See Assign an Azure role for access to blob data.

Then, open a command prompt, type the following command, and press the ENTER key.

azcopy login

This command returns an authentication code and the URL of a website. Open the website, provide the code, and then choose the Next button.

Tutorial: Migrate on-premises data to Azure Storage with AzCopy (3)

A sign-in window appears. In that window, sign in to your Azure account by using your Azure account credentials. After you've signed in successfully, you can close the browser window and begin using AzCopy.
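
If your identity belongs to more than one Microsoft Entra tenant, you can tell AzCopy which tenant to sign in to. The tenant ID shown here is a placeholder:

azcopy login --tenant-id=<tenant-id>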

Upload contents of a folder to Blob storage

You can use AzCopy to upload all of the files in a folder to Blob storage on Windows or Linux. To do so, enter the following AzCopy command:

azcopy copy "<local-folder-path>" "https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>" --recursive=true
  • Replace the <local-folder-path> placeholder with the path to a folder that contains files (for example, C:\myFolder or /mnt/myFolder).

  • Replace the <storage-account-name> placeholder with the name of your storage account.

  • Replace the <blob or dfs> placeholder with blob. (If your storage account has a hierarchical namespace enabled for Azure Data Lake Storage, use dfs instead.)

  • Replace the <container-name> placeholder with the name of the container that you created.

To upload the contents of the specified directory to Blob storage recursively, specify the --recursive option. When you run AzCopy with this option, all subfolders and their files are uploaded as well.
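
For example, assuming a local folder named /mnt/myFolder, a storage account named mystorageaccount, and a container named mycontainer (sample values), the command would look like this:

azcopy copy "/mnt/myFolder" "https://mystorageaccount.blob.core.windows.net/mycontainer" --recursive=true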

Upload modified files to Blob storage

You can use AzCopy to upload files based on their last-modified time.

To try this, modify or create new files in your source directory for test purposes. Then, use the AzCopy sync command.

azcopy sync "<local-folder-path>" "https://<storage-account-name>.blob.core.windows.net/<container-name>" --recursive=true
  • Replace the <local-folder-path> placeholder with the path to a folder that contains files (for example, C:\myFolder or /mnt/myFolder).

  • Replace the <storage-account-name> placeholder with the name of your storage account.

  • Replace the <container-name> placeholder with the name of the container that you created.

To learn more about the sync command, see Synchronize files.
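
For example, with the same sample values as before, a sync command might look like this; only files that are new or have a newer last-modified time at the source are copied:

azcopy sync "/mnt/myFolder" "https://mystorageaccount.blob.core.windows.net/mycontainer" --recursive=true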

Create a scheduled task

You can create a scheduled task or cron job that runs an AzCopy command script. The script identifies and uploads new on-premises data to cloud storage at a specific time interval.

Copy the AzCopy command to a text editor, and update its parameter values to the appropriate values. Save the file as script.sh (Linux) or script.bat (Windows).

These examples assume that your folder is named myFolder, your storage account name is mystorageaccount, and your container name is mycontainer.

Note

The Linux example appends a SAS token to the container URL; you'll need to provide your own token in your command. To use Microsoft Entra authentication in cron jobs instead, set the AZCOPY_AUTO_LOGIN_TYPE environment variable appropriately.
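
For example, on an Azure VM that has a system-assigned managed identity, the script could set the variable before calling AzCopy so that no SAS token is needed. This is a sketch; MSI is one of the supported values:

# Use the VM's managed identity for authentication instead of a SAS token
export AZCOPY_AUTO_LOGIN_TYPE=MSI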

Linux example:
azcopy sync "/mnt/myfiles" "https://mystorageaccount.blob.core.windows.net/mycontainer?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-05-30T06:57:40Z&st=2019-05-29T22:57:40Z&spr=https&sig=BXHippZxxx54hQn/4tBY%2BE2JHGCTRv52445rtoyqgFBUo%3D" --recursive=true
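
A Windows counterpart for script.bat might look like the following. This is a sketch; C:\myFolder is an assumed sample path, and <SAS-token> stands for a token that you provide (or omit if you use Microsoft Entra authentication):

azcopy sync "C:\myFolder" "https://mystorageaccount.blob.core.windows.net/mycontainer?<SAS-token>" --recursive=true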

In this tutorial, Schtasks is used to create a scheduled task on Windows, and the crontab command is used to create a cron job on Linux.

Schtasks enables an administrator to create, delete, query, change, run, and end scheduled tasks on a local or remote computer. Cron enables Linux and Unix users to run commands or scripts at a specified date and time by using cron expressions.

To create a cron job on Linux, enter the following command on a terminal:

crontab -e

Then add the following entry to the crontab file and save it:

*/5 * * * * sh /path/to/script.sh

Specifying the cron expression */5 * * * * in the command indicates that the shell script script.sh should run every five minutes. You can schedule the script to run at a specific time daily, monthly, or yearly. To learn more about setting the date and time for job execution, see cron expressions.
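
On Windows, Schtasks can create an equivalent task that runs script.bat every five minutes. The task name AzCopyScript and the script path in this sketch are assumptions:

schtasks /CREATE /SC MINUTE /MO 5 /TN "AzCopyScript" /TR "C:\path\to\script.bat"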

To validate that the scheduled task/cron job runs correctly, create new files in your myFolder directory. Wait five minutes to confirm that the new files have been uploaded to your storage account. Go to your log directory to view output logs of the scheduled task or cron job.
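
For example, on Linux you could create a test file and, after the next scheduled run, list the container's contents to confirm the upload. This is a sketch that assumes the sample names above and that AzCopy is authorized (via the SAS token or azcopy login):

# Create a test file in the watched folder
touch /mnt/myfiles/testfile.txt

# After about five minutes, list the blobs in the container
azcopy list "https://mystorageaccount.blob.core.windows.net/mycontainer"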

Next steps

To learn more about ways to move on-premises data to Azure Storage and vice versa, follow this link:

  • Move data to and from Azure Storage.

For more information about AzCopy, see any of these articles:

  • Get started with AzCopy

  • Transfer data with AzCopy and blob storage

  • Transfer data with AzCopy and file storage

  • Transfer data with AzCopy and Amazon S3 buckets

  • AzCopy configuration settings

  • Optimize the performance of AzCopy

  • Find errors and resume jobs by using log and plan files in AzCopy

  • Troubleshoot problems with AzCopy v10
