Azure Log Analytics: the Basics and a Quick Tutorial (2024)

What is Azure Log Analytics?

Azure Log Analytics is a service that monitors your cloud and on-premises resources and applications. It allows you to collect and analyze data generated by resources in your cloud and on-premises environments.

You can use Azure Log Analytics to search, analyze, and visualize data to identify trends, troubleshoot issues, and monitor your systems. You can also set up alerts to notify you when specific events or issues occur, so you can take action to resolve them.

About this Explainer:

This content is part of a series about log management.

How Azure Log Analytics works

To access Azure Log Analytics, you need to sign in to the Azure portal with your Azure account. Once you’re signed in, you can access Log Analytics by selecting it from the list of services in the portal.

To use Log Analytics, you need to create a Log Analytics workspace in your Azure subscription. A workspace is a logical container for data that is collected and analyzed by Log Analytics. You can create multiple workspaces to organize data from different sources, or to use different data retention and access policies.


Some of the main features of Azure Log Analytics include:

  • Wide range of data sources: Once you have a workspace set up, you can start collecting data from your resources and applications. Log Analytics supports a wide range of data sources, including Azure resources, on-premises servers, applications, and various types of log and performance data. You can use the Log Analytics agent or other data collectors or APIs to send data to your workspace, security log repository, or SIEM.
  • Powerful query language: Log Analytics provides a powerful query language that you can use to filter, group, and aggregate data.
  • Predefined queries and solutions: Use pre-built queries and solutions to get started quickly, or create your own custom queries and solutions.
  • Monitoring and alerting: You can use Log Analytics to set up alerts that trigger when specific events or issues occur, and you can specify actions to be taken when an alert is triggered.
  • Dashboards: Create dashboards to display real-time and historical data from your resources and applications.

Quick tutorial: Exploring the Azure Log Analytics demo environment

This tutorial uses the Log Analytics demo environment, which includes sample data that you can use to explore this service’s capabilities and learn how to use it.

Open Log Analytics

To open Log Analytics using the Log Analytics demo environment, follow these steps:

  1. Go to the Log Analytics demo environment website.
  2. Click on the Sign in button in the top right corner of the page.
  3. Enter your email address and click Next.
  4. Enter your password and click Sign in.
  5. After signing in, you will be taken to the Log Analytics dashboard. From here, you can access all of the features of Log Analytics, including search, analysis, and visualization.

To get started with Log Analytics, you can try running some of the predefined queries and solutions, or you can create your own custom queries and solutions using the Log Analytics query language. You can also set up alerts and create dashboards to monitor your data in real-time.

Note that the Log Analytics demo environment is a simulated environment and is not connected to any real data sources. It is intended for demonstration and learning purposes only. To use Log Analytics with your own data, you will need to set up a Log Analytics workspace in your Azure subscription.


View table information

To view table information in Azure Log Analytics using the Log Analytics demo environment, follow these steps:

  1. Go to the Log Analytics dashboard by clicking on the Dashboard button in the top menu.
  2. On the dashboard page, click on the Tables tab in the left menu. This will display a list of all of the tables in the demo environment.
  3. To view the contents of a table, click on the name of the table in the list. This will open the table viewer, which allows you to view and analyze the data in the table.

The table viewer includes a number of features for searching, filtering, and visualizing the data. You can use the search box at the top of the table viewer to enter a query to filter the data, or you can use the column filters to narrow down the data by specific values.

You can also use the table viewer to create charts and other visualizations of the data. To do this, click on the “Visualize” button in the top menu, and then select the type of visualization you want to create. You can then use the options in the visualization editor to customize the visualization as needed.
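
The demo environment is read-only in the portal, but if you want to pull the same kind of table contents programmatically from a workspace you own, the following is a minimal sketch using the azure-monitor-query Python SDK. The workspace ID, time range, and the Heartbeat table are assumptions; substitute any table that exists in your workspace.

```python
# pip install azure-identity azure-monitor-query
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Roughly what the table viewer shows: the first few rows of a table.
response = client.query_workspace(
    workspace_id="<your-workspace-id>",   # workspace GUID from the workspace overview page
    query="Heartbeat | take 10",          # any table in your workspace works here
    timespan=timedelta(hours=24),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        print(table.columns)              # column names
        for row in table.rows:
            print(row)
```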

Write a query

To write a query in Azure Log Analytics using the Log Analytics demo environment, follow these steps:

  1. Go to the Log Analytics dashboard by clicking on the Dashboard button in the top menu.
  2. On the dashboard page, click on the Logs tab in the left menu.
  3. In the search box at the top of the page, enter your query using the Log Analytics query language.
  4. Press the Enter key or click the Run button to execute the query.

The Log Analytics query language is a powerful and flexible way to search and analyze data in Log Analytics. It includes a variety of operators and functions that you can use to filter, group, and aggregate data.
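
To make those operators concrete, here is a hedged sketch of a query that filters with "where", groups with "summarize ... by", and sorts with "order by", run through the same Python SDK. The AzureActivity table and its columns are common defaults but are assumptions; adjust them to match your data, or paste the query text alone into the Logs search box in the portal.

```python
# pip install azure-identity azure-monitor-query
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Filter with `where`, group with `summarize ... by`, sort with `order by`.
kql = """
AzureActivity
| where ActivityStatusValue == "Failure"
| summarize FailedOperations = count() by OperationNameValue
| order by FailedOperations desc
"""

response = client.query_workspace("<your-workspace-id>", kql, timespan=timedelta(days=7))
for table in response.tables:
    for row in table.rows:
        print(row)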

Analyze results

To analyze query results, you can use the options in the table or visualization to filter, group, and aggregate the data. For example, if you ran a query that returned a table of log entries, you might want to group the results by a specific column or apply a filter to show only certain rows. To do this, you can use the column filters and the search box at the top of the table to narrow down the data.

If you ran a query that returned a visualization, you can use the options in the visualization editor to customize the visualization. For example, you might want to change the data being displayed, or apply filters to the data.
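
If you prefer to analyze results outside the portal, one option (assuming pandas is installed) is to load a result table into a DataFrame and filter or group it locally. The table and column names below are illustrative.

```python
# pip install azure-identity azure-monitor-query pandas
from datetime import timedelta

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    "<your-workspace-id>",
    "AzureActivity | project TimeGenerated, OperationNameValue, ActivityStatusValue",
    timespan=timedelta(days=1),
)

for table in response.tables:
    # Load the result table into a DataFrame, then group and aggregate locally.
    df = pd.DataFrame(data=table.rows, columns=table.columns)
    print(df.groupby("OperationNameValue").size().sort_values(ascending=False).head())
```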

Work with charts

To work with charts in Azure Log Analytics, follow these steps:

  1. Go to the Log Analytics dashboard by clicking on the Dashboard button in the top menu.
  2. On the dashboard page, click on the Logs tab in the left menu.
  3. To create a chart, click on the Visualize button in the top menu and select the type of chart you want to create.
  4. In the visualization editor, select the data you want to use for the chart by entering a query in the search box.
  5. Use the options in the visualization editor to customize the chart as needed. For example, you can change the axis labels, add data labels, or change the colors of the data series.
  6. When you’re finished customizing the chart, click the Save button to save the chart to the dashboard.

To view a saved chart, click on the Dashboards tab in the left menu and select the dashboard that contains the chart. The chart will be displayed on the dashboard page, and you can use the options in the chart to interact with it.
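
Charts are usually driven by a time-binned aggregation. As an illustration only, the sketch below runs such a query and plots the result locally with matplotlib; in the portal, the same query shape drives the chart you save to a dashboard. The table and column names are assumptions.

```python
# pip install azure-identity azure-monitor-query pandas matplotlib
from datetime import timedelta

import matplotlib.pyplot as plt
import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# A time-binned aggregation -- the same shape of query that drives a portal chart.
response = client.query_workspace(
    "<your-workspace-id>",
    "Heartbeat | summarize Computers = dcount(Computer) by bin(TimeGenerated, 1h)",
    timespan=timedelta(days=1),
)

table = response.tables[0]
df = pd.DataFrame(data=table.rows, columns=table.columns)
df.plot(x="TimeGenerated", y="Computers", kind="line", title="Reporting computers per hour")
plt.show()
```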

Azure Log Analytics best practices

Use as few Log Analytics workspaces as possible

It is generally recommended to use as few Log Analytics workspaces as possible for a few reasons:

  • Cost: Each Log Analytics workspace incurs a separate charge based on the amount of data ingested and stored in the workspace. By using fewer workspaces, you can potentially reduce your costs.
  • Simplicity: Using fewer workspaces can make it easier to manage your data and queries. Instead of having to switch between multiple workspaces, you can keep all of your data and queries in a single workspace.
  • Data retention: Each workspace has its own data retention policy, which determines how long data is kept in the workspace. By using fewer workspaces, you can potentially simplify your data retention policies.
  • Data access: Each workspace has its own set of users and access controls. By using fewer workspaces, you can potentially simplify your access controls and make it easier to manage user access to your data.

However, there may be cases where it makes sense to use multiple workspaces. For example, if you have different teams working on separate projects, or if you have different data retention and access requirements for different sets of data, you might want to use separate workspaces to keep the data separate.

Use role-based access controls (RBAC)

RBAC allows you to control access to your Log Analytics workspace and the resources it uses based on the roles that you assign to users and groups. By using RBAC, you can ensure that users have the permissions they need to do their jobs, while also protecting your data and resources from unauthorized access.

There are several built-in roles in Azure that you can use to grant access to Log Analytics, including the Log Analytics Reader role and the Log Analytics Contributor role. You can also create custom roles with specific permissions to meet your specific needs.

It’s a good idea to carefully consider which roles to assign to users, and to review and update the roles as needed to ensure that they are appropriate for the user’s responsibilities.
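
If you prefer to script role assignments, the following is a rough sketch (not from the article) of granting the built-in Log Analytics Reader role at workspace scope with the azure-mgmt-authorization SDK. The scope string, principal ID, and model shapes are assumptions to verify against the current SDK documentation.

```python
# pip install azure-identity azure-mgmt-authorization
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Look up the built-in role definition by name rather than hard-coding its GUID.
reader_role = next(
    iter(client.role_definitions.list(scope, filter="roleName eq 'Log Analytics Reader'"))
)

client.role_assignments.create(
    scope=scope,
    role_assignment_name=str(uuid.uuid4()),
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=reader_role.id,
        principal_id="<azure-ad-object-id>",   # user or group to grant access to
    ),
)
```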

Consider ‘table level’ retention

By default, Log Analytics has a global data retention policy that applies to all data in the workspace. However, you can also set specific retention policies for individual tables in the workspace. This is known as “table level” retention.

Table level retention can be useful in a few cases:

  • Different data retention requirements: Different types of data may have different retention requirements. By setting specific retention policies for different tables, you can ensure that data is kept for as long as it is needed, while also reducing the overall amount of data that is stored in the workspace.
  • Data compliance: In some cases, you may be required to keep data for a specific period of time to meet compliance requirements. By setting table level retention policies, you can ensure that you meet these requirements without keeping unnecessary data in the workspace.
  • Cost savings: Storing data in Log Analytics incurs a cost based on the amount of data ingested and stored. By setting specific retention policies for different tables, you can potentially reduce your costs by keeping only the data that is needed.
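
Table-level retention can be set in the portal or through the Tables ARM API. The sketch below calls that REST endpoint directly from Python; the resource names and the api-version are assumptions to check against the current API reference.

```python
# pip install azure-identity requests
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"
url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
    "/tables/AzureActivity"                      # the table whose retention you want to change
    "?api-version=2022-10-01"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

# retentionInDays = interactive retention; totalRetentionInDays includes archive.
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"properties": {"retentionInDays": 30, "totalRetentionInDays": 365}},
)
resp.raise_for_status()
print(resp.json()["properties"]["retentionInDays"])
```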

Use ARM templates to automatically deploy VMs

Azure Resource Manager (ARM) templates are JSON files that define the infrastructure and resources for your Azure solutions. They allow you to automate the deployment and management of your resources, including virtual machines (VMs).

Using ARM templates to deploy your VMs has several benefits when using Log Analytics:

  • Consistency: ARM templates allow you to define the exact configuration of your VMs, including the operating system, hardware, and software. This can help to ensure that your VMs are deployed consistently across your environment.
  • Version control: ARM templates can be stored in source control, which allows you to track changes to your templates and roll back to previous versions if needed. This can be useful for managing the configuration of your VMs over time.
  • Automation: ARM templates can be used in continuous integration and continuous deployment (CI/CD) pipelines as part of your software development lifecycle, which allows you to automate the deployment and management of your VMs.
  • Reusability: ARM templates can be reused to deploy similar VMs in other environments or subscriptions, which can save time and reduce the risk of errors.
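
As a rough illustration, the sketch below deploys an ARM template from a local file with the azure-mgmt-resource SDK. The template path, parameter names, and resource names are placeholders for your own template.

```python
# pip install azure-identity azure-mgmt-resource
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import DeploymentMode

subscription_id = "<subscription-id>"
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

with open("vm-template.json") as f:              # your ARM template file
    template = json.load(f)

deployment = client.deployments.begin_create_or_update(
    "<resource-group>",
    "vm-deployment-001",
    {
        "properties": {
            "template": template,
            "parameters": {"vmName": {"value": "monitored-vm-01"}},
            "mode": DeploymentMode.incremental,
        }
    },
).result()

print(deployment.properties.provisioning_state)
```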

Security log management with Exabeam

Managing cloud security can be a challenge, particularly as your data, resources, and services grow. Misconfiguration and lack of visibility are frequently exploited in data and system breaches. Both issues are more likely to occur without centralized tools.

Azure Log Analytics dashboards and services may be enough to provide basic visibility for specific development or DevOps teams. However, most organizations need more advanced security measures and have dedicated teams that monitor security as a whole rather than specific tools or individual IaaS/PaaS platforms like Azure. Logging in to multiple interfaces is not an effective or efficient way to get a holistic view of events in your environment.

Log Analytics solutions are therefore combined with SIEMs and user and entity behavior analytics (UEBA) tools. UEBA tools create baselines of “normal” activity and can identify and alert on activity that deviates from the baseline.

Security Log Management via a SIEM or UEBA (or both in one, as in Exabeam Fusion) benefits cloud management by:

  • Providing centralized monitoring – dispersed systems can be a challenge to monitor, as you may have individual dashboards and portals for each service. Log Analytics can alert you to suspicious or policy-breaking behavior that you might otherwise miss in standalone dashboards.
  • Creating visibility in multi-cloud and hybrid systems – cloud-specific services may not be extendable to on-premises resources and vice versa. Log Analytics can help you ensure that policies and configurations are consistent across environments, for example by monitoring data use and transfer in hybrid storage services.
  • Helping you evaluate and prove compliance standards – Log Analytics can provide trackable, unified logging with evidence of actions taken. You can use Log Analytics logging and event tracking in compliance audits and certifications.
  • Scaling to match your system needs – Log Analytics often uses daemons or agents to monitor distributed systems. These agents allow you to scale Log Analytics to match your environment size. You can take advantage of the scalability of any tools you use by accepting and incorporating data streams from tools across your system.
  • Combining signals from Azure Log Analytics with other cloud security tools and logs, such as cloud access security brokers (CASB), data loss prevention (DLP), and Azure Active Directory Federation Services (AD FS), in a single platform like Exabeam can help build a full timeline of events and gather other associated alerts or actions that could indicate lateral movement from cloud to remote to on-premises systems.

FAQs

What is the Azure Log Analytics solution?

Log Analytics is a tool in the Azure portal that's used to edit and run log queries against data in the Azure Monitor Logs store. You might write a simple query that returns a set of records and then use features of Log Analytics to sort, filter, and analyze them.

How do I analyze logs in Azure?

Open Log Analytics

If you select Logs from an Azure resource's menu, the scope is set to only records from that resource. For details about the scope, see Log query scope. You can view the scope in the upper-left corner of the Logs experience, below the name of your active query tab.

How long does Azure Log Analytics retain data?

By default, Application Insights and Log Analytics have a data retention period of 90 days. You can opt to extend the retention up to 730 days.

Is Log Analytics deprecated?

The legacy Log Analytics agent will be deprecated by August 2024. After this date, Microsoft will no longer provide any support for the Log Analytics agent.

What is the difference between Azure Log Analytics and Azure Monitor Logs?

In conclusion, Azure Monitor and Log Analytics collectively offer a robust solution for monitoring Azure resources. While Azure Monitor provides a lot of features including aggregation of logs, real-time insights and performance metrics, Log Analytics allows advanced query capabilities and extensive log data analysis.

What language does Azure Log Analytics use?

Azure Monitor Logs is based on Azure Data Explorer, and log queries are written by using the same Kusto Query Language (KQL). This rich language is designed to be easy to read and author, so you should be able to start writing queries with some basic guidance.

How do I read data from Azure Log Analytics?

To access Azure Log Analytics, you need to sign in to the Azure portal with your Azure account. Once you're signed in, you can access Log Analytics by selecting it from the list of services in the portal. To use Log Analytics, you need to create a Log Analytics workspace in your Azure subscription.

What is the purpose of Log Analytics?

Log analytics involves searching, analyzing, and visualizing machine data generated by your IT systems and technology infrastructure to gain operational insights.

What is the format of Azure Log Analytics?

The Azure Monitor Log Analytics API response is a JSON string that contains an array of table objects. The tables property is an array of tables that represent the query result. Each table contains name, columns, and rows properties. The name property is the name of the table.
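
For illustration, here is a hedged sketch that calls the query REST API from Python and walks the tables, columns, and rows properties described above. The endpoint and request shape follow the public API reference, but treat the details as assumptions to verify.

```python
# pip install azure-identity requests
import requests
from azure.identity import DefaultAzureCredential

workspace_id = "<your-workspace-id>"
token = DefaultAzureCredential().get_token("https://api.loganalytics.io/.default").token

resp = requests.post(
    f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query",
    headers={"Authorization": f"Bearer {token}"},
    json={"query": "Heartbeat | take 5", "timespan": "PT1H"},  # ISO 8601 duration
)
resp.raise_for_status()

# Walk the tables -> name / columns / rows structure of the JSON response.
for table in resp.json()["tables"]:
    print(table["name"])
    column_names = [col["name"] for col in table["columns"]]
    for row in table["rows"]:
        print(dict(zip(column_names, row)))
```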

How to send logs to Azure Log Analytics?

Create new table in Log Analytics workspace
  1. Go to the Log Analytics workspaces menu in the Azure portal and select Tables. ...
  2. Specify a name for the table. ...
  3. Select Create a new data collection rule to create the DCR that will be used to send data to this table. ...
  4. Select the DCR that you created, and then select Next.
Jan 2, 2024
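
Once the table and its data collection rule (DCR) exist, you can send records with the azure-monitor-ingestion Python SDK. The endpoint, DCR immutable ID, stream name, and record fields below are placeholders tied to the DCR you created in the steps above.

```python
# pip install azure-identity azure-monitor-ingestion
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

client = LogsIngestionClient(
    endpoint="https://<data-collection-endpoint>.ingest.monitor.azure.com",
    credential=DefaultAzureCredential(),
)

records = [
    {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": "web-01",
        "Message": "example custom log record",
    }
]

client.upload(
    rule_id="<dcr-immutable-id>",          # immutableId from the data collection rule
    stream_name="Custom-MyTable_CL",       # stream defined in the DCR for your custom table
    logs=records,
)
```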

Is Log Analytics in Azure free?

The default pricing for Log Analytics is a pay-as-you-go model that's based on ingested data volume and data retention. Each Log Analytics workspace is charged as a separate service and contributes to the bill for your Azure subscription.

How to configure Log Analytics in Azure?

Configure Log Analytics
  1. Sign in to the Azure portal as at least a Security Administrator and Log Analytics Contributor.
  2. Browse to Log Analytics workspaces.
  3. Select Create.
  4. On the Create Log Analytics workspace page, perform the following steps: ...
  5. Select Review + Create.
  6. Select Create and wait for the deployment.
Feb 9, 2024
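
The same workspace can also be created programmatically. The following is a minimal sketch using the azure-mgmt-loganalytics SDK; the names, location, and SKU are placeholders.

```python
# pip install azure-identity azure-mgmt-loganalytics
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient
from azure.mgmt.loganalytics.models import Workspace, WorkspaceSku

subscription_id = "<subscription-id>"
client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

workspace = client.workspaces.begin_create_or_update(
    resource_group_name="<resource-group>",
    workspace_name="my-logs-workspace",
    parameters=Workspace(
        location="eastus",
        sku=WorkspaceSku(name="PerGB2018"),   # the default pay-as-you-go pricing tier
        retention_in_days=30,
    ),
).result()

print(workspace.id)
```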

Where does Azure Log Analytics store data?

Log tables

Each Log Analytics workspace contains multiple tables in which Azure Monitor Logs stores data you collect. Azure Monitor Logs automatically creates tables required to store monitoring data you collect from your Azure environment.

What is the daily limit for Log Analytics?

Review your usage to confirm that the desired cap will be sufficient. Near the top of the window, select Daily cap. Set the toggle to ON and enter the desired daily cap. The daily cap must be set to a minimum of 0.023 GB/day.

What is the difference between Basic logs and Analytics logs?

Analytics logs should be used for high-value security data that requires scheduled monitoring and alerting. Since Basic logs have an eight-day retention period, Archive logs should be used to store Basic logs for a longer duration, increasing the scope of threat hunting when it is required.

