Configure Hyper-Q Logging for Azure Log Analytics
Using Azure Monitor, you can collect data directly from your Azure virtual machines into a Log Analytics workspace for detailed analysis and correlation. Installing the Log Analytics VM extension for Linux allows Azure Monitor to collect data from your Hyper-Q VMs.
Prerequisites for Azure Log Analytics
Before you configure Hyper-Q to work with Azure Log Analytics, verify that the Microsoft Azure environment meets the following requirements.
Before you begin
Verify that you have installed Hyper-Q in the Microsoft Azure environment.
To learn how to install Hyper-Q, see Hyper-Q for Azure Synapse Installation and Setup.
Confirm the location of the Hyper-Q log files for your deployment.
There are default locations for log files, but these locations can be changed. If you change the location of a log file from the default location, ensure that you maintain a record of the new location. To confirm the location of the Hyper-Q log files, look at the "general".log_dir = "path" parameter in the dtm.ini configuration file. The file is located at: /opt/datometry/config/dtm.ini
You must have an Azure Log Analytics workspace.
To learn more, see the Microsoft Azure article Create a Log Analytics workspace in the Azure portal.
You must have the name of the Azure Synapse instance, the hostname, and the network port to which clients can connect.
To learn more, see the Microsoft Azure article Azure Synapse Analytics.
Identify a Maintenance Window to Perform the Configuration Tasks
Configuring Hyper-Q logging for use with Azure Log Analytics and the Telegraf Data Collector requires that you reboot the Linux operating system running in the Hyper-Q VM, and that you restart the Hyper-Q VM itself. For this reason, choose a period of time in advance during which you can perform the necessary configuration tasks without causing a disruption of service.
Disable Security-Enhanced Linux
Security-Enhanced Linux (SELinux) is a security architecture for Linux systems. The Azure Log Analytics agent does not support SELinux, so you must disable it. When SELinux is disabled, it no longer enforces policy or logs policy violations.
To disable SELinux:
Log into each Hyper-Q VM in the Azure workspace using SSH.
Open the /etc/selinux/config file in a text editor of your choice.
$ sudo vi /etc/selinux/config
Change the SELINUX value to SELINUX=disabled. (A non-interactive alternative is shown after this procedure.)
# This file controls the state of SELinux on the system.
# SELINUX= can take one of these three values:
#     enforcing - SELinux security policy is enforced.
#     permissive - SELinux prints warnings instead of enforcing.
#     disabled - No SELinux policy is loaded.
SELINUX=disabled
# SELINUXTYPE= can take one of these two values:
#     targeted - Targeted processes are protected,
#     mls - Multi Level Security protection.
SELINUXTYPE=targeted
Reboot each Hyper-Q VM.
$ sudo reboot
After the system restarts, confirm that the getenforce command returns Disabled.
$ getenforce
Disabled
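If you prefer to script the change rather than edit the file interactively, a one-line equivalent of the manual edit might look like the following (a sketch; review /etc/selinux/config afterward to confirm the change before rebooting):
$ sudo sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config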
Install the Azure Log Analytics Agent
The Log Analytics agent collects monitoring data from the guest operating system and workloads of virtual machines in Azure, other cloud providers, and on-premises machines. It sends data to a Log Analytics workspace.
To learn how to install the Log Analytics agent on the Hyper-Q VM, see the Microsoft Azure article Log Analytics virtual machine extension for Linux.
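If you prefer to install the extension from the command line rather than the portal, a sketch using the Azure CLI is shown below. The resource group, VM name, workspace ID, and workspace key are placeholders you must supply, and you should confirm the extension name and settings against the linked Microsoft article.
$ az vm extension set \
    --resource-group <resource-group> \
    --vm-name <hyperq-vm-name> \
    --name OmsAgentForLinux \
    --publisher Microsoft.EnterpriseCloud.Monitoring \
    --settings '{"workspaceId":"<workspace-id>"}' \
    --protected-settings '{"workspaceKey":"<workspace-key>"}'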
Collect Custom Log Files for Hyper-Q
You can use the Hyper-Q log files as a data source for the Log Analytics agent in Azure Monitor.
By default, Hyper-Q log files are in the directory /opt/datometry/logs. The available log files are:
default_tracer-DATE_TIME.csv contains trace messages.
error_log-DATE_TIME.csv contains critical startup messages as well as errors at run time.
system_info-DATE_TIME.csv contains global system activity messages.
At startup and at midnight, new log files are created with the current date and time as part of the filename. The filename date and time format is YYYY-MM-DDTHH-MM-SS; for example: 2021-04-23T18-32-14. Log entries in the Hyper-Q logs are always recorded in Coordinated Universal Time (UTC).
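For example, a listing of the default log directory might look like the following (the filenames shown are illustrative; your dates and times will differ):
$ ls /opt/datometry/logs
default_tracer-2021-04-23T18-32-14.csv
error_log-2021-04-23T18-32-14.csv
system_info-2021-04-23T18-32-14.csv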
Note
There are default locations for log files, but these locations can be changed. If you change the location of a log file from its default location, ensure that you maintain a record of the new location. To confirm the location of the Hyper-Q log files for your deployment, look at the "general".log_dir = "path" parameter in the dtm.ini configuration file. The file is located at: /opt/datometry/config/dtm.ini
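One quick way to check the configured location is to search dtm.ini for the parameter (a sketch; the output shown assumes the default log directory):
$ grep log_dir /opt/datometry/config/dtm.ini
"general".log_dir = "/opt/datometry/logs"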
To learn how to collect log events for the Hyper-Q VM and parse them using Azure Monitor, see the Microsoft Azure article Collect custom logs with Log Analytics agent in Azure Monitor.
The following procedure summarizes how to collect log events for the Hyper-Q VM and parse them using Azure Monitor. Refer to the Microsoft Azure documentation for more information.
To collect Hyper-Q log files for the Log Analytics agent in Azure Monitor:
In Azure, navigate to the Log Analytics Workspace you created for the Hyper-Q VM, and click Custom logs under Settings.
Click Add custom log to open the Custom Log Wizard.
Select and upload a Hyper-Q error log and click Next.
Select Timestamp for the Record delimiter and select YYYY-MM-DD HH:MM:SS as the Timestamp format.
Specify one or more Collection paths to the log files you want to collect.
For example, to specify the error log, select Linux from the Type drop-down menu, and enter the log file path /opt/datometry/logs/error_log*.csv.
After you specify the log file path, the Custom Log Wizard prompts you to name the custom log file.
For example, name the file hyper_q_error_log. Azure Monitor appends the suffix _CL to the custom log name.
Click Done, and then click Create to save the custom log definition.
(Optional) Repeat this procedure to add additional log files to Azure Monitor.
Verify that the customlog.conf file exists on the Hyper-Q VMs and that it contains references to the error log files you specified:
/etc/opt/microsoft/omsagent/workspace_id/conf/omsagent.d/customlog.conf
The omsagent.log file on the Hyper-Q VMs will have similar references:
/var/opt/microsoft/omsagent/workspace_id/log/omsagent.log
For example:
2020-08-11 15:47:35 +0000 [info]: INFO Received paths from sudo tail plugin : /opt/datometry/logs/error_log-*.csv, log_level=info
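One way to perform these checks from the shell is shown below (a sketch; replace workspace_id with your actual workspace ID):
$ sudo cat /etc/opt/microsoft/omsagent/workspace_id/conf/omsagent.d/customlog.conf
$ sudo grep error_log /var/opt/microsoft/omsagent/workspace_id/log/omsagent.log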
Note
It may take several minutes for the initial data collected by the Log Analytics agent to appear in Azure Monitor.
View the Hyper-Q VM Log Files
The log files contain information about activities and events in your Hyper-Q environment. You can use the information recorded to the log files to resolve technical issues. To learn more, see the Microsoft Azure articles Get started with log queries in Azure Monitor and Log queries in Azure Monitor.
To view the Hyper-Q VM logs:
Navigate to the Logs section of your Log Analytics workspace.
Enter the following queries to display the Hyper-Q VM logs.
These examples use hyper_q_error_log_CL and hyper_q_default_tracer_CL as the names of the custom log files. Replace these with the names of your environment's custom log files.
The hyper_q_error_log_CL Log File
hyper_q_error_log_CL
| extend CSVFields=parse_csv(RawData)
| extend Timestamp = tostring(CSVFields[0])
| extend Pid = tostring(CSVFields[1])
| extend SessionId = toint(CSVFields[2])
| extend CRID = tostring(CSVFields[3])
| extend LSN = toint(CSVFields[4])
| extend Module = tostring(CSVFields[5])
| extend Line = toint(CSVFields[6])
| extend MessageCode = tostring(CSVFields[7])
| extend LogSeverity = tostring(CSVFields[8])
| extend Message = tostring(CSVFields[9])
| extend QueryString = tostring(CSVFields[10])
The hyper_q_default_tracer_CL Log File
hyper_q_default_tracer_CL
| extend CSVFields = parse_csv(RawData)
| extend Timestamp = tostring(CSVFields[0])
| extend SessionId = toint(CSVFields[1])
| extend CRID = tostring(CSVFields[2])
| extend BRID = tostring(CSVFields[3])
| extend LSN = toint(CSVFields[4])
| extend TagCategory = tostring(CSVFields[5])
| extend TagName = tostring(CSVFields[6])
| extend TagValue = tostring(CSVFields[7])
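Building on the queries above, you can also aggregate the parsed fields. The following sketch counts error-log records by severity over the last 24 hours; it assumes the TimeGenerated column that Azure Monitor adds to custom log records and the hyper_q_error_log_CL name used in these examples.
hyper_q_error_log_CL
| where TimeGenerated > ago(24h)
| extend CSVFields = parse_csv(RawData)
| extend LogSeverity = tostring(CSVFields[8])
| summarize RecordCount = count() by LogSeverity
| order by RecordCount desc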