Power BI Query Usage and Performance Series – Part 1 – Enabling Log Analytics, generating the KQL queries


As with most of my blog posts, this one comes from a requirement at a customer where I am consulting, which I think will help others.

The requirement was to analyse the Power BI query usage patterns of the users. The initial ask was to find out how many users were using Excel to gain access to the Power BI dataset.

I knew that I could get this using Azure Log Analytics. Not only could I find out how many users were using Excel, but I could also find out what queries they were running and how long they took.
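As a quick illustration of that first requirement, once Log Analytics is enabled (covered below), a query along these lines can count distinct users per client application. I am assuming the ApplicationName column is populated here, so treat this as a sketch rather than the exact query I used:

// Sketch: count distinct users and queries per client application (e.g. Excel)
// Assumes the ApplicationName column is populated in PowerBIDatasetsWorkspace
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(7d)
| where OperationName has "QueryEnd"
| summarize Users = dcount(ExecutingUser), Queries = count() by ApplicationName
| order by Queries desc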

And best of all, I could get all the queries running against the Power BI datasets to understand how the datasets were being used, identify any long-running queries, and potentially even more.

I was really excited to be using the Power BI Direct Lake feature; it works really well, and it was awesome to see my changes immediately available.

In this series I am going to show you all the steps I took to achieve the successful outcome with my client.

Part 1 – Enabling Log Analytics, generating the KQL queries (This blog post)

Part 2 – Storing the output in Azure Blob Storage

Part 3 – Using Microsoft Fabric Dataflow Gen2 to store the data in the Lakehouse.

Part 4 – Creating a Power BI dataset and report using Direct Lake.

Enabling Log Analytics on my Power BI Premium or Premium Per User App Workspace

One of the most important parts of this entire process is to enable Log Analytics on my app workspace.

NOTE: I had to have my App Workspace on a Power BI Premium or Premium Per User capacity to use Log Analytics.

I am not going to go through the steps on how to configure your Azure Log Analytics workspace. You can find the steps to configure it with this link: Configure Azure Log Analytics in Power BI

Once I had created and configured my Log Analytics workspace, I then went into the Power BI Service and to my app workspace.

I then clicked on Workspace settings and clicked on “Azure connections”. Next, under Azure Log Analytics, I clicked on the drop-down for the subscription where I had created the Log Analytics workspace.

I then selected the associated resource group. And finally, I selected the Log Analytics workspace I had created.

Once I had configured Azure Log Analytics, I left it for a few hours to start collecting data. Ideally this should be planned beforehand and enabled with enough time to capture enough data.
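Before building the full query, it is worth confirming that events are actually arriving. A minimal sanity check along these lines, run against the Log Analytics workspace, counts recent events by operation:

// Sanity check: confirm Power BI events are flowing into the workspace
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1h)
| summarize Events = count() by OperationName
| order by Events desc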

Creating the Log Analytics Query using Kusto Query Language (KQL)

Now that I had data being sent to Log Analytics, I needed a query to extract the data I required.

The requirement was to always get the previous day's queries from Log Analytics.

What I did to figure out the query I needed was to use the help samples database in Azure Data Explorer. This was a great way to test out KQL queries on some sample data. Here is the link below.

help.Samples | Azure Data Explorer

To get the KQL query I needed, I did some searching, which helped me create the KQL query shown below.

// Build today's date (at midnight) as a datetime, padding month and day to two digits
let year = datetime_part("Year", now());
let month = datetime_part("Month", now());
let daynum = datetime_part("Day", now());
let m = case(month < 10, strcat("0", month), tostring(month));
let d = case(daynum < 10, strcat("0", daynum), tostring(daynum));
let CurrentDate = todatetime(strcat(year, "-", m, "-", d));
// The reporting window is the full previous day
let PreviousDate = datetime_add('day', -1, CurrentDate);
PowerBIDatasetsWorkspace
| where TimeGenerated between (PreviousDate .. CurrentDate)
| where OperationName has "QueryEnd"
// | where DurationMs > 500  // optional: only keep queries that ran longer than 500ms
| project CpuTimeMs, ArtifactName, LogAnalyticsCategory, DatasetMode, DurationMs, ExecutingUser, OperationDetailName, PowerBIWorkspaceName, TimeGenerated, EventText

UPDATE: 2023-09-04

I had to update the query above to put in the leading zero for the day number.

What the above query does is get the data for the previous day, with the associated columns of data I required.

I did find this useful answer from Stack Overflow which allowed me to get the query that I needed.

azure data explorer – Kusto query – how to get beginning datetime of current month – Stack Overflow
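As an aside, the same previous-day window can be expressed more compactly with KQL's built-in startofday() function. This is an equivalent sketch rather than the query I used above:

// Equivalent sketch: previous full day using startofday() instead of string building
let CurrentDate = startofday(now());
let PreviousDate = datetime_add('day', -1, CurrentDate);
PowerBIDatasetsWorkspace
| where TimeGenerated between (PreviousDate .. CurrentDate)
| where OperationName has "QueryEnd"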

Someone might be thinking, “Hey, I can store my Log Analytics data for up to 2 years”. Yes, you can certainly do this by changing the retention policy in Log Analytics.

I always prefer to store log data in Azure Blob Storage; this enables me to store it for as long as required. It also allows me to manage the files using the built-in tools available with Azure Blob Storage, such as moving older blobs to different storage tiers to save on costs.

I could also then create a shortcut in a Microsoft Fabric Lakehouse to my Azure Blob Storage location (which I will do in a later blog post).

Summary

In part 1 of this blog series on Power BI query usage, I have shown you the steps I used to enable Log Analytics and the KQL query to get the data I required for the previous day.

I hope you will follow along and read the future blog posts in this series.

If there are any questions or suggestions, please let me know.


