A while back, while I was hosting an Azure Monitor workshop, I was asked whether we can use Azure Monitor to monitor the licenses that an organization has in Azure Active Directory. By default, Azure Monitor can only have monitoring rules based upon metrics or events (logs) within Microsoft Azure. This means that, out of the box, there is no mechanism to monitor information or data that is stored within Microsoft 365 or Azure Active Directory.
By utilizing tools such as Logic Apps, we can retrieve information from Microsoft 365 and Azure AD through the Graph API and transfer it to Azure Monitor. This results in a mechanism that looks as follows.
The resulting Logic App workflow looks like this and is split into five different actions, as seen below.
First, we have a daily Recurrence schedule defined (it runs at 10:00 AM), as sketched below.
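For reference, a minimal sketch of how that trigger can look in the workflow definition (code view); the trigger name is an assumption, and the times map to the 10:00 AM schedule mentioned above:

"Recurrence": {
    "type": "Recurrence",
    "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "schedule": {
            "hours": [ 10 ],
            "minutes": [ 0 ]
        }
    }
}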
Then we have an HTTP action which calls out to the Graph API, issuing a GET request to the following URI: https://graph.microsoft.com/v1.0/subscribedSkus (as specified in the documentation here –> List subscribedSkus – Microsoft Graph v1.0 | Microsoft Learn). To call this endpoint, we need a service principal or managed identity which has at least one of the following permissions: “Organization.Read.All, Directory.Read.All”.
The HTTP action needs to be configured as follows (see the code-view sketch after this list):
URI: https://graph.microsoft.com/v1.0/subscribedSkus
Headers: Content-Type – application/json
Authentication type: either Managed identity or Active Directory OAuth. I recommend managed identity so that you do not have any secrets in the Logic App; you can read more about how to configure that here –> Authenticate connections with managed identities – Azure Logic Apps | Microsoft Learn
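As a reference, here is a minimal sketch of how this HTTP action can look in the workflow definition (code view), assuming a system-assigned managed identity. The action name is an assumption; the audience value is the standard one for Microsoft Graph:

"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "GET",
        "uri": "https://graph.microsoft.com/v1.0/subscribedSkus",
        "headers": {
            "Content-Type": "application/json"
        },
        "authentication": {
            "type": "ManagedServiceIdentity",
            "audience": "https://graph.microsoft.com"
        }
    },
    "runAfter": {}
}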
Since the Graph API returns all the results as JSON, we need to parse the response so that we can format the data before sending it to Log Analytics.
Hence, we have a Parse JSON action where the input is the body from the HTTP call.
When setting up the Parse JSON action, you need to have a defined schema so the action knows how it should format the content. You can use the following schema.
{
    "properties": {
        "@@odata.context": {
            "type": "string"
        },
        "value": {
            "items": {
                "properties": {
                    "appliesTo": {
                        "type": "string"
                    },
                    "capabilityStatus": {
                        "type": "string"
                    },
                    "consumedUnits": {
                        "type": "integer"
                    },
                    "id": {
                        "type": "string"
                    },
                    "prepaidUnits": {
                        "properties": {
                            "enabled": {
                                "type": "integer"
                            },
                            "suspended": {
                                "type": "integer"
                            },
                            "warning": {
                                "type": "integer"
                            }
                        },
                        "type": "object"
                    },
                    "servicePlans": {
                        "items": {
                            "properties": {
                                "appliesTo": {
                                    "type": "string"
                                },
                                "provisioningStatus": {
                                    "type": "string"
                                },
                                "servicePlanId": {
                                    "type": "string"
                                },
                                "servicePlanName": {
                                    "type": "string"
                                }
                            },
                            "required": [
                                "servicePlanId",
                                "servicePlanName",
                                "provisioningStatus",
                                "appliesTo"
                            ],
                            "type": "object"
                        },
                        "type": "array"
                    },
                    "skuId": {
                        "type": "string"
                    },
                    "skuPartNumber": {
                        "type": "string"
                    }
                },
                "required": [
                    "capabilityStatus",
                    "consumedUnits",
                    "id",
                    "skuId",
                    "skuPartNumber",
                    "appliesTo",
                    "prepaidUnits",
                    "servicePlans"
                ],
                "type": "object"
            },
            "type": "array"
        }
    },
    "type": "object"
}
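In code view, the Parse JSON action itself is just a thin wrapper around this schema. A sketch, assuming the HTTP action is named HTTP (the action names are assumptions, and the full schema above goes in place of the emptied schema property):

"Parse_JSON": {
    "type": "ParseJson",
    "runAfter": {
        "HTTP": [ "Succeeded" ]
    },
    "inputs": {
        "content": "@body('HTTP')",
        "schema": { }
    }
}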
Instead of performing all the remaining operations within the Logic App, the last step sends all the data to Log Analytics. The reason is that I find it simpler to define queries and alerting rules in Log Analytics than to perform the same tasks within Logic Apps.
Here I just use a For each loop in combination with the Log Analytics Data Collector connector (Azure Log Analytics Data Collector – Connectors | Microsoft Learn) and define a workspace and a custom table where I want the data to be stored. It should be noted that the name of a custom table in Log Analytics will always end with _CL.
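For reference, a sketch of how the loop can look in code view, assuming the Parse JSON action is named Parse_JSON and the custom log name is TABLE (which Log Analytics stores as TABLE_CL); the action and connection names will vary per environment:

"For_each": {
    "type": "Foreach",
    "foreach": "@body('Parse_JSON')?['value']",
    "runAfter": {
        "Parse_JSON": [ "Succeeded" ]
    },
    "actions": {
        "Send_Data": {
            "type": "ApiConnection",
            "inputs": {
                "host": {
                    "connection": {
                        "name": "@parameters('$connections')['azureloganalyticsdatacollector']['connectionId']"
                    }
                },
                "method": "post",
                "path": "/api/logs",
                "headers": {
                    "Log-Type": "TABLE"
                },
                "body": "@{items('For_each')}"
            }
        }
    }
}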
Once I’m done with the last configuration, I save the logic app and trigger it manually.
With the for-each setup, each SKU gets its own row in Log Analytics, so the incoming data should look something like this.
NOTE: Some SKUs appear twice since I forgot to filter on the date.
Now that I have the information available, I can write queries to see usage and to monitor available licenses for certain SKUs. For instance, if I want a query that looks at the available licenses for a particular SKU such as SPE_E5, I can use the following.
TABLE_CL
| where skuPartNumber_s contains "SPE_E5"
| summarize availablelicenses = sum(prepaidUnits_enabled_d - consumedUnits_d) by bin(TimeGenerated, 1d)
| project TimeGenerated, availablelicenses
| render barchart
This will then generate a bar chart visualizing the number of available licenses over time. However, let us say that we want to get notified if a specific SKU has 150 or fewer licenses available. Then the query could look like this:
TABLE_CL
| project consumedUnits_d, prepaidUnits_enabled_d, skuPartNumber_s
| where skuPartNumber_s contains "SPE_E5"
| extend available = (prepaidUnits_enabled_d - consumedUnits_d)
| where available <= 150
If we use this Kusto query as the basis for our monitoring rule, we can set a simple condition that runs the query once a day. If the query returns one or more results, it indicates that there are 150 or fewer licenses available for this SKU.
After that is configured, you just need to define an Action Group and set up notifications for where you want the alert to be sent.
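To tie it together, here is a minimal sketch of the alert rule as an ARM resource (Microsoft.Insights/scheduledQueryRules). The rule name, location, severity, API version, and the placeholder resource IDs for the workspace and Action Group are all assumptions; the query is the one from above:

{
    "type": "Microsoft.Insights/scheduledQueryRules",
    "apiVersion": "2021-08-01",
    "name": "low-spe-e5-licenses",
    "location": "westeurope",
    "properties": {
        "displayName": "Low SPE_E5 license count",
        "enabled": true,
        "severity": 3,
        "evaluationFrequency": "P1D",
        "windowSize": "P1D",
        "scopes": [
            "<log-analytics-workspace-resource-id>"
        ],
        "criteria": {
            "allOf": [
                {
                    "query": "TABLE_CL | where skuPartNumber_s contains \"SPE_E5\" | extend available = (prepaidUnits_enabled_d - consumedUnits_d) | where available <= 150",
                    "timeAggregation": "Count",
                    "operator": "GreaterThanOrEqual",
                    "threshold": 1,
                    "failingPeriods": {
                        "numberOfEvaluationPeriods": 1,
                        "minFailingPeriodsToAlert": 1
                    }
                }
            ]
        },
        "actions": {
            "actionGroups": [
                "<action-group-resource-id>"
            ]
        }
    }
}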