Monitor your PowerShell modules in Azure

It’s time to look at your Azure environments with the Azure Spring Clean! This awesome community project invites contributors to share content about how to get a better grip on your Azure environments.

In my contribution I want to talk about logging and monitoring of PowerShell scripts and modules. When working in Azure environments we often need some automation too, and PowerShell is a wonderful tool for this. But it can be hard to keep track of which of the scripts and modules you created are actually used inside your environment. And when you publish modules to public repositories, it comes in handy to know how often they are used.

I’ve created a dashboard for some of the modules I created to see how and when they are used. In this blog post I will show you how you can create something like this yourself and what other use cases there are for it.

For these examples I will use some code from my newest module, PowerPlatformChecker, which I am currently developing to help me with the deployment of Power Platform solutions. The code is publicly available on GitHub.

There are many ways to gather telemetry data from your projects, but an Azure-native option is Azure Application Insights. This platform is built for gathering and analyzing exactly this kind of data. You can deploy an Application Insights resource in Azure to get started. Once it’s deployed you will see a view like this:

A Azure Application Insights Overview Dashboard

Communicating with a platform like this can feel daunting when starting out, but the PowerShell community has our back. Jan-Hendrik Peters has created the TelemetryHelper module for PowerShell, which makes setting up this monitoring quite easy.

In my Initialize-Module.ps1 file (which is run when the module is loaded) I added this code:

# ===================================================================
# ================== TELEMETRY ======================================
# ===================================================================

# Create env variables
$Env:PowerPlatformChecker_TELEMETRY_OPTIN = (-not $Env:POWERSHELL_TELEMETRY_OPTOUT) # use the inverse of the default PowerShell telemetry setting

# Set up the telemetry
Initialize-THTelemetry -ModuleName "PowerPlatformChecker"
Set-THTelemetryConfiguration -ModuleName "PowerPlatformChecker" -OptInVariableName "PowerPlatformChecker_TELEMETRY_OPTIN" -StripPersonallyIdentifiableInformation $true -Confirm:$false
Add-THAppInsightsConnectionString -ModuleName "PowerPlatformChecker" -ConnectionString "InstrumentationKey=df9757a1-873b-41c6-b4a2-2b93d15c9fb1;IngestionEndpoint=https://westeurope-5.in.applicationinsights.azure.com/;LiveEndpoint=https://westeurope.livediagnostics.monitor.azure.com/"

# Create a message about the telemetry
Write-Information ("Telemetry for PowerPlatformChecker module is $(if([string] $Env:PowerPlatformChecker_TELEMETRY_OPTIN -in ("no","false","0")){"NOT "})enabled. Change the behavior by setting the value of " + '$Env:PowerPlatformChecker_TELEMETRY_OPTIN') -InformationAction Continue

# Send a metric for the installation of the module
Send-THEvent -ModuleName "PowerPlatformChecker" -EventName "Import Module PowerPlatformChecker"

What I do here is start by creating an opt-in variable, which I set to the inverse of the default PowerShell telemetry variable. Then I initialize the telemetry for the module by giving it a module name.

With Set-THTelemetryConfiguration I set up the configuration by providing the module name again, the opt-in variable and a setting to strip personal data. After that I add the connection string, which can be found on your dashboard. As my module will be used publicly, this endpoint is open for everyone to send data to.

To make sure users are aware that information is being logged, a message is written to the output to let them know. And last, I send an event to Application Insights to log that the module has been loaded.
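Users can change this behavior through the environment variable mentioned in that message. For example, to opt out of the telemetry before importing the module:

```powershell
# Opt out of telemetry for this session; the variable name matches
# the one defined in Initialize-Module.ps1
$Env:PowerPlatformChecker_TELEMETRY_OPTIN = "false"

Import-Module PowerPlatformChecker
```

Setting the variable to "no" or "0" works as well, as those are the values the opt-in message checks for.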

Personally I’m interested in when and where my module is used, which is why I log when the module is imported. Besides that, I log when cmdlets/functions are called, simply by adding a line like this at the top of the function:

# Send telemetry data
Send-THEvent -ModuleName "PowerPlatformChecker" -EventName "Test-PowerPlatformCheckerFlowOperationName"

The messages take a short while to show up in Application Insights, but once they do you can find them in the customEvents table in the Logs pane.

Where to find the Custom Events Table in Application Insights

Here you can find all the log messages that are sent.

Application Insights supports different types of logs: events, traces and more. I chose events because they allow me to easily add extra data in a structured way. For example, in the screenshot you see some cases where Find-GeoCodeLocation is called. This function has multiple options a user can pick, so I also log the chosen option so that I know which one is most popular.
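To attach that kind of structured data yourself, you can pass a hashtable of custom properties along with the event. A minimal sketch (the -PropertiesHash parameter name and the $SelectedOption variable are assumptions here to illustrate the pattern; check the TelemetryHelper documentation for the exact signature):

```powershell
# Sketch: log which option the user picked along with the event.
# $SelectedOption is a hypothetical variable holding the user's choice;
# -PropertiesHash is assumed to accept a hashtable of custom properties.
Send-THEvent -ModuleName "PowerPlatformChecker" `
    -EventName "Find-GeoCodeLocation" `
    -PropertiesHash @{ Option = $SelectedOption }
```

These properties end up in the customDimensions column of the customEvents table, where you can query them with KQL.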

Once you have this data you are almost there, because now you can create a dashboard in Azure. Say you want to have an overview of how often each of your modules is installed (like the top bar graph in my dashboard).

Go back to your Application Insights logs and select the KQL editing mode and use a query like this:
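A sketch of such a query, assuming your import events are named like mine (e.g. "Import Module PowerPlatformChecker"):

```kusto
customEvents
| where name startswith "Import Module"
| summarize Imports = count() by name
| render barchart
```

This counts the import events per module and renders them as a bar chart straight from the Logs pane.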

Once you’ve done that you can run it; it will show the data and you can then turn it into a chart.

I have changed the chart type and the X-axis to get a view I like. Once you have a view you like, you can save it and pin it to your Azure dashboard. The KQL query I use to get the information for just a specific module is:

customEvents
| extend Location = strcat(client_City, " - ", client_CountryOrRegion)
| where name == "Import Module Geocoding"

You can use this technique to capture all kinds of data. If you have internal modules you can see how often they are used. I would also recommend adding the version of the module to your data, so you can see whether old versions are still being used. By using custom event data you can also see which features are used the most, so you can focus your development on them.
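If you do send the module version as a custom property, the customDimensions column makes it easy to spot outdated installs. A sketch, assuming the property is named ModuleVersion (that name is an assumption; use whatever key you logged):

```kusto
customEvents
| where name == "Import Module PowerPlatformChecker"
| extend Version = tostring(customDimensions.ModuleVersion)
| summarize Imports = count() by Version
| order by Imports desc
```

This shows how many imports came from each version, so you can decide when it is safe to drop support for older releases.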
Overall this can help you get a better understanding of what your code is actually doing in your environment!