How to measure Azure network latency with PowerShell

It’s time for this year’s Azure Spring Clean! This article is part of that wonderful community event. Don’t forget to check out all the other contributions too.

When working with Azure, chances are that not all resources will be placed in the same Azure region. This could be because a specific resource is cheaper to run in a different region, or because the environmental footprint of the resource is smaller there. Last year I published this blog post about how you can determine the sustainability data for the different regions.

A factor to keep in mind when working across regions is latency. Depending on the workload this can matter a lot or not at all. For large amounts of data, such as backup storage, the latency is usually not very impactful: if it takes a couple of milliseconds longer to show the contents of the backup data, the service is barely affected. But for a database handling many transactions, the latency can quickly add up. Especially when actions are performed in serial those extra milliseconds compound: if, for example, a page load performs 20 sequential queries and each round trip costs an extra 50 ms, that is a full second of added wait time. Therefore it’s important to determine how much impact this will cause.

Once an application is developed, it’s possible to use tools like Application Insights to monitor the different parts of this process and determine how much time is added between steps due to network latency. But when designing the system it’s beneficial to have this data up front. Microsoft publishes this information publicly for everyone to use, but personally I find it hard to parse at times. For that reason I decided to create a tool to make this easier. You can download the PowerShell module I created from the PowerShell Gallery. After installing the module you can use it like this:
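A minimal sketch (the parameters take region ids, which are explained further down):

# Install the module from the PowerShell Gallery
Install-Module -Name AzNetworkLatency

# Request the round-trip latency between two regions
Get-AzNetworkLatency -Source "westeurope" -Destination "eastus"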

The number it returns is the round-trip latency in milliseconds for a request between these regions. In this blog post I will explain how this tool was created.

First I wanted the latency data to be easier to digest for a script, so I created a script which parses the data and runs periodically in a GitHub Action. Just as with the sustainability data, this is set up so that if there are any changes, a pull request is created to compare them and the result is stored in a public GitHub repository.
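A rough sketch of that change-detection step (the script name is hypothetical; the actual workflow uses a GitHub Action to open the pull request):

# Regenerate the data file (script name is illustrative)
.\Get-LatencyData.ps1

# Ask git whether the generated file differs from the committed version
$changes = git status --porcelain latencydata.json
if ($changes) {
    # At this point the workflow opens a pull request with the new data
    Write-Host "Latency data changed; creating a pull request."
}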

The script to get the data consists of several parts. First it retrieves the data that is needed and sets up several variables that are used in the script.

# Get the content of the Microsoft Learn article from GitHub
$page = Invoke-RestMethod -Method GET -Uri "https://raw.githubusercontent.com/MicrosoftDocs/azure-docs/refs/heads/main/articles/networking/azure-network-latency.md"

# Get a list of all Azure Regions
$regions = Invoke-RestMethod -Method GET -Uri "https://datacenters.microsoft.com/wp-json/globe/regions" | ConvertFrom-Json -AsHashtable

# Set some variables that are being used in the loop
$inTab = 0
$inTable = $false
$tableColumns = @()
$columnIndex = 0
$destination = ""
$sourceId = ""
$destId = ""
$latency = [PSCustomObject]@{
    "Sources" = @()
}

You’ll see that the Microsoft Learn article is downloaded from GitHub and a list of all regions is downloaded from Microsoft. Next, a loop is started over every line in the Microsoft Learn article. By analyzing the markdown syntax the document can be parsed.

# Loop through each line in the file
foreach ($line in ($page -split '\r?\n')) {
    # Check if the line starts with a 4th level header
    if ($line.StartsWith("####")) {
        Write-Host "Found tab $line"
        # Set the linecounter for this tab to 0
        $inTab = 0
        $inTable = $false
    }

The script detects when one of these specific headers is found. If so, it checks whether a table starts in the lines that follow ($inTab counts the lines since the header; its increment happens further down in the loop and isn’t shown in these snippets).

# Check if a table starts within the first 5 lines of the tab
    if ($inTab -lt 5 -and -not $inTable) {
        if ($line -match "((?:\| *[^|\r\n]+ *)+\|)") {
            Write-Host "Found Table Header $line"
            $inTable = $true
            $tableColumns = @()

            # Add every source to the data structure and to the column array for
            # this table; skip the first and last entries of the split as they are empty
            foreach ($source in ($line.split("|") | Select-Object -Skip 1 -SkipLast 1)) {
                $s = $source.Trim()
                if ($s -ne "Source" -and $s -ne "") {
                    Write-Host "Found Source $s"
                    # Add to the columns array
                    $tableColumns += $s

                    # Lookup the ID for the source
                    try {
                        $sourceId = ($regions | Where-Object { $_.displayName -eq $s } | Select-Object -First 1).id
                    }
                    catch {
                        Write-Warning "[WARNING] Couldn't find id for region $s"
                        $sourceId = $null
                    }
                    # Construct an id ourselves if it wasn't found
                    if (-not $sourceId) {
                        Write-Warning "[WARNING] Create own id for $s"
                        $sourceId = $s.Replace(" ", "").ToLower()
                    }

                    $latency.Sources += [PSCustomObject]@{
                        "Name"         = $s
                        "id"           = $sourceId
                        "Destinations" = @()
                    }
                }
            }

            # Move on to the next line so the header row isn't parsed again as a data row below
            continue
        }
    }

The headers of the table are parsed and stored as the different sources in the object that will eventually be saved as JSON. Because the table only shows the display name of each region, the script looks the region up in the overview that was downloaded earlier so it can also store the region’s id. This makes it easier to use in combination with other tools. Once the script has the header of the table, it parses the individual rows.

# Check if we're in a table row and if so add the destinations for the right columns
    if ($inTable) {
        $columnIndex = 0
        # Split the line into its cells and remove the first and last as they will be empty
        foreach ($dest in ($line.split("|") | Select-Object -Skip 1 -SkipLast 1)) {
            $d = $dest.Trim()
            # Skip separator cells (like ----) and empty cells
            if ($d.Replace("-", "") -eq "" -or $d -eq "") {
                continue
            }

            # The first column holds the destination name
            if ($columnIndex -eq 0) {
                $destination = $d

                # Lookup the ID for the destination
                try {
                    $destId = ($regions | Where-Object { $_.displayName -eq $d } | Select-Object -First 1).id
                }
                catch {
                    Write-Warning "[WARNING] Couldn't find id for region $d"
                    $destId = $null
                }
                # Construct an id ourselves if it wasn't found
                if (-not $destId) {
                    Write-Warning "[WARNING] Create own id for $d"
                    $destId = $d.Replace(" ", "").ToLower()
                }
            }

            # The other columns hold the latency towards the destination; store it with the proper source
            if ($columnIndex -gt 0) {
                # Find the correct source and add the destination
                ($latency.Sources | Where-Object { $_.Name -eq $tableColumns[$columnIndex - 1] }).Destinations += [PSCustomObject]@{
                    "Name"    = $destination
                    "id"      = $destId
                    "Latency" = $d
                }
            }

            # Increment the column index
            $columnIndex++
        }
    }

Here it stores, per source, the different destinations for which latency data is available in the object that will be serialized to JSON. The last part of the script then saves the result as the latencydata.json file.
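That last step isn’t shown above, but it comes down to something like this (a sketch; the exact code in the repository may differ):

# Serialize the parsed object and write it to disk as latencydata.json
$latency | ConvertTo-Json -Depth 4 | Out-File -FilePath "latencydata.json" -Encoding utf8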

Next up I created the PowerShell module. This was done with the same techniques as described in my blog post about setting up a module from scratch. For now the module consists of a single function called Get-AzNetworkLatency. This function looks up the latency number in the JSON that was created and returns it.

function Get-AzNetworkLatency {
    # Default to the offline parameter set so the function can be called with
    # just -Source and -Destination
    [CmdletBinding(DefaultParameterSetName = 'Offline')]
    [OutputType([int])]

    Param (
        [Parameter(Mandatory = $true, Position = 1, ParameterSetName = 'Online')]
        [Parameter(Mandatory = $true, Position = 1, ParameterSetName = 'Offline')]
        [String] $Source,

        [Parameter(Mandatory = $true, Position = 2, ParameterSetName = 'Online')]
        [Parameter(Mandatory = $true, Position = 2, ParameterSetName = 'Offline')]
        [String] $Destination,

        [Parameter(Mandatory = $true, Position = 3, ParameterSetName = 'Online')]
        [Switch] $Online,

        [Parameter(Mandatory = $false, Position = 3, ParameterSetName = 'Offline')]
        [Switch] $IgnoreWarning
    )

    if ($Online) {
        # Send telemetry
        Send-THEvent -ModuleName "AzNetworkLatency" -EventName "Get-AzNetworkLatency" -PropertiesHash @{Type = "Online" }

        $ldata = Invoke-RestMethod -Method GET -Uri "https://raw.githubusercontent.com/autosysops/azure_network_latency/refs/heads/main/latencydata.json"
    }
    else {
        # Send telemetry
        Send-THEvent -ModuleName "AzNetworkLatency" -EventName "Get-AzNetworkLatency" -PropertiesHash @{Type = "Offline" }

        if(-not $IgnoreWarning) {
            Write-Warning "[WARNING] Module uses embedded latency data. This data could be outdated. Use the -Online switch for the most recent data. To suppress this message use the -IgnoreWarning switch."
        }
        $ldata = $script:latencyData
    }

    # Find the source
    $s = $ldata.Sources | Where-Object { $_.id -eq $Source }

    if ($s) {
        # Find the destination
        $d = $s.Destinations | Where-Object { $_.id -eq $Destination }

        if ($d) {
            return $d.Latency
        }
        else {
            Write-Error "[ERROR] Can't find Destination $Destination. Make sure you are using the id of the destination. For West Europe that would be `"westeurope`""
        }
    }
    else {
        Write-Error "[ERROR] Can't find Source $Source. Make sure you are using the id of the source. For West Europe that would be `"westeurope`""
    }

    return -1
}
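
Calling the function then looks like this (a sketch, using the region ids):

# Use the embedded (offline) data; -IgnoreWarning suppresses the staleness warning
Get-AzNetworkLatency -Source "westeurope" -Destination "eastus" -IgnoreWarning

# Fetch the most recent data from the GitHub repository instead
Get-AzNetworkLatency -Source "westeurope" -Destination "eastus" -Online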

To make the module more usable I made sure the latency data can be accessed either from the GitHub repository or from an internal store. This allows the module to also be used in a setting where no internet access is allowed. To make sure this would work I added these lines to the build script:

# Add latency data
$text += "# Local storage of latency data"
$text += "`$script:latencyData = `'" + ((Invoke-WebRequest -Method GET -Uri "https://raw.githubusercontent.com/autosysops/azure_network_latency/refs/heads/main/latencydata.json").Content | ConvertFrom-Json | ConvertTo-Json -Compress -Depth 4) + "`' | ConvertFrom-Json"

This downloads the JSON file from the GitHub repository when the module is being built. It stores the JSON as text inside the .psm1 file and makes sure it’s converted from JSON to a variable in the script scope of the module. This way it can be used inside the functions of the module.
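The generated line in the .psm1 ends up looking roughly like this (heavily abbreviated, and the latency value is purely illustrative):

# Local storage of latency data (latency figure below is an illustrative placeholder)
$script:latencyData = '{"Sources":[{"Name":"West Europe","id":"westeurope","Destinations":[{"Name":"East US","id":"eastus","Latency":"100"}]}]}' | ConvertFrom-Json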

I added a warning when this embedded information is used, to make sure the user is aware it could be older than the online data.

And with this the module is complete. Just as with my other modules it uses telemetry to help me gather data about how it’s used, and a GitHub Action pushes it to the PowerShell Gallery.
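Publishing from that workflow essentially comes down to a single call (a sketch; the secret name is illustrative):

# Publish the built module to the PowerShell Gallery
Publish-Module -Path ./AzNetworkLatency -NuGetApiKey $env:PSGALLERY_API_KEY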

I hope this insight into how the module was created helps. Feel free to contribute to the projects or leave any suggestions. I want to expand the module, for example by allowing you to request the best region latency-wise for a specific source. If there are any other functionalities you would like, please add them as a comment on this blog post or as an issue in the GitHub repository. And once again, don’t forget to check out all the other awesome contributions to the Azure Spring Clean! If you want to know more about this and other PowerShell tricks, join my session at PSConfEU (PowerShell Conference Europe) in June 2025, where I’ll cover this and other sustainability data sources that can be used with PowerShell.