Get a free domain for your Azure labs

Azure Infrastructure

Introduction

I’m conducting some training for Azure and need a way for students to get a domain name. In this post, I’ll show you how to get one for free and use Azure DNS to manage it.

Get the domain

Here are the steps:

    1. Head over to https://www.freenom.com
    2. Enter your desired domain name in the box. Use one of the following extensions:
      • .tk
      • .ml
      • .ga
      • .cf
      • .gq


    3. Click Checkout. The registration period can be anywhere from 1 to 12 months to keep it free; for longer periods, the cost is displayed. I’ll leave mine at the default of 3 months and press Continue.
    4. Since I don’t have an account, I can create a new one. I used the social sign-in with Google. A confirmation email will be sent.
    5. After clicking the link in my email to verify my account, I’m back at the checkout screen. Fill in your name, address, and phone number.
    6. Select the box after you’ve read the Terms and Conditions, then press Complete Order.
    7. You should be automatically logged in to the client area.

To view your domain, click Services, then My Domains; you should see the domain you just registered.

Manage it with Azure DNS

One of the limitations of Freenom’s DNS is that you can’t add wildcard records. Wildcard records are often needed to verify your domain and to make managing DNS easier. For instance, to use this domain with Azure Active Directory, you must add a TXT verification record. Let’s manage the domain with Azure DNS instead.

    1. Log in to the Azure Portal and go to DNS zones.
    2. Click Add to add a new zone.
    3. Select (or create) a Resource group and enter the instance name. The instance name is the domain you just registered.
    4. Click Review + create for validation, then Create to start the deployment. This usually takes less than a minute.
    5. When the deployment is complete, click Go to resource. The new zone is now displayed.
    6. Take note of the name servers; we’ll need to enter these at our registrar.
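If you prefer the command line, here’s a minimal sketch of the same steps using the Az PowerShell module (this assumes you’re signed in with Connect-AzAccount; the zone and resource group names are examples, not the ones from my lab):

#Create a resource group and DNS zone, then list the assigned name servers
New-AzResourceGroup -Name 'rg-dns' -Location 'eastus'
New-AzDnsZone -Name 'mydomain.tk' -ResourceGroupName 'rg-dns'
(Get-AzDnsZone -Name 'mydomain.tk' -ResourceGroupName 'rg-dns').NameServers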

Let’s tell Freenom that Azure DNS is our name server:

    1. Head back over to Freenom (https://my.freenom.com) and log in (your session may have timed out).
    2. Click Services from the top menu, then My Domains.
    3. Click Manage Domain for the domain you want to modify.
    4. In the management menu, click Management Tools and select Nameservers.
    5. On the Nameservers screen, select Use custom nameservers, then enter the name servers provided by your instance of Azure DNS.
    6. Finally, click Change Nameservers. Azure DNS is now managing DNS for your domain.

Now that Azure DNS is managing your domain, you can create DNS records for your blog, other websites, your mail domain, or anything else. Let’s add a custom domain for Azure AD:

    1. Browse to Azure Active Directory and select Custom domain names in the blade.
    2. Next, click Add custom domain and type in your domain name. Click Add domain on the new blade.
    3. Once the domain is added, the verification screen comes up.
    4. The verification screen provides the information needed to populate Azure DNS. Copy the destination value.
    5. In the Azure Portal, go to your DNS zone and click + Record set. For the Name, type the @ symbol. Change Type to TXT and paste the value from the step above into the Value field. Click OK. (If you’d rather do this in PowerShell, see the sketch after this list.)
    6. Navigate back to Azure Active Directory, Custom domain names. You should see your domain name and its status of Unverified. Click your domain.
    7. Now that you’ve added the TXT record to Azure DNS, click Verify.
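For reference, adding the TXT record from step 5 with the Az.Dns module might look like this; the MS=... value is a placeholder for whatever the verification screen gives you, and the zone and resource group names are the examples from the earlier sketch:

#Create a TXT record set at the zone apex (@) holding the Azure AD verification value
$txt = New-AzDnsRecordConfig -Value 'MS=ms12345678'   #placeholder verification value
New-AzDnsRecordSet -Name '@' -RecordType TXT -ZoneName 'mydomain.tk' `
    -ResourceGroupName 'rg-dns' -Ttl 3600 -DnsRecords $txt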

Azure attempts to verify the domain. It can take several hours or days for all of these changes to propagate. If it doesn’t work the first time, try again later (after several hours). Once it’s successful, you’ll see that the status is changed to Verified.


You can now use this domain in Azure AD as a UPN suffix. You can also set up public-facing services, such as an AD FS server, and add records to Azure DNS so that users can reach it to perform single sign-on.
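For example, a hypothetical A record pointing an AD FS endpoint at a public IP (the host name, zone, and IP below are placeholders):

#Point sts.mydomain.tk at a public IP address (203.0.113.10 is a documentation address)
$ip = New-AzDnsRecordConfig -Ipv4Address '203.0.113.10'
New-AzDnsRecordSet -Name 'sts' -RecordType A -ZoneName 'mydomain.tk' `
    -ResourceGroupName 'rg-dns' -Ttl 3600 -DnsRecords $ip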


Automating Azure DevOps Service Connections

Azure DevOps, Azure Infrastructure, DevOps, Windows PowerShell

Overview

Recently I was working on doing infrastructure deployments using Azure DevOps Pipelines. One of the first things that needs to be done is to create a Service Connection to the target environment. In my case, my target environment is an Azure Subscription and I’ll use a Service Principal with an ID and Key (versus a Certificate) for authentication. However, we want to avoid storing this authentication information and we want it automated.

I also don’t want my Service Principal to have broad privileges, so its scope will be limited to a single Resource Group. Let’s create several Resource Groups, a Service Principal for each one, assign it privileges, and create a corresponding Service Connection in Azure DevOps.


Oh, so many parameters

First, let’s define some parameters and set some variables. Most of these are self-explanatory:

#region Parameters
$cloud = "AzureCloud"
$location = 'eastus'
$tagDept = "specialprojects"
$tagEnv = "dev"
$devOpsUrl = 'https://dev.azure.com/M365x'
$devOpsProject = 'infra'
$resourceUrl = 'https://management.core.windows.net/'
$apps = @{
     'logging'="2";
     'devops'="2";
     'domain'="1";
     }
#endregion

The $apps parameter is a hashtable that will be used in the names of the Resource Groups and Service Principals. The number next to each one is part of my Resource Tags.

Next, we’re going to login while saving the context so we can gather more variables:

Clear-AzContext -Force
Save-AzContext -Profile (Add-AzAccount -Environment $cloud) -Path $env:TEMP\az.json -Force

#Get variables
$az = Get-Content -Path $env:TEMP\az.json | ConvertFrom-Json
$tenantId = $az.Contexts.Default.Tenant.TenantId
$subId = $az.Contexts.Default.Subscription.SubscriptionId
$subName = $az.Contexts.Default.Subscription.Name
$cloudEnv = $az.Contexts.Default.Environment.Name
$cloudUrl = $az.Contexts.Default.Environment.ResourceManagerUrl
$createdBy = $az.Contexts.Default.Account.Id

We also need the Access Token that we’ll use later to make a REST API call to Azure DevOps:

$ctx = Get-AzContext
$cacheItems = $ctx.TokenCache.ReadItems()
$token = ($cacheItems | where { $_.Resource -eq $resourceUrl }).AccessToken


Loop it

We’re almost ready to get to work. Since we have 3 Resource Groups defined in $apps (and could easily expand this to several dozen), we need a foreach loop. We’ll also set some additional variables:

foreach ($app in $apps.GetEnumerator())
    {
        $appName = $app.Name
        $rgName = "rg-$appName-$tagEnv"
        $spName = "sp-$appName-$tagEnv"
        $scope = "/subscriptions/$subId/resourceGroups/$rgName"
        $tags = @{
                App=$appName;
                Department=$tagDept;
                Environment=$tagEnv
                Tier=$app.Value
                CreatedBy=$createdBy
        }

These variables are just setting the names of the Resource Groups and Service Principals. If you have a different standard, modify these. Also, this is where we set the scope to the Resource Group. If you prefer a Subscription-wide scope, just remove /resourceGroups/$rgName.
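For example, a subscription-wide scope would simply be:

$scope = "/subscriptions/$subId"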

Next, we’ll create the Resource Group and Service Principal using these values:

        #Create Resource Group
        If ((Get-AzResourceGroup -Name $rgName -ErrorAction SilentlyContinue) -eq $null)
            {
            New-AzResourceGroup -Location $location -Name $rgName -Tag $tags | Out-Null
            }

        #Create Service Principal
        If ((Get-AzADServicePrincipal -DisplayName $spName) -eq $null)
            {
            #Create Service Principal and assign rights. This can take a minute.
            $sp = New-AzADServicePrincipal -DisplayName $spName -Scope $scope `
                -Role Contributor -WarningAction SilentlyContinue
            }
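
        #Note: $sp is only populated when the Service Principal was just created above;
        #an existing Service Principal's secret cannot be read back here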

        $spNameId = $sp.ServicePrincipalNames | ? {$_ -notlike "http*"} | select -First 1
        $spkey = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($sp.Secret))

Let me explain the last two variables:

  • $spNameId – This is the Service Principal ID and will look like a GUID
  • $spkey – This is the password for the Service Principal. We need it later to create the Service Connection, but after that we don’t really need to know it. It’s available in memory for a short time (until the next item in the loop, or until the session is closed), but we don’t have to store it anywhere, thus improving security.

The last thing we need to do is create the Service Connection in Azure DevOps. I’m not aware of an official PowerShell module, but there is an Azure CLI extension. The problem is that the Azure CLI has limited support for creating Service Endpoints (Service Connections), and it’s in preview. Therefore, we’ll call the REST API directly, using PowerShell:

        #Set variables for request body
        $params = @{
            data = @{
                SubscriptionId = $subId
                SubscriptionName = $subName
                environment = $cloudEnv
                scopeLevel = "Subscription"
                creationMode = "Manual"
            }
            name = $spName
            type = "azurerm"
            url = $cloudUrl
            authorization = @{
                scheme = "ServicePrincipal"
                parameters = @{
                    tenantid = $tenantId
                    serviceprincipalid = $spNameId
                    authenticationType = "spnKey"
                    serviceprincipalkey = $spKey
                }
            }
        }
        #ConvertTo-Json defaults to -Depth 2, which would truncate the nested parameters block
        $body = $params | ConvertTo-Json -Depth 5

        #Set headers and send request
        $headers = @{"Authorization" = "Bearer " + $token;"Content-Type" = "application/json"}
        $baseUri = "$devOpsUrl/$devOpsProject/_apis/serviceendpoint/endpoints?api-version=5.0-preview.2"
        $req = Invoke-RestMethod -Method POST -Uri $baseUri -Headers $headers -Body $body -ErrorAction SilentlyContinue
    }

Because that was in a foreach loop, we can easily create many Resource Groups, a Service Principal for each one, assign it Contributor rights on the Resource Group, and create a Service Connection.
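If you want to sanity-check the results, here’s a quick sketch that lists the project’s service connections with a GET against the same endpoint (reusing $headers from the loop above):

$listUri = "$devOpsUrl/$devOpsProject/_apis/serviceendpoint/endpoints?api-version=5.0-preview.2"
(Invoke-RestMethod -Method GET -Uri $listUri -Headers $headers).value | Select-Object name, type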


Conclusion

Now, we can create Pipelines that use these Service Connections to connect to Azure Resource Manager. No passwords or secrets are kept insecurely and our Service Principals are using limited rights.

I mentioned that there is no official PowerShell module for Azure DevOps; however, several community projects exist (I have not tried any of them). For more information on using Pipelines for infrastructure, check out the great posts from Barbara (4bes).

Is there a better way to do this? Got any ideas? Post a comment below.


Sharing my PowerShell Profile

Windows PowerShell

Over the course of the last several months, I’ve collected several snippets for my PowerShell profile. Your PowerShell profile is loaded each time you start PowerShell. The official documentation for it can be found here: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_profiles?view=powershell-6
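Before editing anything, it helps to see where the profile files live. The $PROFILE variable is a string with note properties for each host/user combination:

#Show all four profile paths known to the current host
$PROFILE | Select-Object AllUsersAllHosts, AllUsersCurrentHost, CurrentUserAllHosts, CurrentUserCurrentHost | Format-List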

To get started, you can easily edit your profile by typing notepad $profile. The PowerShell console and the PowerShell ISE each have their own profile, but I like mine to be the same, so here’s what I did:

  1. First, I created a profile.ps1 file in $home\Documents\WindowsPowerShell\. This is also where you’ll find the default profile files, Microsoft.PowerShell_profile.ps1 and Microsoft.PowerShellISE_profile.ps1.
  2. Second, in each of the default files, add a line to dot-source the shared profile, like so:
    . "$home\Documents\WindowsPowerShell\profile.ps1"
  3. Third, save all 3 files. Next time you open PowerShell or the ISE, it will load your profile.ps1 (which might be empty at this point).

Let’s walk through what’s in my profile.ps1. At the end, I’ll include a link so you can download it in its entirety.

Logging and PSReadLine

Here I’m setting my log path and logging everything I type so I have a full transcript of my session if I ever need it. Next, some PSReadLine settings. @roggenk has a good blog post that covers PSReadLine in some detail, so head over and check that out; I’ve got some of the same settings here that improve the user experience. I have two custom key bindings: one that starts the @vivaldibrowser browser (my favorite browser!) and switches focus to it right away, and another that performs a git commit/push.

#Logging
$PSLogPath = ("{0}{1}\Documents\WindowsPowerShell\log\{2:yyyyMMdd}-{3}.log" -f $env:HOMEDRIVE, $env:HOMEPATH,  (Get-Date), $PID)
Add-Content -Value "# $(Get-Date) $env:username $env:computername" -Path $PSLogPath -erroraction SilentlyContinue
Add-Content -Value "# $(Get-Location)" -Path $PSLogPath -erroraction SilentlyContinue

# PSReadLine Settings
Set-PSReadLineOption -HistorySearchCursorMovesToEnd
Set-PSReadlineKeyHandler -Key UpArrow -Function HistorySearchBackward
Set-PSReadlineKeyHandler -Key DownArrow -Function HistorySearchForward 
Set-PSReadlineOption -BellStyle None #Disable ding on typing error
Set-PSReadlineOption -EditMode Emacs #Make TAB key show parameter options
Set-PSReadlineKeyHandler -Key Ctrl+i -ScriptBlock { Start-Process "${env:ProgramFiles(x86)}\Vivaldi\Application\vivaldi.exe" -ArgumentList "https://www.bing.com" } #KEY: Launch the Vivaldi browser and switch focus to it

#KEY: Git, press Ctrl+Shift+G (case sensitive)
Set-PSReadlineKeyHandler -Chord Ctrl+G -ScriptBlock {
        $message = Read-Host "Please enter a commit message"
        git commit -m "$message" | Write-Host
        $branch = (git rev-parse --abbrev-ref HEAD)
        Write-Host "Pushing ${branch} to remote"
        git push origin $branch | Write-Host
}

Functions

Next, some functions. The first one just shows some nice output for the current directory size, which should be built-in! The next one checks whether the PowerShell prompt is running as an administrator; you’ll see why we need this later. I borrowed both of these from @jaredtrog, along with most of the prompt in the next section.

#Functions
function Get-DirectorySize($Path='.',$InType="MB")
{
    $colItems = (Get-ChildItem $Path -recurse | Measure-Object -property length -sum)
    switch ($InType) {
        "GB" { $ret = "{0:N2}" -f ($colItems.sum / 1GB) + " GB" }
        "MB" { $ret = "{0:N2}" -f ($colItems.sum / 1MB) + " MB" }
        "KB" { $ret = "{0:N2}" -f ($colItems.sum / 1KB) + " KB"}
        "B" { $ret = "{0:N2}" -f ($colItems.sum) + " B"}
        Default { $ret = "{0:N2}" -f ($colItems.sum) + " B" }
    }
    Return $ret
}
function Test-IsAdmin {
([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")
}
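Quick usage examples for both (assuming the profile above is loaded):

Get-DirectorySize -Path $env:TEMP -InType "MB"   #returns something like "123.45 MB"
Test-IsAdmin                                     #returns True or False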

The Prompt

Finally, here’s my prompt. You’ll see the Test-IsAdmin function being used here to set the prompt’s text color. This is really useful to know immediately if you’re running an elevated prompt.

function global:prompt {
    #Put the full path in the title bar

    $console = $host.ui.RawUI
    $console.ForegroundColor = "gray"
    $host.UI.RawUI.WindowTitle = Get-Location

    #Set text color based on admin
    if (Test-IsAdmin) {
        $userColor = 'Red'
    }
    else {
        $userColor = 'White'
    }

    #Setup command line numbers
    $LastCmd = Get-History -Count 1
    if($LastCmd)
    {
        $lastId = $LastCmd.Id
        
        Add-Content -Value "# $($LastCmd.StartExecutionTime)" -Path $PSLogPath
        Add-Content -Value "$($LastCmd.CommandLine)" -Path $PSLogPath
        Add-Content -Value "" -Path $PSLogPath
    }

    $nextCommand = $lastId + 1

    Write-Host "[$($pwd)]" -ForegroundColor "Cyan"
    Write-Host -NoNewline '[' -ForegroundColor "Gray"
    Write-Host -NoNewline "$([System.Environment]::UserName)" -ForegroundColor $userColor
    Write-Host -NoNewline '@'
    Write-Host -NoNewline "$([System.Environment]::MachineName)  $nextCommand" -ForegroundColor $userColor
    Write-Host -NoNewline ']' -ForegroundColor "Gray"
    #Use $host.EnterNestedPrompt() to test a nested prompt. 
    Write-Host -NoNewline " PS$('>' * ($nestedPromptLevel + 1)) " -ForegroundColor $userColor
    Return " "
} 

Here’s how it looks:

PowerShell Prompt

Having the folder path on its own line gives me more space to focus on the actual PowerShell command. Here’s how the Administrator prompt looks:

PowerShell Prompt as Administrator

Notice the red text. I almost forgot the nice ASCII art: just use any online ASCII art generator and output it with Write-Host.

Clear-Host
Write-Host '   _       __      __    _     __'
Write-Host '  | |     / /___ _/ /_  (_)___/ /'
Write-Host '  | | /| / / __ `/ __ \/ / __  / '
Write-Host '  | |/ |/ / /_/ / / / / / /_/ /  '
Write-Host '  |__/|__/\__,_/_/ /_/_/\__,_/   '
Write-Host '                                 ' 

That’s all! If you have any other tips, share them in the comments below.


Toying with Azure Functions – A Trade Log

Azure Infrastructure, Windows PowerShell

Serverless.

The best way for me to learn something is to use it; I’m hands-on. After being barraged by articles and hype around Azure Functions, I decided to try it out. But I’m not a developer. All the new stuff the cool kids are playing with today (IoT, Serverless, Bots, AI, etc.) requires real dev skills, and that’s just not what I do. Sorry, not sorry. So here we go with PowerShell and Azure Functions. Warning: support for PowerShell is “experimental,” and I found that I had to do a lot of workarounds.

The project

I’m going to use Azure Functions to generate a trade log. I’ve used all kinds of stock brokers (Schwab, Scottrade, Fidelity, etc.), but no platform matches the flexibility and cost of Interactive Brokers ($IBKR). Most of my trades cost $1.00, but you get what you pay for: $IBKR doesn’t have a nice, rich web interface that shows your trade history. Instead, you have to run your own reports, and luckily there’s an API for that. My function is going to maintain an up-to-date log of the trades that have been executed (a lot of my trading is automated, so I might not know when something has been bought or sold).

I’m going to focus on Functions and specifically this project. There’s a lot of great documentation on the Interactive Broker API as well as Azure Functions on their respective official documentation pages.

Setting up

So the first thing I’m going to do is create a new Azure Functions app using PowerShell and starting with the Timer trigger. Let’s call this “GetIBTrades2Blob.” We’re going to go right into the “Integrate” section and configure the following:

Triggers: a Timer trigger called “Daily0200ZZ” with a schedule of “0 0 2 * * *”. The fields are seconds, minutes, hours, day, month, and day of week, so this fires on any day of the week, month, or day at 2 hours, 0 minutes, and 0 seconds. That’s 02:00:00 daily (Zulu time), which is 9 PM Eastern Standard Time.


Inputs: Azure Blob Storage. I have a parameter named “inputBlob” with an explicit path, “trades/tradelog.csv”, and a Storage account connection. Azure Functions sets this up nicely just by going through the portal UI. One note: I struggled a lot with the path. It seems PowerShell doesn’t really support variables here (the default is {file}). I hope this will be fixed at some point, but it may just be a limitation of the language.


Output: Azure Blob Storage. This is set up almost exactly the same way as the input; we’ll just use a different parameter name with the same path and connection.


All of that just writes your function.json file for you. Next, we need to write the code in run.ps1.
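For reference, the generated function.json would look roughly like this (a sketch; the connection setting name is an assumption, and your generated file may differ):

{
  "bindings": [
    { "type": "timerTrigger", "name": "Daily0200ZZ", "direction": "in", "schedule": "0 0 2 * * *" },
    { "type": "blob", "name": "inputBlob", "direction": "in", "path": "trades/tradelog.csv", "connection": "AzureWebJobsStorage" },
    { "type": "blob", "name": "outputBlob", "direction": "out", "path": "trades/tradelog.csv", "connection": "AzureWebJobsStorage" }
  ],
  "disabled": false
}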

Code

The code isn’t really relevant for this blog post except for a few key pieces that make everything work. The code might as well just be something like:

“hello world” | Out-File -FilePath $outputBlob

But, let’s see what we’re dealing with for Interactive Brokers. First, we need two parameters, a Token Code and a QueryID (see this link). These can be retrieved from your $IBKR account. For the QueryID, I created a new one (that is, a new Flex Query) that just gets my trades for the current day (see this link).

Now that we have those, we can get the trade confirmation report for today using Invoke-WebRequest and doing some parsing.

$token = "000000000000000000000000"
$q = "123456"

$request = "https://gdcdyn.interactivebrokers.com/Universal/servlet/FlexStatementService.SendRequest?t=$token&q=$q&v=3"

$response = Invoke-WebRequest $request -UseBasicParsing

[xml]$xml = $response.Content
[string]$refCode = $xml.ChildNodes.ReferenceCode
[string]$flexUrl = $xml.ChildNodes.Url

$reqData = $flexUrl + "?q=$refCode&t=$token&v=3"
$responseData = Invoke-WebRequest $reqData -UseBasicParsing
   
$content = $responseData.Content

What will be returned here is the CSV text containing the trades returned by the query, for example:

Date/Time,Symbol,Quantity,Price,Amount,Commission,OrderType,TradeDate
20180518;134156,XYZ,-100,5.2,-94,-1,LMT,20180518
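If you’d rather work with those rows as objects instead of raw text, ConvertFrom-Csv handles the parsing (a quick sketch):

#Parse the raw CSV into objects; property names come from the header row
$trades = $content | ConvertFrom-Csv
$trades | Select-Object Symbol, Quantity, Price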

Great! But now I need to add this trade to the existing ones (from yesterday, the day before, and before that, etc.).

This is a learning exercise, so I decided to get just the current day’s trades. I realize there is a “Month to Date” Flex Query available in $IBKR. One of the things I learned through this exercise is that appending data is not easy (or possible) in Azure Functions using PowerShell. For C#, many examples exist using Append Blob (some information here on that). So, with that said, we have to do a workaround: get the existing tradelog.csv, which is exactly why we added the Azure Blob Storage input.

#Remove the header line from $content since it will already be there in the file we’re getting.
#$content is a single string, so split it into lines first, then drop the header row.
$cleanContent = ($content -split "`r?`n") | Select-Object -Skip 1

#Get the current file from blob storage
$inputArray = Get-Content -Path $inputBlob

#Append each new trade line to the existing content
foreach ($line in $cleanContent)
     {
         $inputArray += $line
     }

Out-File -Encoding ASCII -FilePath $outputBlob -InputObject $inputArray

This is pretty simple. First, we remove the heading. Then we loop through each line in the returned results, since there could have been more than one trade today. The last line is where we store the full text (including the original input) back to Azure Blob Storage. In fact, by adding additional Outputs, you can save the file in many places; for my purposes, I also have the file going to my OneDrive.

In my example, I was working with text. Working with binaries or media would be much harder because of content-type issues, unless you use C# or another fully supported language. A big thanks to jschmitter, who patiently assisted me in working all of this out.


Using Azure Policy Sets

Azure Infrastructure

At Microsoft Ignite in September 2017, Ryan Jones (@rjmax) discussed Azure Resource Manager Policies and some enhancements coming soon; see this blog post about the public preview announcement. One of those enhancements was Policy Sets, which allow you to group several policies together and assign them as a group. There’s more information at http://aka.ms/azurepolicy. In this blog post, we’re going to explore how to start using them.

Start with existing policies

In order to create a policy set, we need existing policy definitions. You can use the built-in ones, but for this example I’ve created four custom policies related to storage (a sketch for browsing the built-ins follows the list):

  1. Audit VMs that don’t use Managed Disks
  2. Deny deployment if Storage Account Blob Encryption is not enabled.
  3. Deny deployment if Storage Account File Encryption is not enabled.
  4. Deny deployment if Storage Account https-only transport (secure transfer required) is not enabled.
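If you want to start from the built-ins instead, here’s a quick way to browse them using the same AzureRM cmdlets (a sketch):

#List a few built-in policy definitions with their display names
Get-AzureRmPolicyDefinition |
    Where-Object { $_.Properties.policyType -eq 'BuiltIn' } |
    Select-Object -First 10 Name, @{n='DisplayName'; e={$_.Properties.displayName}}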

The first thing we need to do is get the Policy Definition Ids, and we need the Policy Definition Names to get those. Here, I’ve looked up the names and created an array of the Policy Definitions that I want in my new Policy Set.

$policyNames = @( "audit-managedDisks", 
"deny-NoBlobEncryption", 
"deny-NoFileEncryption", 
"deny-NoHttpsOnly" 
)

Next, I’m going to loop through those, get each Policy Definition Id, and store it in a variable called $policyDefinitionId. Since I’m looping through each Policy Definition anyway, I’m also going to construct an object called $policySetDefinition, which we’ll use later.

$Target = @()
$policyNames |
    ForEach-Object {
        $policyDefinitionId = (Get-AzureRmPolicyDefinition -Name $_ | select -ExpandProperty PolicyDefinitionId)
        $TargetObject = New-Object PSObject -Property @{policyDefinitionId=$policyDefinitionId}
        $Target += $TargetObject
    }
$policySetDefinition = $Target | ConvertTo-Json


Define and Assign the Policy Set

Next, we need to create the Policy Set Definition:

$policySetParams = @{
    Name = "policySet-Storage"
    DisplayName = "Storage: Policies to enhance security of Storage Accounts."
    Description = "This initiative contains several Storage Policies to be applied at the subscription level."
    PolicyDefinition = $policySetDefinition
}
$policySet = New-AzureRmPolicySetDefinition @policySetParams -Verbose

Notice that this command takes the $policySetDefinition we created earlier in its PolicyDefinition parameter. Now that we’ve created a Policy Set Definition, we can assign it.

For this example, I’m going to assign it to my Subscription, but I need to exclude two Resource Groups. With the new policy language, that’s pretty easy to do. We’re also going to define the Sku, an object consisting of a name and tier. I’m going with Standard here because I want to enforce this policy set on existing resources; if you only want to enforce it on new resources, set the name to A0 and the tier to Free.

$ExcludedResourceGroup1 = Get-AzureRmResourceGroup -Name "rg-aad"
$ExcludedResourceGroup2 = Get-AzureRmResourceGroup -Name "securitydata"
$sku = @{
    name = "A1"
    tier = "Standard"
}

$policyAssignmentParams = @{
    Name = "StoragePolicySetAssignment"
    DisplayName = "Storage Policy Set"
    Description = "This initiative contains several Storage Policies to be applied at the subscription level."
    PolicySetDefinition = $policySet
    Scope = "/subscriptions/{guid-of-subscription}"
    NotScope = $ExcludedResourceGroup1.ResourceId, $ExcludedResourceGroup2.ResourceId
    Sku = $sku
}
$new = New-AzureRmPolicyAssignment @policyAssignmentParams
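To confirm the assignment took effect, you can read it back (a sketch; use the same subscription scope placeholder as above):

#Verify the Policy Set assignment at the subscription scope
Get-AzureRmPolicyAssignment -Name "StoragePolicySetAssignment" -Scope "/subscriptions/{guid-of-subscription}"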


Visualize

That’s it! If you take a look at the new Policy UI, the Assignments blade lists the new Policy Set assignment, and opening the assignment shows the scope, exclusions, and sku.
