Sharing my PowerShell Profile

Windows PowerShell

Over the course of the last several months, I’ve collected several snippets for my PowerShell profile. Your PowerShell profile is loaded each time you start PowerShell. The official documentation for it can be found here: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_profiles?view=powershell-6

To get started, you can edit your profile by typing notepad $profile. The PowerShell console and the PowerShell ISE each have their own profile, but I like mine to be the same, so here's what I did:

  1. First, I created a profile.ps1 file in $home\Documents\WindowsPowerShell\. This is also where you’ll find the default profile files, Microsoft.PowerShell_profile.ps1 and Microsoft.PowerShellISE_profile.ps1.
  2. Second, in the default files, just add a line to redirect the profile like so:
    $profile="$home\Documents\WindowsPowerShell\profile.ps1"
  3. Third, save all three files. The next time you open PowerShell or the ISE, it will load your profile.ps1 (which might be empty at this point; see the snippet below if you still need to create it).
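
If profile.ps1 doesn't exist yet, here's a quick way to create and open it (a minimal sketch; the path matches step 1 above):

$sharedProfile = "$home\Documents\WindowsPowerShell\profile.ps1"
if (-not (Test-Path $sharedProfile)) {
    New-Item -ItemType File -Path $sharedProfile -Force | Out-Null #Create the shared profile file
}
notepad $sharedProfile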

Let’s walk through what’s in my profile.ps1. At the end, I’ll include a link so you can download it in its entirety.

Logging and PSReadLine

Here I'm setting my log path so that everything I type is logged, giving me a full transcript of my session if I ever need it. Next, some PSReadLine settings. @roggenk has a good blog post that covers PSReadLine in some detail, so head over and check that out. I've got some of the same settings here that improve the user experience. I also have two custom key bindings: one starts the @vivaldibrowser browser (my favorite browser!) and switches focus to it right away, and the other performs a git commit/push.

#Logging
$PSLogPath = ("{0}{1}\Documents\WindowsPowerShell\log\{2:yyyyMMdd}-{3}.log" -f $env:HOMEDRIVE, $env:HOMEPATH, (Get-Date), $PID)
Add-Content -Value "# $(Get-Date) $env:username $env:computername" -Path $PSLogPath -ErrorAction SilentlyContinue
Add-Content -Value "# $(Get-Location)" -Path $PSLogPath -ErrorAction SilentlyContinue

# PSReadLine Settings
Set-PSReadLineOption -HistorySearchCursorMovesToEnd
Set-PSReadLineKeyHandler -Key UpArrow -Function HistorySearchBackward
Set-PSReadLineKeyHandler -Key DownArrow -Function HistorySearchForward
Set-PSReadLineOption -BellStyle None #Disable ding on typing error
Set-PSReadLineOption -EditMode Emacs #Make TAB key show parameter options
Set-PSReadLineKeyHandler -Key Ctrl+i -ScriptBlock { Start-Process "${env:ProgramFiles(x86)}\Vivaldi\Application\vivaldi.exe" -ArgumentList "https://www.bing.com" } #KEY: Ctrl+i launches the Vivaldi browser

#KEY: Git commit and push, press Ctrl+Shift+G (case sensitive); assumes git is on your PATH
Set-PSReadLineKeyHandler -Chord Ctrl+G -ScriptBlock {
    $message = Read-Host "Please enter a commit message"
    git commit -m "$message" | Write-Host
    $branch = (git rev-parse --abbrev-ref HEAD)
    Write-Host "Pushing ${branch} to remote"
    git push origin $branch | Write-Host
}

Functions

Next up, some functions. The first one prints nicely formatted output with the size of a directory, which really should be built in! The second checks whether the PowerShell session is running as an administrator; you'll see why we need this later. I borrowed both of these from @jaredtrog, along with most of the other stuff in my prompt below (next section).

#Functions
function Get-DirectorySize($Path='.',$InType="MB")
{
    $colItems = (Get-ChildItem $Path -recurse | Measure-Object -property length -sum)
    switch ($InType) {
        "GB" { $ret = "{0:N2}" -f ($colItems.sum / 1GB) + " GB" }
        "MB" { $ret = "{0:N2}" -f ($colItems.sum / 1MB) + " MB" }
        "KB" { $ret = "{0:N2}" -f ($colItems.sum / 1KB) + " KB"}
        "B" { $ret = "{0:N2}" -f ($colItems.sum) + " B"}
        Default { $ret = "{0:N2}" -f ($colItems.sum) + " B" }
    }
    Return $ret
}
function Test-IsAdmin {
    ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")
}
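
Here's how they look in practice (sample output; your numbers will differ):

Get-DirectorySize -Path "$home\Documents" -InType GB #e.g. "1.23 GB"
if (Test-IsAdmin) { "Elevated" } else { "Not elevated" }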

The Prompt

Finally, here's my prompt. You'll see the Test-IsAdmin function used here to set the prompt's text color, which makes it immediately obvious whether you're running an elevated prompt.

function global:prompt {
    #Put the full path in the title bar

    $console = $host.ui.RawUI
    $console.ForegroundColor = "gray"
    $host.UI.RawUI.WindowTitle = Get-Location

    #Set text color based on admin
    if (Test-IsAdmin) {
        $userColor = 'Red'
    }
    else {
        $userColor = 'White'
    }

    #Setup command line numbers
    $LastCmd = Get-History -Count 1
    if($LastCmd)
    {
        $lastId = $LastCmd.Id
        
        Add-Content -Value "# $($LastCmd.StartExecutionTime)" -Path $PSLogPath
        Add-Content -Value "$($LastCmd.CommandLine)" -Path $PSLogPath
        Add-Content -Value "" -Path $PSLogPath
    }

    $nextCommand = $lastId + 1

    Write-Host "[$($pwd)]" -ForegroundColor "Cyan"
    Write-Host -NoNewline '[' -ForegroundColor "Gray"
    Write-Host -NoNewline "$([System.Environment]::UserName)" -ForegroundColor $userColor
    Write-Host -NoNewline '@'
    Write-Host -NoNewline "$([System.Environment]::MachineName)  $nextCommand" -ForegroundColor $userColor
    Write-Host -NoNewline ']' -ForegroundColor "Gray"
    #Use $host.EnterNestedPrompt() to test a nested prompt. 
    Write-Host -NoNewline " PS$('>' * ($nestedPromptLevel + 1)) " -ForegroundColor $userColor
    Return " "
} 

Here’s how it looks:

[Image: PowerShell prompt]

Having the folder path on its own line gives me more space to focus on the actual PowerShell command. Here's how the Administrator prompt looks:

[Image: PowerShell prompt as Administrator]

Notice the red text. I almost forgot the nice ASCII art: just create a banner with any online ASCII art generator and print it with Write-Host.

Clear-Host
Write-Host '   _       __      __    _     __'
Write-Host '  | |     / /___ _/ /_  (_)___/ /'
Write-Host '  | | /| / / __ `/ __ \/ / __  / '
Write-Host '  | |/ |/ / /_/ / / / / / /_/ /  '
Write-Host '  |__/|__/\__,_/_/ /_/_/\__,_/   '
Write-Host '                                 ' 
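
One last tip: to test profile changes without opening a new window, just dot-source the profile in the current session:

. $profile #Re-run the profile in the current session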

That’s all! If you have any other tips, share them in the comments below.


Toying with Azure Functions – A Trade Log

Windows PowerShell, Azure Infrastructure

Serverless.

The best way for me to learn something is to use it; I'm hands-on. After being barraged by articles and hype around Azure Functions, I decided to try it out. But I'm not a developer. All the new stuff the cool kids are playing with today (IoT, serverless, bots, AI, etc.) requires real dev skills, and that's just not what I do. Sorry, not sorry. So here we go with PowerShell and Azure Functions. Warning: support for PowerShell is "experimental," and I found that I had to do a lot of workarounds.

The project

I'm going to use Azure Functions to generate a trade log. I've used all kinds of stock brokers (Schwab, Scottrade, Fidelity, etc.), but no platform matches the flexibility and low cost of Interactive Brokers ($IBKR). Most of my trades cost $1.00, but you get what you pay for: $IBKR doesn't have a nice, rich web interface that shows your trade history. Instead, you have to run your own reports, and luckily there's an API for that. My function is going to maintain an up-to-date log of the trades that have been executed (a lot of my trading is automated, so I might not know when something has been bought or sold).

I'm going to focus on Functions and specifically on this project. There's plenty of great documentation on both the Interactive Brokers API and Azure Functions on their respective official documentation pages.

Setting up

The first thing I'm going to do is create a new Azure Functions app using PowerShell, starting with the Timer trigger. Let's call it "GetIBTrades2Blob." We'll go right into the "Integrate" section and configure the following:

Triggers: a Timer trigger called "Daily0200ZZ" with a schedule of "0 0 2 * * *". The six fields are {second} {minute} {hour} {day} {month} {day-of-week}, so this runs every day at 02:00:00 UTC (9 PM Eastern Standard Time).

[Screenshot: Timer trigger configuration]

Inputs: Azure Blob Storage. I have a parameter named "inputBlob" with an explicit path, "trades/tradelog.csv", and a Storage account connection. Azure Functions sets this up nicely just by going through the portal UI. One note: I struggled a lot with the path. It seems PowerShell doesn't really support variables here (the default is {file}). I hope this will be fixed at some point, but it may just be a limitation of the language support.

[Screenshot: Blob storage input configuration]

Output: Azure Blob Storage. This is set up almost exactly the same way as the input; we just use a different parameter name with the same path and connection.

[Screenshot: Blob storage output configuration]

All of that just writes the function.json file for you. For reference, here's a sketch of roughly what the generated file contains ("my_STORAGE" stands in for whatever connection name the portal created; your file will differ):
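
{
  "bindings": [
    {
      "name": "Daily0200ZZ",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 2 * * *"
    },
    {
      "name": "inputBlob",
      "type": "blob",
      "direction": "in",
      "path": "trades/tradelog.csv",
      "connection": "my_STORAGE"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "trades/tradelog.csv",
      "connection": "my_STORAGE"
    }
  ],
  "disabled": false
}

Next, we need to write the code in run.ps1.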

Code

The code isn’t really relevant for this blog post except for a few key pieces that make everything work. The code might as well just be something like

"hello world" | Out-File -FilePath $outputBlob

But let's see what we're dealing with for Interactive Brokers. First, we need two parameters, a token code and a query ID (see this link); both can be retrieved from your $IBKR account. For the query ID, I created a new Flex Query that just gets my trades for the current day (see this link).

Now that we have those, we can get the trade confirmation report for today using Invoke-WebRequest and a bit of parsing.

$token = "000000000000000000000000" #Flex Web Service token from your $IBKR account (redacted)
$q = "123456"                       #Flex Query ID (redacted)

#Step 1: request the report; the response contains a reference code and the URL to fetch from
$request = "https://gdcdyn.interactivebrokers.com/Universal/servlet/FlexStatementService.SendRequest?t=$token&q=$q&v=3"
$response = Invoke-WebRequest $request -UseBasicParsing

[xml]$xml = $response.Content
[string]$refCode = $xml.ChildNodes.ReferenceCode
[string]$flexUrl = $xml.ChildNodes.Url

#Step 2: fetch the actual report using the reference code
$reqData = $flexUrl + "?q=$refCode&t=$token&v=3"
$responseData = Invoke-WebRequest $reqData -UseBasicParsing

$content = $responseData.Content
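
For context, the first call returns a small XML document. Based on the fields the code reads, it's roughly of this shape (the values are made up, and any element names beyond ReferenceCode and Url are illustrative):

<FlexStatementResponse timestamp="18 May 2018 16:00:00 EDT">
  <Status>Success</Status>
  <ReferenceCode>1234567890</ReferenceCode>
  <Url>https://gdcdyn.interactivebrokers.com/Universal/servlet/FlexStatementService.GetStatement</Url>
</FlexStatementResponse>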

What comes back is the CSV report containing the trades returned by the query, for example:

Date/Time,Symbol,Quantity,Price,Amount,Commission,OrderType,TradeDate
20180518;134156,XYZ,-100,5.2,-94,-1,LMT,20180518

Great! But now I need to add this trade to the existing ones (from yesterday, the day before, and before that, etc.).

This is a learning exercise, so I decided to fetch just the current day's trades (I realize there's a "Month to Date" Flex Query available in $IBKR). One of the things I learned through this exercise is that appending data is not easy (or possible) in Azure Functions using PowerShell. For C#, many examples exist using Append Blob (some information here on that). So, with that said, we have to do a workaround: read the existing tradelog.csv and write it back out with the new rows, which is exactly why we added the Azure Blob Storage input.

#Remove the header row from $content since the file we're getting already has one
#(the response body is a single string, so split it into lines first)
$cleanContent = ($content -split "`r?`n") | Select-Object -Skip 1

#Get the current file from blob storage
$inputArray = Get-Content -Path $inputBlob

#Append each of today's trades to the existing log
foreach ($line in $cleanContent)
{
    $inputArray += $line
}

Out-File -Encoding ASCII -FilePath $outputBlob -InputObject $inputArray

This is pretty simple. First we remove the header. Then we loop through each line of today's results, since there could have been more than one trade. The last line stores the full text (including the original input) back to Azure Blob Storage. In fact, by adding additional outputs, you can save the file to many places; for my purposes, I also have the file going to my OneDrive.

In my example, I was working with text. Working with binaries or media would be much harder because of some content-type issues, unless you use C# or another fully supported language. A big thanks to jschmitter, who patiently helped me work all of this out.


Using Azure Policy Sets

Azure Infrastructure

At Microsoft Ignite in September 2017, Ryan Jones (@rjmax) discussed Azure Resource Manager policies and some enhancements coming soon; see this blog post about the public preview announcement. One of those enhancements was Policy Sets, which let you group several policies together and assign them as a unit. There's more information at http://aka.ms/azurepolicy. In this blog post, we're going to explore how to start using them.

Start with existing policies

In order to create a policy set, we need existing policy definitions. You can use some built-in ones, but for this example I've created four custom policies related to storage:

  1. Audit VMs that don't use Managed Disks.
  2. Deny deployment if Storage Account Blob Encryption is not enabled.
  3. Deny deployment if Storage Account File Encryption is not enabled.
  4. Deny deployment if Storage Account https-only transport (secure transfer required) is not enabled.

The first thing we need to do is get the Policy Definition IDs, and for that we need the Policy Definition names. Here, I've looked up the names and created an array of the policy definitions that I want in my new policy set.

$policyNames = @(
    "audit-managedDisks",
    "deny-NoBlobEncryption",
    "deny-NoFileEncryption",
    "deny-NoHttpsOnly"
)

Next, I'm going to loop through those names, get each Policy Definition ID, and store it in a variable called $policyDefinitionId. While looping, I also build up an array of objects that becomes $policySetDefinition, which we'll use later.

$target = @()
$policyNames |
    ForEach-Object {
        $policyDefinitionId = (Get-AzureRmPolicyDefinition -Name $_ | Select-Object -ExpandProperty PolicyDefinitionId)
        $targetObject = New-Object PSObject -Property @{policyDefinitionId = $policyDefinitionId}
        $target += $targetObject
    }
$policySetDefinition = $target | ConvertTo-Json

Define and Assign the Policy Set

Next, we need to create the Policy Set Definition:

$policySetParams = @{
    Name = "policySet-Storage"
    DisplayName = "Storage: Policies to enhance security of Storage Accounts."
    Description = "This initiative contains several Storage Policies to be applied at the subscription level."
    PolicyDefinition = $policySetDefinition
}
$policySet = New-AzureRmPolicySetDefinition @policySetParams -Verbose

Notice that this command passes the $policySetDefinition JSON we built earlier to the PolicyDefinition parameter. Now that we've created a Policy Set Definition, we can assign it.

For this example, I'm going to assign it to my subscription, but I need to exclude two resource groups. With the new policy language, that's pretty easy to do. We're also going to define the Sku, an object consisting of a name and a tier. I'm going with A1/Standard here because I want to enforce this policy set on existing resources; if you only want to enforce it on new resources, set this to A0 and Free.

$ExcludedResourceGroup1 = Get-AzureRmResourceGroup -Name "rg-aad"
$ExcludedResourceGroup2 = Get-AzureRmResourceGroup -Name "securitydata"
$sku = @{
    name = "A1"
    tier = "Standard"
}

$policyAssignmentParams = @{
    Name = "StoragePolicySetAssignment"
    DisplayName = "Storage Policy Set"
    Description = "This initiative contains several Storage Policies to be applied at the subscription level."
    PolicySetDefinition = $policySet
    Scope = "/subscriptions/{guid-of-subscription}"
    NotScope = $ExcludedResourceGroup1.ResourceId, $ExcludedResourceGroup2.ResourceId
    Sku = $sku
}
$new = New-AzureRmPolicyAssignment @policyAssignmentParams
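
To sanity-check the result, you can read the assignment back with Get-AzureRmPolicyAssignment (same scope placeholder as above):

Get-AzureRmPolicyAssignment -Name "StoragePolicySetAssignment" -Scope "/subscriptions/{guid-of-subscription}"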


Visualize

That’s it! Let’s take a look at how this appears in the new Policy UI. Here’s the Assignments blade:

[Screenshot: Policy Assignments blade]

And here’s the actual Assignment showing the scope, exclusions, and sku:

[Screenshot: assignment scope, exclusions, and sku]

Trusted Sites With IE ESC Turned On

Microsoft, Enterprise

When protecting users from malicious internet sites on servers, we can keep Internet Explorer Enhanced Security Configuration (IE ESC) turned on; that's the default. However, with the setting on, almost no modern website loads properly.

The solution is to add the sites you trust to the Trusted Sites zone. In an enterprise environment, we would leverage Active Directory Group Policy to do this. The way to add specific sites to a zone is well documented; in short, we use the Sites to Zone Assignment List policy.

However, this doesn't work with IE ESC turned on. This KB article hints at why; although it says it applies to Windows Server 2003, I'm working with Windows Server 2016. My machine will be a Remote Desktop host and I want to lock it down. So, how can we keep IE ESC on and still allow a specific list of sites to have looser security settings?

It seems there are two ways this could work. In both cases, we still configure the Sites to Zone Assignment List:

1. Apply the Method 2 workaround listed in the KB article above. You may have to create the keys that are not present.

[Screenshot: Method 2 registry workaround]

2. This one is much more complicated. We will configure the EscDomains registry key. This key is described in a support article:

The EscDomains key resembles the Domains key except that the EscDomains key applies to those protocols that are affected by the Enhanced Security Configuration (ESC). ESC is introduced in Microsoft Windows Server 2003.

We can use Group Policy Registry Settings to update the registry. Enter the values under HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\EscDomains. Once completed, it will look something like the following:

Group Policy view:

[Screenshot: Group Policy registry settings]

Registry view:

[Screenshot: registry values under EscDomains]
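
If you'd rather script the registry change than click through Group Policy, here's a minimal sketch (contoso.com is a placeholder; the value name is the URL scheme, and data 2 maps to the Trusted Sites zone):

$zoneMap = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\EscDomains"
New-Item -Path "$zoneMap\contoso.com" -Force | Out-Null #Create the key for the domain
New-ItemProperty -Path "$zoneMap\contoso.com" -Name "https" -Value 2 -PropertyType DWord -Force | Out-Null #2 = Trusted Sites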

Regardless of the method, you can load one of your sites and click File, Properties. The Zone should show "Trusted Sites."

[Screenshot: page properties showing the Trusted Sites zone]

One note: this machine also has other Group Policies applied from the Microsoft Security Guidance blog, and those policies may change the behavior. Is there a better way? I don't know; you tell me in the comments below.


Managed Storage Account SAS Tokens

Azure Infrastructure

Introduction

Building on the previous blog post, where we configured Azure Key Vault to automatically rotate Storage Account keys, this post discusses SAS tokens (shared access signatures). As a quick refresher, using SAS tokens is the recommended way to interact with your Storage Account. For more information, see Using shared access signatures.

Create SAS Definition

In order to create a SAS definition, you need the setsas permission. We can add it to the list of permissions we used in the previous blog post:

Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName -ResourceGroupName $keyVaultResourceGroupName -UserPrincipalName $upn -PermissionsToStorage set, get, regeneratekey, setsas

Now, we can create a SAS definition. For my example, I want to:

  • Limit the SAS definition to the Blob service (not Tables, Queues, or Files)
  • Name it sas1
  • Limit it to HTTPS only
  • Limit it to my current IP address (an IP whitelist)
  • Limit the validity of the token to 5 days
  • Limit the permissions to Read and Write

In the commands below, I first get my public IP address using the ipinfo.io site. Then I use Set-AzureKeyVaultManagedStorageSasDefinition to create the new definition.

$ip = Invoke-RestMethod http://ipinfo.io/json | Select -ExpandProperty ip
$sasDefinition = Set-AzureKeyVaultManagedStorageSasDefinition `
-Service Blob `
-ResourceType Container,Object `
-VaultName $keyVaultName `
-AccountName $storageAccountName `
-Name 'sas1' `
-Protocol HttpsOnly `
-IPAddressOrRange $ip `
-ValidityPeriod ([System.Timespan]::FromDays(5)) `
-Permission Write,Read

Once you do this, you’ll see a new secret in your Key Vault. Now, let’s get the secret value.

$secret = Get-AzureKeyVaultSecret -VaultName $keyVaultName -Name ($sasDefinition.sid).Split("/")[-1]
$sasToken = $secret.SecretValueText

Use SAS Definition

That's it! Now let's test it by uploading a file. There's nothing new here; I'm simply using Set-AzureStorageBlobContent with a context generated from the SAS token we retrieved in the previous step.

$container = "docs"
$localFile = "C:\Temp\FUNDAMENTALS OF AZURE 2ND ED.pdf"
$blobName = "Fundamentals of Azure.pdf"
$ctx = New-AzureStorageContext -SasToken $sasToken -StorageAccountName $storageAccountName
Set-AzureStorageBlobContent -File $localFile -Container $container -Blob $blobName -Context $ctx -Verbose
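
The same SAS-backed context works for downloads too; for example (the destination path is just an illustration):

Get-AzureStorageBlobContent -Container $container -Blob $blobName -Destination "C:\Temp\Fundamentals of Azure.pdf" -Context $ctx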

There could be many use cases for this. For example, if several users need to upload files to blob storage, you can generate a unique SAS for each one. Another example is an application querying Key Vault to get a SAS token; that scenario is covered in the official documentation here.
