Category Archives: PowerShell - Page 2

Cisco UCS PowerTool Suite – Part 1

I thought I would create a short blog series on a very underrated collection of PowerShell modules from Cisco called the Cisco UCS PowerTool Suite.  The UCS PowerTool Suite was released back in early 2013 and has been steadily growing and maturing since.  The current release of the PowerTool Suite, as of this blog post, is 2.2.1, which contains 5 modules and over 4500 Cmdlets!  Yes, that's right, over 4500 Cmdlets.  Crazy, huh?

PowerTool brings PowerShell and all its goodness to Cisco UCS and allows you to script and automate your UCS management in a very powerful way.  PowerTool can connect to Cisco UCS Manager, UCS Central, and UCS IMC (namely C-Series and E-Series).  PowerTool isn't doing anything special behind the scenes.  It connects via the same standard XML API that the Java GUI uses to connect to things like UCS Manager, and it respects and works with the Management Information Tree (MIT) that UCS is built on.

In Part 1 of this series I run through the basics of installing UCS PowerTool and connecting to your first UCS Manager.

Before you install the UCS PowerTool Suite you need to meet a few requirements.  PowerTool is not currently compatible with PowerShell Core so at present you will need a Windows box running the following.

  • Windows PowerShell 3.0 or higher
  • .NET Framework Version 4.5 or higher
  • PowerShell 4.0 and higher for the DSC module resources

Once you've met these requirements you can download the latest version of UCS PowerTool from Cisco.  Then proceed to install from the MSI file.  The installation wizard is straightforward and will copy the modules to your C:\Program Files (x86)\WindowsPowerShell\Modules folder along with three shortcuts to your desktop.  Each shortcut, Cisco IMC PowerTool, Cisco UCS Central PowerTool, and Cisco UCS Manager PowerTool, runs a small startup script that basically loads its respective module.

We don't need to actually use these shortcuts if we choose not to.  We can just run PowerShell as we normally would and import the modules as needed.  If we're running Windows Server, though, these modules will actually auto-load for us.
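For example, loading the UCS Manager module by hand is a one-liner (a sketch, assuming the module was installed to the default module path by the MSI):

```powershell
# Load the UCS Manager module manually instead of using the desktop shortcut
Import-Module Cisco.UCSManager

# Confirm it loaded and see how many cmdlets it exports
(Get-Command -Module Cisco.UCSManager).Count
```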

Below is what we see when we use the shortcut, Cisco UCS Manager PowerTool.

Below we will delve into connecting to our first UCS Manager, but first let’s run through a few of the basics.  First we run Get-Module -ListAvailable.  This will show us all the modules available on our system.  Below we can see the five Cisco modules we just installed.

PowerTool C:\> Get-Module -ListAvailable

    Directory: C:\Program Files (x86)\WindowsPowerShell\Modules

ModuleType Version Name                                ExportedCommands
---------- ------- ----                                ----------------
Binary             Cisco.IMC                           {FnResetImcPowerProfile, FnTestImcLd...
Binary             Cisco.UCS.Core                      {Add-UcsHardwareProfile, Get-UcsPowe...
Manifest           Cisco.UCS.DesiredStateConfiguration {Get-UcsConnection, Get-ImcConnection}
Binary             Cisco.UCSCentral                    {Connect-UcsCentral, Disconnect-UcsC...
Binary             Cisco.UCSManager                    {Connect-Ucs, Disconnect-Ucs, Start-...
Script     1.0.1   Microsoft.PowerShell.Operation.V... {Get-OperationValidation, Invoke-Ope...
Binary             PackageManagement                   {Find-Package, Get-Package, Get-Pack...
Binary             PackageManagement                   {Find-Package, Get-Package, Get-Pack...
Script     3.4.0   Pester                              {Describe, Context, It, Should...}
Script             PowerShellGet                       {Install-Module, Find-Module, Save-M...

PowerTool C:\>

Next we run Get-Command -Module Cisco.UcsManager.  This displays all the Cmdlets inside this module, all 4500+ of them!  Once you've memorised them all we can move on… just kidding 🙂

PowerTool C:\> Get-Command -Module Cisco.UcsManager
Cmdlet          Set-UcsWwnInitiator                          Cisco.UCSManager
Cmdlet          Set-UcsWwnPool                               Cisco.UCSManager
Cmdlet          Start-UcsGuiSession                          Cisco.UCSManager
Cmdlet          Start-UcsKvmSession                          Cisco.UCSManager
Cmdlet          Start-UcsServer                              Cisco.UCSManager
Cmdlet          Start-UcsTransaction                         Cisco.UCSManager
Cmdlet          Stop-UcsServer                               Cisco.UCSManager
Cmdlet          Sync-UcsManagedObject                        Cisco.UCSManager
Cmdlet          Undo-UcsTransaction                          Cisco.UCSManager
Cmdlet          Update-UcsCatalogue                          Cisco.UCSManager
Cmdlet          Update-UcsFirmware                           Cisco.UCSManager
Cmdlet          Watch-Ucs                                    Cisco.UCSManager

PowerTool C:\>

To connect to our UCSM we use the Cmdlet Connect-Ucs. To find out how to do this we can use Get-Help to find example syntax.

PowerTool C:\> get-help connect-ucs


    Connects to a UCS

    Connect-Ucs [-Name] <string[]> [-Credential] <PSCredential> [-Port <ushort>] [-NoSsl]
    [-NotDefault] [-Proxy <WebProxy>] [<CommonParameters>]

    Connect-Ucs -LiteralPath <string> -Key <SecureString> [-NotDefault] [-Proxy <WebProxy>]

    Connect-Ucs -Path <string> -Key <SecureString> [-NotDefault] [-Proxy <WebProxy>]

    Connects to a UCS. The cmdlet starts a new session using the specified parameters. One can
    have more than one connections to a server. PowerTool Supports working with multiple default
    servers. This can be enabled by setting SupportMultipleDefaultUcs using

We’re now ready to make our first connection.  In the below example we connect using the Cmdlet Connect-Ucs and save the connection to the variable $handle1.  This gives us the flexibility to connect to multiple UCSM devices at the same time and run commands against them.  Something which I’ll cover more on in a future post.

PowerTool C:\> $handle1 = Connect-Ucs -Name

Running the command opens a credential request dialog box.  Enter the same credentials you would normally use when connecting to your UCSM.
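To avoid the pop-up dialog entirely, the credentials can be captured first with Get-Credential and passed in. A sketch based on the Connect-Ucs syntax shown above (the hostname is a placeholder):

```powershell
# Capture credentials once, then connect and keep the session handle
$cred = Get-Credential
$handle1 = Connect-Ucs -Name 'ucsm01.example.local' -Credential $cred
```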

If we run Get-UcsPSSession we can display our current session details.  Here you can see that we're connected to UCS UCSPE-10-1-1-11.

PowerTool C:\> Get-UcsPSSession

NumPendingConfigs : 0
Ucs : UCSPE-10-1-1-11
Cookie : 1494751391/e26549b0-557a-4ba7-83a8-c1ae36468ebb
Domains : org-root
LastUpdateTime : 14-May-17 6:43:14 PM
Name :
NoSsl : False
NumWatchers : 0
Port : 443
Priv : {aaa, admin, ext-lan-config, ext-lan-policy...}
PromptOnCompleteTransaction : False
Proxy : 
RefreshPeriod : 600
SessionId : 
TransactionInProgress : False
Uri :
UserName : ucspe
Version : 3.1(2b)
VirtualIpv4Address :
WatchThreadStatus : None

Here’s where things get a little interesting.  We can export this session to an XML file, using Export-UcsPSSession, and with a secure key we can connect to our UCS in the future without providing credential details.

In the below example we export our current session to an XML file called ucspe.xml and type in a secure key.  Next using ConvertTo-SecureString we can export the key we used to a file called ucspe.key which we can use to decrypt our password in the XML file.

PowerTool C:\> Export-UcsPSSession -LiteralPath C:\cisco\ucspe.xml
cmdlet Export-UcsPSSession at command pipeline position 1
Supply values for the following parameters:
Key: ********

PowerTool C:\> ConvertTo-SecureString -String "Password" -AsPlainText -Force | ConvertFrom-SecureString | Out-File C:\cisco\ucspe.key

Now we can use our key file and our XML file to connect to our UCSM without being prompted for credentials.

PowerTool C:\> $key = ConvertTo-SecureString (Get-Content C:\cisco\ucspe.key)

PowerTool C:\> $handle1 = connect-ucs -Key $key -LiteralPath C:\cisco\ucspe.xml

The key file should, of course, be treated as highly sensitive.  Steps should be taken to prevent unauthorized people from accessing and reading this file.  I find a good way to protect it is by locking down permissions on the file and folder where the XML and key file are stored.  In my case only myself and the Scheduled Task account that requires it can access the file.
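A sketch of what that lockdown might look like (the folder path and account names are placeholders for your own):

```powershell
# Remove inherited permissions from the folder holding the XML and key files
$acl = Get-Acl 'C:\cisco'
$acl.SetAccessRuleProtection($true, $false)   # disable inheritance, drop inherited rules

# Grant full control only to the accounts that need it
foreach ($account in 'MYDOMAIN\myuser', 'MYDOMAIN\svc-scheduler') {
    $rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
        $account, 'FullControl', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
    $acl.AddAccessRule($rule)
}

Set-Acl 'C:\cisco' $acl
```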

Lastly, we should know how to cleanly disconnect from our UCSM session.  This simply requires the use of Disconnect-Ucs.  In the below example we also reference our session in the variable $handle1, which is good practice if we are connecting to multiple UCSM devices.

PowerTool C:\> Disconnect-Ucs -Ucs $handle1

In Part 1 of this series I covered the minimum requirements you need on your system before installing PowerTool.  I then went through the fundamental basics of making your first connection to a UCS Manager, took it one step further by showing how we can connect in the future without providing credentials, and finally showed how to disconnect from the UCSM.  In Part 2 of this series I will run through the basics of querying information from UCSM via some of the 4500+ Cmdlets.

Cisco UCS PowerTool Suite – Part 1
Cisco UCS PowerTool Suite – Part 2
Cisco UCS PowerTool Suite – Part 3

Cisco UCS PowerTool Suite
Cisco UCS PowerTool Suite Communities Page

Make PowerShell As Cool As You. Modify Your Default Profile.

Do you feel that PowerShell just isn't as cool as you?  Do you wish that you could make it cool like yourself?  Me too!

So how do we do this?  By modifying the default PowerShell profile.  The default profile is just a standard ps1 file that runs each time you launch a PowerShell console.  A quick search online and you'll find there are five different locations this profile can live (crazy, huh): there's All Users, ISE, Personal, and so on.  The best one to modify, I found, is the one that lives in the WindowsPowerShell folder in your User Profile path (C:\Users\{Username}\Documents\WindowsPowerShell\).  By default the file probably doesn't exist, but you can easily create it by making a file called profile.ps1 and placing it in this path.
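A quick way to find and create that file from within PowerShell itself (a sketch; the built-in $PROFILE variable's CurrentUserAllHosts property points at the profile.ps1 discussed here):

```powershell
# Show the per-user, all-hosts profile path (the profile.ps1 in WindowsPowerShell)
$PROFILE.CurrentUserAllHosts

# Create the file (and any missing folders) if it doesn't exist yet
if (-not (Test-Path $PROFILE.CurrentUserAllHosts)) {
    New-Item -ItemType File -Path $PROFILE.CurrentUserAllHosts -Force
}
```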

One of the quickest ways to enhance the PowerShell prompt is to add a little ASCII art to PowerShell when the console is first loaded.  I originally got the idea from a session I went to by Chris Wahl.  Chris used a simple one-line piece of ASCII art showing his venting of emotion flipping a table.  You can find a lot of this art online.  The issue I found was that the extended ASCII character set was difficult to use.  Most of the time it didn't translate well using Write-Host in PowerShell.

This gave me the idea of using a here-string to create a multi-line string.  It opens the door to creating large ASCII art images using the standard character set, which makes them much more compatible on-screen.

So how can we achieve this?  As mentioned above, we create a profile.ps1 file in 'C:\Users\{Username}\Documents\WindowsPowerShell\'.  Then paste our art between @" and "@ characters and save it to a variable.  In the below example we call it $block.

We can print it out on the screen using Write-Host which also allows us to change the foreground color using -ForegroundColor.

$block = @"

.     .       .  .   . .   .   . .    +  .
  .     .  :     .    .. :. .___---------___.
       .  .   .    .  :.:. _".^ .^ ^.  '.. :"-_. .
    .  :       .  .  .:../:            . .^  :.:\.
        .   . :: +. :.:/: .   .    .        . . .:\
 .  :    .     . _ :::/:               .  ^ .  . .:\
  .. . .   . - : :.:./.                        .  .:\
  .      .     . :..|:                    .  .  ^. .:|
    .       . : : ..||        .                . . !:|
  .     . . . ::. ::\(                           . :)/
 .   .     : . : .:.|. ######              .#######::|
  :.. .  :-  : .:  ::|.#######           ..########:|
 .  .  .  ..  .  .. :\ ########          :######## :/
  .        .+ :: : -.:\ ########       . ########.:/
    .  .+   . . . . :.:\. #######       #######..:/
      :: . . . . ::.:..:.\           .   .   ..:/
   .   .   .  .. :  -::::.\.       | |     . .:/
      .  :  .  .  .-:.":.::.\             ..:/
 .      -.   . . . .: .:::.:.\.           .:/
.   .   .  :      : ....::_:..:\   ___.  :/
   .   .  .   .:. .. .  .: :.:.:\       :/
     +   .   .   : . ::. :.:. .:.|\  .:/|
     .         +   .  .  ...:: ..|  --.:|
.      . . .   .  .  . ... :..:.."(  ..)"
 .   .       .      :  .   .: ::/  .  .::\
"@

Write-Host $block -ForegroundColor Green

We don't stop here though.  There's a few other things we can do.  Most of my code lives in a specific folder, so we can modify the default folder path that PowerShell opens to by adding the below code to the bottom of the profile.ps1 file.  The next time PowerShell opens, it defaults to this path.

Set-Location 'C:\Folder\Code'

Let’s now fix that crappy Title Bar and change it to something more inspirational.

$host.ui.RawUI.WindowTitle = 'Know yourself and you will win all battles.'

Finally let’s fix that bland prompt and give it a little color.

function Prompt {
    $promptString = "PS " + $(Get-Location) + ">"
    Write-Host $promptString -NoNewline -ForegroundColor Cyan
    return " "
}

Let's see what our PowerShell console now looks like.  Much better!  We can now show people how cool we really are.  When people see our console they will tremble at our PowerShell skillz 😛

Putting it all together, let’s see what the code looks like.

$block = @"

.     .       .  .   . .   .   . .    +  .
  .     .  :     .    .. :. .___---------___.
       .  .   .    .  :.:. _".^ .^ ^.  '.. :"-_. .
    .  :       .  .  .:../:            . .^  :.:\.
        .   . :: +. :.:/: .   .    .        . . .:\
 .  :    .     . _ :::/:               .  ^ .  . .:\
  .. . .   . - : :.:./.                        .  .:\
  .      .     . :..|:                    .  .  ^. .:|
    .       . : : ..||        .                . . !:|
  .     . . . ::. ::\(                           . :)/
 .   .     : . : .:.|. ######              .#######::|
  :.. .  :-  : .:  ::|.#######           ..########:|
 .  .  .  ..  .  .. :\ ########          :######## :/
  .        .+ :: : -.:\ ########       . ########.:/
    .  .+   . . . . :.:\. #######       #######..:/
      :: . . . . ::.:..:.\           .   .   ..:/
   .   .   .  .. :  -::::.\.       | |     . .:/
      .  :  .  .  .-:.":.::.\             ..:/
 .      -.   . . . .: .:::.:.\.           .:/
.   .   .  :      : ....::_:..:\   ___.  :/
   .   .  .   .:. .. .  .: :.:.:\       :/
     +   .   .   : . ::. :.:. .:.|\  .:/|
     .         +   .  .  ...:: ..|  --.:|
.      . . .   .  .  . ... :..:.."(  ..)"
 .   .       .      :  .   .: ::/  .  .::\
"@

Write-Host $block -ForegroundColor Green

Set-Location 'C:\Folder\Code'

$host.ui.RawUI.WindowTitle = 'Know yourself and you will win all battles.'

function Prompt {
    $promptString = "PS " + $(Get-Location) + ">"
    Write-Host $promptString -NoNewline -ForegroundColor Cyan
    return " "
}

Pretty simple, right?  A few small additions and we've brought our console to life.  When all is said and done though, none of this will actually make us better scripters.  But when you're spending all day at a console prompt, why not bring a little of you into it?  I'd love to know what you do.

Streaming Datasets – PowerShell | PowerCLI | Power BI

A large part of my day is spent scripting in PowerShell, specifically with PowerCLI.  One of the strongest areas of PowerCLI, obviously, is being able to retrieve information.  It's one of the key use cases, in my opinion, for using PowerCLI in a VMware environment: its ability to retrieve information for capacity planning and reporting.

Recently I've been looking at how to consume all that information.  You can obviously export it to a CSV, push it into a database, or, something that I've been playing around with recently, stream it into Power BI.  Now if you haven't tried it out yet, Power BI is an analytics service from Microsoft.  At its core it's a data warehouse for business intelligence.  But putting all those fancy words aside, I use it to create fancy reports.

Exporting information out of a vCenter environment with PowerCLI is dead simple.  I have dozens of scheduled tasks running all the time doing this.  Where I've fallen down is taking that information and trending it over time.  This is where the Streaming Datasets functionality of Power BI comes in.  Using PowerCLI I can get an object and value from vCenter and then POST that directly into Power BI, using their API, and have it instantly graphed in a report.  I can then share that report out to anyone I want.  Power BI lets me do this over and over, almost as fast as I can pull the information out of vCenter.

In the example below I show how to create a trend report over time that displays Total and Available Storage of a vCenter cluster.  Rather simple, I know, but it can easily be adapted to show things like the number of running VMs, reserved resources used, etc.  The sky's the limit really.
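For instance, a powered-on VM count for the same kind of trending could be pulled with a snippet like this (a sketch; the cluster name is a placeholder and a PowerCLI connection is assumed):

```powershell
# Count powered-on VMs in a cluster, ready to stream as another dataset value
$runningVMs = (Get-Cluster -Name 'Cluster1' | Get-VM |
    Where-Object { $_.PowerState -eq 'PoweredOn' }).Count
```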

Before we do any scripting the first thing we do is log into Power BI.  If you don't have an account, don't worry, the basic version is free.  Hit the Sign Up link and make sure you select Power BI and not Power BI Desktop for Windows; we want the cloud version.

Once logged in we click on Streaming Datasets in the bottom left under the Datasets category.  This is where we create our initial dataset schema so that it can accept streaming input.  We click on 'Add streaming dataset' in the top right.

Then select the source of data, which will be API and click next.

We give our New Streaming Dataset a name and define a few values.  In this example we will define a Date, Total Storage, and Available Storage value, turn on Historic Data Analysis and click Create.  Make note of your data type to the right of the value.  Date is DateTime and the other two are Numbers.

We've now created our schema and are provided with a Push URL address and sample code in a few different formats (we want PowerShell).  If you look carefully, we are using an Invoke-RestMethod to POST to Power BI.  This sample code has just provided us with the template and the hardest part of our PowerShell / PowerCLI script.  Click over the code and copy / paste it out to use in our script (paste it at the bottom of the script, as it will be the last thing that runs).

Now we actually start on the PowerShell / PowerCLI script.  To keep it as concise as possible, I've skipped the process I use to actually connect to the vCenter and retrieve the information using PowerCLI in the code below.  The real goal here is just to retrieve some values and get them into Power BI.  Line 6 is basically retrieving all shared VMFS datastores in Cluster1.  The important lines to note, though, are 4, 8, and 9, where I store my key values in three variables: one for Date, one for TotalStorage, and one for AvailableStorage.

Import-Module VMware.VimAutomation.Core
Connect-VIServer -Server host.mydomain.local

$date = Get-Date

$datastore = Get-Cluster -Name Cluster1 | Get-Datastore | Where-Object {$_.Type -eq 'VMFS' -and $_.Extensiondata.Summary.MultipleHostAccess}

$TotalStorage = ($datastore | Measure-Object -Property CapacityMB -Sum).Sum / 1024
$AvailableStorage = ($datastore | Measure-Object -Property FreeSpaceMB -Sum).Sum / 1024 

The additional lines below, from 11 onward, are the important code.  This is our pasted sample code from Power BI that we will slightly modify to push our values up to Power BI.  Don't copy mine, as your URL and key will be different.  On lines 13, 14, and 15 we remove the example values and replace them with our three variables: $Date, $TotalStorage, and $AvailableStorage.

Import-Module VMware.VimAutomation.Core
Connect-VIServer -Server -user "mydomain\username"

$date = Get-Date

$datastore = Get-Cluster -Name Cluster1 | Get-Datastore | Where-Object {$_.Type -eq 'VMFS' -and $_.Extensiondata.Summary.MultipleHostAccess}

$TotalStorage = ($datastore | Measure-Object -Property CapacityMB -Sum).Sum / 1024
$AvailableStorage = ($datastore | Measure-Object -Property FreeSpaceMB -Sum).Sum / 1024 

$endpoint = ""
$payload = @{
    "Date" = $Date
    "Total Storage" = $TotalStorage
    "Available Storage" = $AvailableStorage
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))

Disconnect-VIServer * -Confirm:$false

On the last line I disconnect from my vCenter and close any sessions.  This helps if running as a scheduled task.  Finally, save the script.

And that's it for the scripting part.  Assuming everything is correct (no connection issues, correct values being retrieved), all we have to do is run the script and it will send a POST request using Invoke-RestMethod with our three values.  We can now run this script as many times as we want and it will continue to post the current date and time along with Total Storage and Available Storage.  At this point, if we wish, we can turn the script into a scheduled task or just continue to run it manually to suit our needs.
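Registering the script as a scheduled task can be sketched like this (the script path, task name, and interval are placeholders; assumes the ScheduledTasks module available on Windows 8 / Server 2012 or later):

```powershell
# Run the capacity script every hour as a Windows scheduled task
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\Send-CapacityToPowerBI.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1)

Register-ScheduledTask -TaskName 'PowerBI Capacity Stream' -Action $action -Trigger $trigger
```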

We now go back to Power BI and report on what we have.  Back on our Streaming Datasets browser window we click the Create Report icon under Actions.  Now this part is going to be very subjective to the end user who wants the report.  But the key data we want is under RealTimeData on the far right.  Select all three values and we get presented with a default view of our data.  Under Visualizations select Line Chart and now we start to see a more visual representation of our capacity over time.  Under the Analytics section add a Trend Line and see a basic view of available capacity over time.  Finally hit Save and you have a self-updating report from streaming data.

For the report to start to look anything like below it will take time and a few sample datasets.  In the below image I've mocked up some numbers over time as an example.

Once you have a working script and it's streaming data to Power BI, it's really up to you how to report on it.  The above example, as simple as it is, lays the groundwork for more customized and complex reporting that you might not be able to get out of traditional monitoring and reporting software.  The ability is there to even share out the report.

Streaming datasets, as you might have noticed in the URL, are still in beta.  As great as I have found it to be, it does have some quirks.  For one, you can't easily modify data you have already streamed up to Power BI.  So if you send incorrect data / values up to Power BI in a streaming dataset, they will remain there.  At which point you will have to consider filters to exclude them in reports.

In summary I think Power BI is a very underrated free tool from Microsoft.  I've only just started to scratch the surface of what's possible with it.  The simplicity of capturing data with PowerShell and sending it to Power BI is well worth the time and effort to try at least once.  So what are you waiting for?

PowerShell on Linux

The big news out of Microsoft last month making headlines is the open sourcing of PowerShell.  Along with this comes the ability to now run PowerShell not just in Windows but also Linux and Mac OS X.  For people close to the PowerShell community this wasn't unexpected, but make no mistake, this is huge news.

I'm really liking this new Microsoft.  They are really embracing this open source stuff.  On first thought it's not obvious how Microsoft will make money with PowerShell going open source.  But Microsoft isn't stupid; this is no doubt part of a larger master plan.  With PowerShell so tightly linked to their products they are opening the door to a whole new demographic of users.  I can see PowerShell going open source being a key to getting a new mix of Linux developers working in Azure.  Something close to my heart: VMware have also announced plans to port PowerCLI over to work with PowerShell for Linux.  As a PowerCLI tragic myself, I've seen first hand how frustrated Mac users have been that they can't manage their VMware infrastructure using PowerShell / PowerCLI directly from a Mac.

Microsoft have made it clear this is the very early stages of an alpha release on GitHub.  They are looking for community help to further develop and refine using PowerShell on Linux.  There's a large number of bug fixes, growing by the day, that they need to work through before we get anywhere close to a production release.

I decided to try it out myself and I'm impressed; the future looks awesome.  Apart from Windows, the open source version is currently limited to Ubuntu 14.04 / 16.04, CentOS 7, and Mac OS X 10.11.

I had an Ubuntu 14.04 Linux VM that I used for testing.  The first thing to do is download the appropriate package over at GitHub.

Once downloaded, and depending on what OS you're running, you may need to install a few additional libraries first.  In my case it was libunwind8 and libicu52 using apt-get.  After which I was able to install the PowerShell Debian package.

mukotic@ubuntu:~/Downloads$ sudo apt-get install libunwind8 libicu52
mukotic@ubuntu:~/Downloads$ sudo dpkg -i powershell_6.0.0-alpha.9-1ubuntu1.14.04.1_amd64.deb

Believe it or not, that's all that is required.  Whatever your shell of choice is, just type 'powershell'.

mukotic@ubuntu:~/Downloads$ powershell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS /home/mukotic/Downloads> 

So what can we do?  Well, it's still early days.  The first thing I did was just check the version.  I can see we're running the .NET Core release of PowerShell, which comes with Nano Server.

PS /home/mukotic/Downloads> $psversiontable

Name                      Value
----                      -----
PSVersion                 6.0.0-alpha
PSEdition                 Core
PSCompatibleVersions      {1.0, 2.0, 3.0, 4.0...}
GitCommitId               v6.0.0-alpha.9
WSManStackVersion         3.0
PSRemotingProtocolVersion 2.3

Looking at what’s available to us it’s still limited to a handful of modules.

PS /home/mukotic/Downloads> Get-Module -ListAvailable 

 Directory: /opt/microsoft/powershell/6.0.0-alpha.9/Modules

ModuleType Version Name                            ExportedCommands
---------- ------- ----                            ----------------
Manifest           Microsoft.PowerShell.Archive    {Compress-Archive, Expand-Archive}
Manifest           Microsoft.PowerShell.Host       {Start-Transcript, Stop-Transcript}
Manifest           Microsoft.PowerShell.Management {Add-Content, Clear-Content, Clear-ItemProperty, Join-Path...}
Manifest           Microsoft.PowerShell.Security   {Get-Credential, Get-ExecutionPolicy, Set-ExecutionPolicy, ConvertFrom-SecureString...
Manifest           Microsoft.PowerShell.Utility    {Format-List, Format-Custom, Format-Table, Format-Wide...}
Binary             PackageManagement               {Find-Package, Get-Package, Get-PackageProvider, Get-PackageSource...}
Script     3.3.9   Pester                          {Describe, Context, It, Should...}
Script             PowerShellGet                   {Install-Module, Find-Module, Save-Module, Update-Module...}
Script     0.0     PSDesiredStateConfiguration     {StrongConnect, IsHiddenResource, Write-MetaConfigFile, Get-InnerMostErrorRecord...}
Script     1.2     PSReadLine                      {Get-PSReadlineKeyHandler, Set-PSReadlineKeyHandler, Remove-PSReadlineKeyHandler, G...

So those traditional Windows cmdlets will now work against the local Linux box.  Things like Get-Process will return the local running Linux processes.

PS /home/mukotic/Downloads> Get-Process

Handles  NPM(K)  PM(K)  WS(K)  CPU(s)    Id   SI  ProcessName
-------  ------  -----  -----  ------    --   --  -----------
      0       0      0      0   0.400  1331  549  accounts-daemon
      0       0      0      0   0.350  1111  111  acpid
      0       0      0      0   0.000  2248  205  at-spi-bus-laun
      0       0      0      0   0.040  2264  205  at-spi2-registr
      0       0      0      0   0.000   147    0  ata_sff

Another thing that's also worth checking out is Visual Studio Code.  This is another great open source project Microsoft has going.  If you've used the PowerShell ISE in Windows, think of a streamlined version of that, just more powerful, leveraging extensions.  Head over to the Visual Studio Code site and download the package.

Installation was also super simple.

PS /home/mukotic/Downloads> sudo dpkg -i code_1.4.0-1470329130_amd64.deb

Then run it by typing 'code'.

PS /home/mukotic/Downloads> code


I recommend getting the PowerShell extension right off the bat.  Click the Extensions icon on the left, search for PowerShell, and click Install.


Now we have all the wonders of IntelliSense that we are used to in the Windows PowerShell ISE.  I really see Visual Studio Code becoming a future replacement for the Windows PowerShell ISE, which, while still in development, has been quite stagnant in recent years.

So there you have it.  Jeffrey Snover, a Technical Fellow in the Microsoft Enterprise Cloud Group, has a great post and video discussing PowerShell going open source that should be checked out.

The next thing I'm hanging out for is PowerCLI on Linux.  A demo is shown in a video in the above link running inside a Docker container.  Expect to soon see a VMware Fling release for us to try out.

Meetups, PowerShell, Expanding My Horizons


I'm not sure what it's like in other major cities around the world.  But currently Melbourne is going through an IT meetup boom.  On any given week you can find at least one, if not multiple, meetups going on somewhere in Melbourne.  A big change from years past, where we would have only a couple of major conferences a year to look forward to.  It's really quite an exciting period for meetups we're going through.

So what is going on with all these meetups?  (Meetup being the new buzzword we're seeing slowly replace the traditional User Group we're all probably used to.)  I think it's in no small part to do with the website Meetup.com.  Sure, many of these User Groups existed well before Meetup.com became a thing.  But to find them you had to be part of the right Facebook group, follow the right Twitter user, or just learn of it through some word of mouth.  I've lost count of how many User Group meetings I missed by only learning about them the next day.

We now have a common place we can visit to find all these User Groups and meetups.  Type in DevOps, PowerShell, or VMware and dozens of meetups pop up in your local area.  RSVP and see all the other local users also going.  Not sure what the meetup is about?  Post a quick question and receive an answer right back.  There's an update to a meeting?  Receive an email notification immediately.  I see it as a symbiotic relationship between a globally accepted meetup site and the user group.  We at the Melbourne VMware User Group have even started using it in conjunction with the traditional VMUG website to extend our community base.


This is how I found out about the recent PowerShell meetup I attended in Melbourne.  With all the scripting I've recently been doing in PowerCLI and PowerShell I wanted to expand my horizons a little further and find out how the wider PowerShell community works.  The group has only existed since the start of the year and this was their fourth meetup, held in the Seek offices.  The setting for the meetup was very casual and devoid of any advertising or marketing.  That is, if you can overlook the Seek logos all over the place.  But considering the worst Seek can actually do is find me a new job, I'm more than happy to tolerate this 🙂  Of course there was the obligatory beer and pizza, which we all know elevates a good meetup to an awesome meetup.

I found the format and atmosphere of this PowerShell meetup very appealing.  Heavy on practical content and demos and light on PowerPoint slides.  The setting around a large boardroom table with beer and pizza in hand also led to a more comfortable environment to engage with the community and presenters.  The meetup tended to have a slant towards DevOps practices using PowerShell rather than simply using PowerShell.  So less about how to connect to this server or use that cmdlet, and more around processes and integration.  I was also lucky enough to receive a copy of the book, Learn System Center Configuration Manager in a Month of Lunches, from its author James Bannan.

Due to work commitments of the organiser, the PowerShell meetup was pushed out a day, which turned out to conflict with an Azure meetup on the same night.  With so many IT meetup groups currently listed and running in Melbourne, there's bound to be a small culling, a kind of survival of the fittest.  So whether this PowerShell meetup group succeeds or not, only time will tell.  I certainly hope it does and that it continues to find that DevOps-centric content it aims for.

Until the next meetup…

Melbourne VMUG Meetup Group

Get-View | Show-Object

I was recently watching a PowerShell presentation where they mentioned a cool module called PowerShellCookbook, and in particular discussed a cmdlet in it called Show-Object by Lee Holmes.  I instantly knew how perfect and powerful it would be with VMware's PowerCLI Get-View.

Bear with me for a minute while I lay the groundwork with Get-View.  If you've ever used Get-View in PowerCLI you'll know that it brings back a ridiculous wealth of information.  When you run a cmdlet like Get-VMHost it's really only bringing back a small subset of information on that object.  Sometimes this is fine, but sometimes we need that little bit extra to reach our objective.

For example, you can run Get-VMHost esxi01.ukoticland.local


What you get is a default formatted table view displaying only a few key values.  A trick some of us use is to then pipe this out to a list.  Get-VMHost esxi01.ukoticland.local | Format-List


Heaps more information, right?  But it's still not the full picture.  There's still a lot of information on this object that we're missing.  Take the original cmdlet we ran above and this time let's pipe it to Get-View.  Let's also store it in a variable called $myHost, just so we can work with it.

$myHost = Get-VMHost esxi01.ukoticland.local | Get-View


Okay, on first glance it doesn't look like much.  But all those values that start with VMware.Vim are properties that can be drilled down into.  For example, $myHost.Config and $myHost.Config.Capabilities


So it's pretty cool, right?  We can now start retrieving a huge amount of new information that wasn't available to us before.  But this is like finding a needle in a haystack.  I know I've wasted so much time typing $something dot something dot something in the hope of finding a value I can work with.

Well, finally this brings us to Show-Object.  This is an awesome cmdlet that will let you display the object retrieved with Get-View in a grid view window that you can navigate through, similar to a directory in File Explorer.  Using it is as simple as piping our variable to Show-Object.

$myHost | Show-Object


Now we can explore and click around at everything available to us.  As you navigate the object in the top results pane you'll get member data in the bottom pane.  I see this becoming a great reference tool to help find what you're looking for.  Not only that, but it will give you the syntax to retrieve the information selected in the view pane.

So how do you get Show-Object?  Well, it's not in PowerShell by default but can easily be obtained from the PowerShell Gallery, which, if new to you, is basically a public repository for PowerShell content.  If you're using Windows 10 you're halfway there.  If not, go get yourself the Windows Management Framework (WMF) 5.  This will give you the latest version of the PowerShellGet module.  Then it's just a matter of typing Install-Module -Name PowerShellCookbook.

Once the module is installed from the PowerShell Gallery, Show-Object is available to use.  It's worth noting that PowerShellCookbook comes with a huge array of extra cmdlets also worth exploring.
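If you're curious what else came along for the ride, a quick way to list everything the module exposes is the standard Get-Command cmdlet.

Get-Command -Module PowerShellCookbook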

Finally, if you do try out Show-Object and like it, there's a "jacked up" version of it over at PoshCode by Justin Rich.


Using PowerCLI to speed up Question time

It seems like all my posts start off the same way, "I recently had an issue…", but it's true :)   I recently had an issue with snapshots consuming all of the space in a datastore.  The culprit was CommVault performing end-of-month full backups of VMs.  During the backup process it takes a snapshot of the VM prior to the full VMDK backup.

It was first identified with a VM having a pending question in the vCenter Client.


It was also quickly identified as having a snapshot by the xxx-000001.vmdk virtual disk name.  A check of the working snapshots of the VM showed that it was created by CommVault.  A check of the datastores the VM used showed that one of them was full.  It was also a good bet that any other VMs on this datastore with snapshots or thin provisioned disks would be having issues too.

The snapshots couldn't be deleted as there was no free space on the datastore to commit the transactions from the snapshot VMDK.  I could have grown the size of the datastore but that would have involved Change Requests with changes at the SAN and vSphere level.  Without panicking too much I identified a rather small working VM and migrated its storage to another datastore.  The goal here was to free up space on the datastore in preparation to answer the pending questions and get the VM running ASAP.

While the VM was being migrated to a different datastore it gave me a few minutes to identify which VMs had a pending question and would have been in a paused state.  Now there’s probably 17 different ways to approach this but I just went with a direct method.

Using PowerCLI, I connected up to the vCenter Server and then ran Get-VMQuestion.

Connect-VIServer my_vcenter_server.domain.local

Get-VMQuestion


This will by default return all VMs in the vCenter instance that have pending questions.  If you have a large vCenter with 1000+ VMs this will most likely take a while and you might want to target the command to specific datastores or ESXi servers.  In my situation I wanted to make sure I identified all pending questions and didn’t miss anything.


By this point my VM migration had just completed, so I could now answer all the pending questions and resume the VMs in the paused state.  This can be done by using the Set-VMQuestion cmdlet.

Get-VMQuestion | Set-VMQuestion -Option "Retry" -Confirm:$false

It doesn't really get simpler than the above command.  Identify pending questions with Get-VMQuestion.  Pipe it to Set-VMQuestion, answer all with 'Retry', and use the Confirm parameter so as not to be prompted to confirm the action.  Again, there are probably smarter ways to go about this.  You could use the Get-Datastore cmdlet to identify datastores with zero free space.  Then target those datastores with Get-VMQuestion.

Get-Datastore my_datastore | Get-VM | Get-VMQuestion
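Putting those two ideas together, a rough sketch of the "target only full datastores" approach might look like the below.  I'm assuming here the FreeSpaceGB property found on datastore objects in more recent PowerCLI releases; run the first line on its own to review the list before answering anything.

Get-Datastore | Where-Object { $_.FreeSpaceGB -lt 1 } | Get-VM | Get-VMQuestion

Get-Datastore | Where-Object { $_.FreeSpaceGB -lt 1 } | Get-VM | Get-VMQuestion | Set-VMQuestion -Option "Retry" -Confirm:$false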

The Get-VMQuestion / Set-VMQuestion pair is a great PowerCLI way of working smarter, not harder.  Whether answering 1 or 100 questions, it's all the same via PowerCLI.  I don't really know if there's an easy way to identify and answer pending questions through the vCenter Web or C# client.

On a side note, Set-VMQuestion isn't an overly intuitive cmdlet name for answering a question, so there is an alias to it called Answer-VMQuestion.  I guess it sticks with the PowerShell tradition of the Get / Set verbs.

Connecting to vCloud Director via PowerCLI

I'm currently stuck administrating a vCloud Director 1.5 environment.  I don't have any major concerns with vCD 1.5 but sometimes I do find the web portal a little awkward to navigate around.  VMware have done a great job in creating PowerCLI cmdlets that open up access into the vCD APIs.

You can obtain access to the cmdlets via two methods.  You can download PowerCLI from VMware; you'll need at least version 5.0.1.  Or you can download PowerCLI for Tenants.  This version contains only the vCD cmdlets and removes all the other vSphere cmdlets not relevant to vCD.

If you're connecting to vCD over the internet, the great thing is you don't need any extra or special ports opened to use PowerCLI.  Connection is done over HTTPS (port 443) using the domain name of your Cloud Service Provider's vCD portal.

You'll also need your ORG name within vCD.  To find out your ORG name, connect up to the vCD web portal.  Navigate to the Administration tab and select General under Settings in the left pane.


Open up PowerCLI.  Use the cmdlet Connect-CIServer to initiate a connection.

Connect-CIServer -Server vcd.example.com -Org MYORG

You should then be prompted for your vCD login credentials.


Once connected, you can start playing around with some of the more basic get commands.

To view your usage of CPU, Memory and Storage allocation you can use get-orgvdc.

get-orgvdc | ft -autosize


To list all your VMs it’s just a matter of get-civm

get-civm | ft  name, status, cpucount, memorygb -autosize


To list details of a specific VM you can try the following

get-civm -name MYSERVER | fl


The cmdlets won't give you the full feature set of the web portal.  Nevertheless, you'll find that you can speed up a lot of the daily administrative tasks you do.  It's also a great way of extracting out information for reporting.
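As a quick reporting example, the same VM listing from above can be pushed straight out to a CSV file (the path here is just an example).

get-civm | select name, status, cpucount, memorygb | Export-Csv -Path 'C:\temp\vcd-vms.csv' -NoTypeInformation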


VMware Connect-CIServer

vCloud Director PowerCLI Community

vCD cmdlet reference

Dude, Where’s my mail statistics?

One thing that really bugged me when Exchange 2007 was released was the removal of the ability to view mailbox size and total items in the GUI.  I can't figure out why?  Though I'm sure there is something written in 3pt text in an obscure TechNet article somewhere?!?!

In Exchange 2000/3 you would drill down to your Storage Group / Mailbox Store / Mailboxes.  In Exchange 2007/10 it just doesn't exist.  So how do you get this information?  PowerShell to the rescue… again.  Just another clear indicator that Microsoft wants us using PowerShell more and more.  Using Get-MailboxStatistics we can retrieve this basic information and so much more.


Running the Cmdlet by itself without any parameters will list DisplayName, ItemCount, and StorageLimitStatus for all mailboxes.

We can narrow it down to one user by using the following script.

Get-MailboxStatistics -Identity "test.user"

If we want to replicate something similar to what we got in Exchange 2000/3 we can use the following script.

Get-MailboxStatistics | ft DisplayName, LastLoggedOnUserAccount, TotalItemSize, ItemCount, LastLogonTime

Using the Format-Table cmdlet we can add a number of different columns in addition to the five above.  Below are columns we can display and filter on.


There are a few things we can now do to clean up our report.  TotalItemSize is returned to us in bytes, which is not very user friendly, so we can convert this to MB.  We can also sort our results by this column as well.

Get-MailboxStatistics | sort totalitemsize -Descending | ft DisplayName, LastLoggedOnUserAccount,@{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}}, ItemCount, LastLogonTime

So now we have something very similar to what the old Exchange System Manager would have provided us on the screen.  In those days we would simply right click on our Mailbox container and select Export.  We can easily achieve the same thing with PowerShell.  Many people just redirect this output to a text file, which is fine.  The better way is to pipe it to a CSV file.

Get-MailboxStatistics | sort totalitemsize -Descending | select-object DisplayName, LastLoggedOnUserAccount,@{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}}, ItemCount, LastLogonTime | Export-CSV -Path 'c:\temp\Mailboxstats.csv'

Updated: The above line has been modified to include select-object instead of format-table due to using the Export-CSV cmdlet.

We now have a nice comma-delimited output file, already sorted by mailbox size.

Get-MailboxStatistics is a rather simple and easy to use cmdlet.  Once you become comfortable using the cmdlet, you can have a number of pre-defined scripts ready to run just how you like.

If you have a lot of mailboxes and want to disregard mailboxes below a certain size you could filter on TotalItemSize.  The below example will only return mailboxes greater than 100MB.

Get-MailboxStatistics | where {$_.TotalItemSize -gt 100MB} | sort totalitemsize | select DisplayName,ItemCount,@{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}} | ft -AutoSize

At a later stage I’ll touch on how to get this report to run to a schedule and email the output to yourself or a manager.


Exchange 2003 Mailbox view 

Dynamic Distribution Groups – dynamically disturbing

If you haven't seen or used Dynamic Distribution Groups (DDG) in Exchange before, which rock have you been hiding under?  DDGs have been around since Exchange 2007, and their precursor, Query-based Distribution Groups, way back since Exchange 2000.  QBDGs never seemed to get too much attention.  With the introduction of Exchange 2007 and PowerShell they've become a lot easier to use and manage.

DDGs work by querying Active Directory for object attributes, for example Company Name or Department.  So the foundation of fully functioning DDGs is an actively updated and maintained AD.  As with everything Exchange, you can manage DDGs in either the Exchange Management Console or via PowerShell.  The EMC is great for quick and dirty work but PowerShell is where you can do some cool stuff with DDGs.

DDG has been on my mind recently so this was a great opportunity to not only touch on how to create and manage a DDG but also around some of the issues around them.

Creating a DDG via the console wizard is very easy and self-explanatory, so I won't delve too deep into it.

The steps to create one are as follows…

Open the EMC

Navigate to Recipient Configuration -> Distribution Groups

Using the Action Pane select New Dynamic Distribution Group

The Dynamic Distribution Wizard will run.  Enter a name for the new distribution group and click next.

Select mailbox types to filter by (default is all) and click next.

Create your conditions and click next.

The final screen gives you a summary detailing your options.  Clicking New will create the DDG.

The console wizard has a number of limitations, namely that you only have the option to query on three main attributes (State or Province, Department, and Company) plus a half dozen or so custom attributes (which I've never used).  For many organisations, though, this is more than sufficient for creating those generic company and department wide lists.  It's only when you start working with PowerShell that you can really take advantage of DDGs, but be mindful of how elaborate (or complicated) you get.

The PowerShell cmdlet to create a DDG is New-DynamicDistributionGroup.  So as an example, say you set the Company attribute under the Organisation tab within all AD user objects.  We would execute the following PowerShell command to create our new company-wide distribution list.

New-DynamicDistributionGroup -Name "My Mailing List" -RecipientFilter "(Company -eq 'My Company Name')" -RecipientContainer 'mydomain.local'

Fairly straightforward, right?  We specify in the recipient filter to use the Company attribute and make sure it equals our company name.  If it does, the user will be part of our new distribution list called "My Mailing List".

The important part to make note of here is the -RecipientContainer parameter.  By default, when you create a DDG in PowerShell it will only run its query against the default Users container.  By specifying mydomain.local we are telling the query to run from the root of our domain and include all sub OU containers.

Say you now want to modify your new DDG to query only users in a particular OU called Australia.

Set-DynamicDistributionGroup -Identity "My Mailing List" -RecipientContainer "OU=Australia,DC=mydomain,DC=local"

The command is similar to our initial one.  We're telling the cmdlet which DDG to modify, but this time we only need to supply the parameter we want to change.
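To double check what a DDG is currently set to after a change like this, you can inspect the filter properties on the object returned by Get-DynamicDistributionGroup.

Get-DynamicDistributionGroup -Identity "My Mailing List" | fl RecipientFilter, RecipientContainer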

To view who has been added to this DDG we would type the following.

$Users = Get-DynamicDistributionGroup -Identity "My Mailing List"

Get-Recipient -RecipientPreviewFilter $Users.RecipientFilter

What about something we can't do in the EMC with Dynamic Distribution Groups?  Okay, say we are setting the Manager attribute on user objects in our organisation to specify who their manager / team leader is within AD.  We could create a team mailing list that would add users automatically as they move between managers / teams.

New-DynamicDistributionGroup -name "Team Leader Mailing List" -recipientfilter "((Manager -eq 'cn=Jane Doe,ou=Employees,ou=Australia,dc=mydomain,dc=local') -or (Name -eq 'Jane Doe'))" -recipientcontainer 'mydomain.local'

There are two parts to this command.  First we query any users whose manager is Jane Doe, and the only way to do this is to specify our manager's full LDAP path.  The second part statically adds in the manager of the team, as they won't be managing themselves and would most likely have a different manager specified.

You can even add members of security groups to a DDG.  I'd only recommend this in very specific circumstances, only because you're basing a dynamic list off a static list.  You may have good reason to do this, however (e.g. you may have to add additional recipients but not want them part of the security group).

New-DynamicDistributionGroup -name "My Mailing List" -recipientfilter "((MemberOfGroup -eq 'My Security Group') -or (Name -eq 'Jane Doe') -or (Name -eq 'John Doe'))" -recipientContainer 'mydomain.local'

Don't get carried away with Dynamic Distribution Groups just because you can find a query to add anyone you want.  The above script is a perfect example.

Things to make note of…

Don't implement a Dynamic Distribution Group if it adds users that haven't requested to be part of a list.  These situations call for traditional distribution groups.  Forcing a DDG onto a team with superfluous users doesn't achieve anything.

Be careful of users hijacking onto the back of a Dynamic Distribution Group.  DDGs are based on AD attributes and, depending on your environment, users may be able to update certain attributes on their own AD object.  If a user is allowed to update their own Title and your DDGs are based off Titles, a user can move themselves in and out of distribution lists at will.

You will not be able to view DDGs through Outlook.  This is by design as you don’t want to overload AD with constant queries on distribution groups.  Keep this in mind when a user rings up asking why they are not part of a list.

Lastly, remember the -RecipientContainer parameter in your script.  I always forget this :)