
Streaming Datasets – PowerShell | PowerCLI | Power BI

A large part of my day is spent scripting in PowerShell, specifically with PowerCLI.  One of the strongest areas of PowerCLI, obviously, is retrieving information.  In my opinion, its ability to retrieve information for capacity planning and reporting is one of the key use cases for PowerCLI in a VMware environment.

Recently I’ve been looking at how to consume all that information.  You can obviously export it to a CSV, push it into a database, or, something that I’ve been playing around with recently, stream it into Power BI.  Now if you haven’t tried it out yet, Power BI is an analytics service from Microsoft.  At its core it’s a data warehouse for business intelligence.  But putting all those fancy words aside, I use it to create fancy reports.

Exporting information out of a vCenter environment with PowerCLI is dead simple.  I have dozens of scheduled tasks running all the time doing this.  Where I’ve fallen down, is taking that information and trending it over time.  This is where the Streaming Datasets functionality of Power BI comes in.  Using PowerCLI I can get an Object and Value from vCenter and then Post that directly into Power BI, using their API, and have it instantly graphed in a report.  I can then share that report out to anyone I want.  Power BI lets me do this over and over, almost as fast as I can pull the information out of vCenter.

In the example below I show how to create a trend report that displays the Total and Available Storage of a vCenter Cluster over time.  Rather simple, I know, but it can easily be adapted to show things like the number of running VMs, reserved resources used, etc.  The sky’s the limit really.

Before we do any scripting, the first thing we do is log into Power BI.  If you don’t have an account, don’t worry, the basic version is free.  Hit the Sign Up link and make sure you select Power BI and not Power BI Desktop for Windows; we want the cloud version.

Once logged in we click on Streaming Datasets in the bottom left under the Datasets category.  This is where we create our initial dataset schema so that it can accept streaming input.  We click on ‘Add streaming dataset’ in the top right.

Then select the source of data, which will be API and click next.

We give our New Streaming Dataset a name and define a few values.  In this example we will define a Date, Total Storage, and Available Storage value, turn on Historic Data Analysis and click Create.  Make note of your data type to the right of the value.  Date is DateTime and the other two are Numbers.

We’ve now created our schema and are provided with a Push URL address and sample code in a few different formats (we want PowerShell).  If you look carefully you can see the sample uses Invoke-RestMethod to POST to Power BI.  This sample code has just provided us with the template for the hardest part of our PowerShell / PowerCLI script.  Click over the code and copy / pasta it out to use in our script (paste it at the bottom of the script as it will be the last thing that runs).

Now we actually start on the PowerShell / PowerCLI script.  To keep it as concise as possible, I’ve skipped the process I use to actually connect to the vCenter and retrieve the information with PowerCLI in the code below.  The real goal here is just to retrieve some values and get them into Power BI.  The Get-Cluster line basically retrieves all shared VMFS datastores in Cluster1.  The important lines to note, though, are where I store my key values in three variables: one for Date, one for TotalStorage, and one for AvailableStorage.

Import-Module VMware.VimAutomation.Core
Connect-VIServer -Server host.mydomain.local

$date = Get-Date

$datastore = Get-Cluster -Name Cluster1 | Get-Datastore | Where-Object {$_.Type -eq 'VMFS' -and $_.Extensiondata.Summary.MultipleHostAccess}

$TotalStorage = ($datastore | Measure-Object -Property CapacityMB -Sum).Sum / 1024
$AvailableStorage = ($datastore | Measure-Object -Property FreeSpaceMB -Sum).Sum / 1024 

The additional lines below, following the storage variables, are the important code.  This is our pasted sample code from Power BI that we will slightly modify to push our values up to Power BI.  Don’t copy mine, as your URL and key will be different.  In the payload block we remove the example values and replace them with our three variables, $Date, $TotalStorage, and $AvailableStorage.

Import-Module VMware.VimAutomation.Core
Connect-VIServer -Server 10.1.1.201 -user "mydomain\username"

# Timestamp for this sample
$date = Get-Date

# All shared VMFS datastores in Cluster1
$datastore = Get-Cluster -Name Cluster1 | Get-Datastore | Where-Object {$_.Type -eq 'VMFS' -and $_.Extensiondata.Summary.MultipleHostAccess}

# Sum capacity and free space across those datastores, converting MB to GB
$TotalStorage = ($datastore | Measure-Object -Property CapacityMB -Sum).Sum / 1024
$AvailableStorage = ($datastore | Measure-Object -Property FreeSpaceMB -Sum).Sum / 1024

# Pasted sample code from Power BI, with the example values replaced by our variables
$endpoint = "https://api.powerbi.com/beta/83fe1fa2-fa52-4376-b7f0-cb645a5fcfced/datasets/d57970bc-60b3-46e6-b23b-d782431a72be/rows?key=2zEhgN9mu%2BEH%2FI2Cbk9hd2Kw4b5c84YaO6W8gzFcZbBnO6rti3N631Gjw%2FveNXSBxwR84VcWPGOSrheNwQnCbw%3D%3D"
$payload = @{
"Date" = $Date
"Total Storage" = $TotalStorage
"Available Storage" = $AvailableStorage
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))

# Close the vCenter session
Disconnect-VIServer * -Confirm:$false

On the last line I disconnect from my vCenter and close any sessions.  This helps if running as a scheduled task.  Finally, save the script.

And that’s it for the scripting part.  Assuming everything is correct, with no connection issues and the correct values being retrieved, all we have to do is run the script and it will send a POST request using Invoke-RestMethod with our three values.  We can now run this script as many times as we want and it will continue to post the current date and time along with Total Storage and Available Storage.  At this point, if we wish, we can turn the script into a scheduled task (see the sketch below) or just continue to run it manually to suit our needs.
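If you do go the scheduled task route, the registration itself can be done in PowerShell.  Below is a minimal sketch using the built-in ScheduledTasks module (Windows 8 / Server 2012 R2 or later); the script path, task name, and 30 minute interval are all placeholder values, and older OS versions may also want a -RepetitionDuration on the trigger.

# Register the collection script to run every 30 minutes (names and paths are examples)
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Push-ClusterStorage.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 30)
Register-ScheduledTask -TaskName 'Power BI Cluster Storage' -Action $action -Trigger $trigger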

We now go back to Power BI and report on what we have.  Back on our Streaming Datasets browser window we click the Create Report icon under Actions.  Now this part is going to be very subjective to the end user who wants the report, but the key data we want is under RealTimeData on the far right.  Select all three values and we get presented with a default view of our data.  Under Visualizations select line chart and we start to see a more visual representation of our capacity over time.  Under the Analytics section add a trend line for a basic view of available capacity over time.  Finally hit save and you have a self-updating report from streaming data.

For the report to start to look anything like below it will take time and a few sample datasets.  In the below image I’ve mocked up some numbers over time as an example.

Once you have a working script and it’s streaming data to Power BI, it’s really up to you how you report on it.  The above example, as simple as it is, lays the groundwork for more customized and complex reporting that you might not be able to get out of traditional monitoring and reporting software.  You even have the ability to share the report out.

Streaming datasets, as you might have noticed in the URL, are still in beta.  As great as I have found them to be, they do have some quirks.  For one, you can’t easily modify data you have already streamed up to Power BI.  So if you send incorrect data / values up to Power BI in a streaming dataset they will remain there.  At that point you will have to consider filters to exclude them in reports.

In summary I think Power BI is a very underrated free tool from Microsoft.  I’ve only just started to scratch the surface of what’s possible with it.  The simplicity of capturing data with PowerShell and sending it to Power BI is well worth the time and effort to try at least once.  So what are you waiting for?

PowerShell on Linux

The big news out of Microsoft last month making headlines is the open sourcing of PowerShell.  Along with this comes the ability to now run PowerShell not just in Windows but also Linux and Mac OS X.  For people close to the PowerShell community this wasn’t unexpected, but make no mistake this is huge news.

I’m really liking this new Microsoft.  They are really embracing this open source stuff.  On first thought it’s not obvious how Microsoft will make money with PowerShell going open source.  But Microsoft isn’t stupid; this is no doubt part of a larger master plan.  With PowerShell so tightly linked to their products they are opening the door to a whole new demographic of users.  I can see PowerShell going open source being a key to getting a new mix of Linux developers working in Azure.  Something close to my heart: VMware have also announced plans to port PowerCLI over to work with PowerShell on Linux.  As a PowerCLI tragic myself I’ve seen first hand how frustrated Mac users have been that they can’t manage their VMware infrastructure using PowerShell / PowerCLI directly from a Mac.

Microsoft have made it clear that this is the very early stage of an Alpha release on GitHub.  They are looking for community help to further develop and refine PowerShell on Linux.  There’s a large number of bug fixes, growing by the day, that they need to work through before we get anywhere close to a production release.

I decided to try it out myself and I’m impressed; the future looks awesome.  Apart from Windows, the open source version is currently limited to Ubuntu 14.04 / 16.04, CentOS 7, and Mac OS X 10.11.

I had an Ubuntu 14.04 Linux VM that I used for testing.  The first thing to do is download the appropriate package over at GitHub. https://github.com/PowerShell/PowerShell

Once downloaded, and depending on what OS you’re running, you may need to install a few additional libraries first.  In my case it was libunwind8 and libicu52 using apt-get.  After that I was able to install the PowerShell Debian package.

mukotic@ubuntu:~/Downloads$ sudo apt-get install libunwind8 libicu52
mukotic@ubuntu:~/Downloads$ sudo dpkg -i powershell_6.0.0-alpha.9-1ubuntu1.14.04.1_amd64.deb

Believe it or not, that’s all that is required.  Whatever your shell of choice is, just type ‘powershell’.

mukotic@ubuntu:~/Downloads$ powershell
PowerShell 
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS /home/mukotic/Downloads> 

So what can we do?  Well, it’s still early days.  The first thing I did was just check the version.  I can see we’re running the .NET Core release of PowerShell, which comes with Nano Server.

PS /home/mukotic/Downloads> $psversiontable

Name                      Value
----                      -----
PSVersion                 6.0.0-alpha
PSEdition                 Core
PSCompatibleVersions      {1.0, 2.0, 3.0, 4.0...}
BuildVersion              3.0.0.0
GitCommitId               v6.0.0-alpha.9
CLRVersion
WSManStackVersion         3.0
PSRemotingProtocolVersion 2.3
SerializationVersion      1.1.0.1

Looking at what’s available to us it’s still limited to a handful of modules.

PS /home/mukotic/Downloads> Get-Module -ListAvailable


    Directory: /opt/microsoft/powershell/6.0.0-alpha.9/Modules


ModuleType Version Name                            ExportedCommands
---------- ------- ----                            ----------------
Manifest   1.0.1.0 Microsoft.PowerShell.Archive    {Compress-Archive, Expand-Archive}
Manifest   3.0.0.0 Microsoft.PowerShell.Host       {Start-Transcript, Stop-Transcript}
Manifest   3.1.0.0 Microsoft.PowerShell.Management {Add-Content, Clear-Content, Clear-ItemProperty, Join-Path...}
Manifest   3.0.0.0 Microsoft.PowerShell.Security   {Get-Credential, Get-ExecutionPolicy, Set-ExecutionPolicy, ConvertFrom-SecureString...
Manifest   3.1.0.0 Microsoft.PowerShell.Utility    {Format-List, Format-Custom, Format-Table, Format-Wide...}
Binary     1.0.0.1 PackageManagement               {Find-Package, Get-Package, Get-PackageProvider, Get-PackageSource...}
Script     3.3.9   Pester                          {Describe, Context, It, Should...}
Script     1.0.0.1 PowerShellGet                   {Install-Module, Find-Module, Save-Module, Update-Module...}
Script     0.0     PSDesiredStateConfiguration     {StrongConnect, IsHiddenResource, Write-MetaConfigFile, Get-InnerMostErrorRecord...}
Script     1.2     PSReadLine                      {Get-PSReadlineKeyHandler, Set-PSReadlineKeyHandler, Remove-PSReadlineKeyHandler, G...

So those traditional Windows cmdlets will now work against the local Linux box.  Things like Get-Process will return the local running Linux processes.

PS /home/mukotic/Downloads> Get-Process


Handles NPM(K) PM(K) WS(K) CPU(s)   Id  SI ProcessName
------- ------ ----- ----- ------   --  -- -----------
      0      0     0     0  0.400 1331 549 accounts-daemon
      0      0     0     0  0.350 1111 111 acpid
      0      0     0     0  0.000 2248 205 at-spi-bus-laun
      0      0     0     0  0.040 2264 205 at-spi2-registr
      0      0     0     0  0.000  147   0 ata_sff

Another thing that’s also worth checking out is Visual Studio Code.  This is another great open source project Microsoft has going.  If you’ve used the PowerShell ISE in Windows, think of a streamlined version of that, just more powerful thanks to extensions.  Head over to https://code.visualstudio.com/docs/setup/linux and download the package.

Installation was also super simple.

PS /home/mukotic/Downloads> sudo dpkg -i code_1.4.0-1470329130_amd64.deb

Then run it by typing ‘code’.

PS /home/mukotic/Downloads> code


I recommend getting the PowerShell extension right off the bat.  Click the Extensions icon on the left, search for PowerShell, and click Install.
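If you prefer the terminal, VS Code can also install extensions from the command line.  A minimal sketch; the extension identifier is my assumption, so check it against what the marketplace search shows you:

# Install the PowerShell extension from the shell (extension ID may vary)
code --install-extension ms-vscode.PowerShell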


Now we have all the wonders of IntelliSense that we are used to in the Windows PowerShell ISE.  I really see Visual Studio Code becoming a future replacement for the Windows PowerShell ISE, which, while still being maintained, has been quite stagnant in recent years.

So there you have it.  Jeffrey Snover, a Technical Fellow in the Microsoft Enterprise Cloud Group, has a great post and video discussing PowerShell going open source that should be checked out.

https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

The next thing I’m hanging out for is PowerCLI on Linux.  A demo is shown in a video in the above link running inside a Docker container.  Expect to soon see a VMware Fling release for us to try out.

Meetups, PowerShell, Expanding My Horizons


I’m not sure what it’s like in other major cities around the world, but currently Melbourne is going through an IT meetup boom.  On any given week you can find at least one, if not multiple, meetups going on somewhere in Melbourne.  A big change from years past, where we would have only a couple of major conferences a year to look forward to.  It’s really quite an exciting period for meetups that we’re going through.

So what is going on with all these meetups?  (Meetup being the new buzzword we’re seeing slowly replace the traditional User Group we’re all probably used to.)  I think it’s in no small part to do with the website meetup.com.  Sure, many of these User Groups existed well before meetup.com became a thing.  But to find them you had to be part of the right Facebook group, follow the right Twitter user, or just learn of them through word of mouth.  Before meetup.com I lost count of how many User Group meetings I missed by only learning about them the next day.

We now have a common place we can visit to find all these User Groups and meetups.  Type in DevOps, PowerShell, or VMware and dozens of meetups pop up in your local area.  RSVP and see all the other local users also going.  Not sure what the meetup is about?  Post a quick question and receive an answer right back.  There’s an update to a meeting?  Receive an email notification immediately.  I see it as a symbiotic relationship between a globally accepted meetup site and the user group.  We at the Melbourne VMware User Group have even started using it in conjunction with the traditional VMUG website to extend our community base.


This is how I found out about the recent PowerShell meetup I attended in Melbourne.  With all the scripting I’ve recently been doing in PowerCLI and PowerShell I wanted to expand my horizons a little further and find out how the wider PowerShell community works.  The group has only existed since the start of the year and this was their fourth meetup, held in the seek.com.au offices.  The setting for the meetup was very casual and devoid of any advertising or marketing.  That is, if you can overlook the office seek logos all over the place.  But considering the worst seek can actually do is find me a new job, I’m more than happy to tolerate this 🙂   Of course there was the obligatory beer and pizza which we all know elevates a good meetup to an awesome meetup.

I found the format and atmosphere of this PowerShell meetup very appealing.  Heavy on practical content and demos and light on PowerPoint slides.  The setting around a large boardroom table with beer and pizza in hand also led to a more comfortable environment to engage with the community and presenters.  The meetup tended to have a slant towards DevOps practices using PowerShell rather than just using PowerShell.  So less about how to connect to this server or use that cmdlet, and more around processes and integration.  I was also lucky enough to receive a copy of the book, Learn System Center Configuration Manager in a Month of Lunches, from its author James Bannan.

Due to work commitments of the organiser, the PowerShell meetup was pushed out a day, which turned out to conflict with an Azure meetup on the same night.  With so many IT meetup groups currently listed and running in Melbourne, there’s bound to be a small culling, a kind of survival of the fittest.  So whether this PowerShell meetup group succeeds or not, only time will tell.  I certainly hope it does and that it continues to find that DevOps centric content it aims for.

Until the next meetup…

Melbourne VMUG Meetup Group

Get-View | Show-Object

I was recently watching a PowerShell presentation where they mentioned a cool module called PowerShellCookbook, and in particular discussed a cmdlet in it called Show-Object by Lee Holmes.  I instantly knew how perfect and powerful it would be with VMware’s PowerCLI Get-View.

Bear with me for a minute while I lay the groundwork with Get-View.  If you’ve ever used Get-View in PowerCLI you’ll know that it brings back a ridiculous wealth of information.  When you run a cmdlet like Get-VMHost it’s really only bringing back a small subset of information on that object.  Sometimes this is fine, but sometimes we need that little bit extra to reach our objective.

For example you can run Get-VMHost esxi01.ukoticland.local


What you get is a default formatted table view displaying only a few key values.  A trick some of us use is to then pipe this out to a list.  Get-VMHost esxi01.ukoticland.local | Format-List


Heaps more information, right?  But it’s still not the full picture.  There’s still a lot of information on this object that we’re missing.  Take the original cmdlet we ran above, and this time let’s pipe it to Get-View.  Let’s also store it in a variable called $myHost, just so we can work with it.

$myHost = Get-VMHost esxi01.ukoticland.local | Get-View


Okay, on first glance it doesn’t look like much.  But all those values that start with VMware.Vim are properties that can be drilled down into.  For example $myHost.Config and $myHost.Config.Capabilities
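To give a feel for the drill-down, here are a couple of examples.  The property paths are illustrative and the values will vary with your host and vSphere version:

# Drilling into the view object (output depends on your environment)
$myHost.Config.Product.FullName      # the full ESXi product name and build
$myHost.Config.Capabilities          # a long list of host capability flags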


So it’s pretty cool, right?  We can now start retrieving a huge amount of new information that wasn’t available to us before.  But this can be like finding a needle in a haystack.  I know I’ve wasted so much time typing $something dot something dot something in the hope of finding a value I can work with.

Well, finally this brings us to Show-Object.  This is an awesome cmdlet that will let you display the object retrieved with Get-View in a grid view window that you can navigate through, similar to a directory in File Explorer.  Using it is as simple as piping our variable to Show-Object.

$myHost | Show-Object


Now we can explore and click around at everything available to us.  As you navigate the object in the top results pane you’ll get member data in the bottom pane.  I see this becoming a great reference tool to help find what you’re looking for.  Not only that, but it will give you the syntax to retrieve the information selected in the view pane.

So how do you get Show-Object?  Well, it’s not in PowerShell by default but can easily be obtained from the PowerShell Gallery, which, if new to you, is basically a public repository for PowerShell content.  If you’re using Windows 10 you’re halfway there.  If not, go get yourself the Windows Management Framework (WMF) 5.  This will give you the latest version of the PowerShellGet module.  Then it’s just a matter of typing Install-Module -Name PowerShellCookbook, as shown below.
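Putting it together, the install and first use look something like this (PowerShellGet may prompt you to trust the PSGallery repository the first time):

# One-time install from the PowerShell Gallery
Install-Module -Name PowerShellCookbook
# Then pipe any Get-View object straight into Show-Object
Get-VMHost esxi01.ukoticland.local | Get-View | Show-Object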

Once the module is installed from the PowerShell Gallery, Show-Object is now available to use.  It’s worth noting that PowerShellCookbook comes with a huge array of extra cmdlets also worth exploring.

Finally, if you do try out Show-Object and like it, there’s a “jacked up” version of it over at PoshCode by Justin Rich.

 

Using PowerCLI to speed up Question time

It seems like all my posts start off the same way, “I recently had an issue…”, but it’s true 🙂   I recently had an issue with snapshots consuming all of the space in a datastore.  The culprit was CommVault performing end of month full backups of VMs.  During the backup process it takes a snapshot of the VM prior to the full VMDK backup.

It was first identified with a VM having a pending question in the vCenter Client.


It was also quickly identified as having a snapshot by the xxx-000001.vmdk virtual disk name.  A check of the working snapshots of the VM showed that it was created by CommVault.  A check of the datastores the VM used showed that one of them was full.  It was also a good bet that any other VMs on this datastore with snapshots or thin provisioned disks would be having issues too.

The snapshots couldn’t be deleted as there was no free space on the datastore to commit the transactions from the snapshot VMDK.  I could have grown the size of the datastore but that would have involved Change Requests with changes at the SAN and vSphere level.  Without panicking too much I identified a rather small working VM and migrated its storage to another datastore.  The goal here was to free up space on the datastore in preparation to answer the pending questions and get the VM running ASAP.
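For reference, the storage migration itself is a one-liner in PowerCLI.  A quick sketch; the VM and datastore names here are hypothetical:

# Migrate a small VM's storage to another datastore to free up space (names are examples)
Get-VM -Name smallvm01 | Move-VM -Datastore (Get-Datastore -Name another_datastore)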

While the VM was being migrated to a different datastore it gave me a few minutes to identify which VMs had a pending question and would have been in a paused state.  Now there’s probably 17 different ways to approach this but I just went with a direct method.

Using PowerCLI I connected up to the vCenter Server and then ran Get-VMQuestion.

Connect-VIServer my_vcenter_server.domain.local

Get-VMQuestion

This will by default return all VMs in the vCenter instance that have pending questions.  If you have a large vCenter with 1000+ VMs this will most likely take a while, and you might want to target the command at specific datastores or ESXi servers, as shown below.  In my situation I wanted to make sure I identified all pending questions and didn’t miss anything.
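For example, to scope the query to just the VMs on one ESXi host, something like the following should do it (the host name is a placeholder):

# Check for pending questions only on VMs running on a specific host
Get-VMQuestion -VM (Get-VMHost -Name esxi01.mydomain.local | Get-VM)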


By this point my VM migration had just completed, so I could now answer all the pending questions and resume the VMs in the paused state.  This can be done using Set-VMQuestion.

Get-VMQuestion | Set-VMQuestion -Option "Retry" -Confirm:$false

It doesn’t really get simpler than the above command.  Identify pending questions with Get-VMQuestion.  Pipe it to Set-VMQuestion, answer all with ‘Retry’, and use the Confirm parameter so as not to get prompted to confirm each action.  Again, there are probably smarter ways to go about this.  You could use the Get-Datastore cmdlet to identify datastores with zero free space.  Then target those datastores with Get-VMQuestion.

Get-Datastore my_datastore | Get-VM | Get-VMQuestion
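Taking that a step further, a rough sketch of the free space approach might look like this; the FreeSpaceGB property assumes a recent PowerCLI version (older releases expose FreeSpaceMB instead):

# Only query VMs sitting on datastores with less than 1 GB free
Get-Datastore | Where-Object {$_.FreeSpaceGB -lt 1} | Get-VM | Get-VMQuestion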

Get-VMQuestion / Set-VMQuestion are a great PowerCLI way of working smarter, not harder.  Whether answering 1 or 100 questions, it’s all the same via PowerCLI.  I don’t really know if there’s an easy way to identify and answer pending questions through the vCenter Web or C# client???

On a side note, Set-VMQuestion isn’t overly intuitive as a cmdlet name for answering a question, so there is an alias to it called Answer-VMQuestion.  I guess it sticks with the PowerShell tradition of Get / Set verbs.

Connecting to vCloud Director via PowerCLI

I’m currently stuck administrating a vCloud Director 1.5 environment.  I don’t have any major concerns with vCD 1.5 but sometimes I do find the web portal a little awkward to navigate around.  VMware have done a great job in creating PowerCLI cmdlets that open up access into the vCD APIs.

You can obtain access to the cmdlets via two methods.  You can download PowerCLI from VMware.  You’ll need at least version 5.0.1.  Or you can download PowerCLI for Tenants.  This version contains only the vCD cmdlets and removes all the other vSphere cmdlets not relevant to vCD.

If you’re connecting to vCD over the internet the great thing is you don’t need any extra or special ports opened to use PowerCLI.  Connection is done over HTTPS (Port 443) using the domain name of your Cloud Service Provider’s vCD portal.

You’ll also need your ORG name within vCD.  To find out your ORG name connect up to the vCD web portal.  Navigate to the Administration tab and select General under Settings in the left pane.


Open up PowerCLI.  Use the cmdlet Connect-CIServer to initiate a connection.

Connect-CIServer -Server portal.vpdc.domain.com -org MYORG

You should then be prompted for your vCD login credentials.


Once connected you can start playing around with some of the more basic get commands.

To view your usage of CPU, Memory and Storage allocation you can use get-orgvdc.

get-orgvdc | ft -autosize


To list all your VMs it’s just a matter of get-civm

get-civm | ft  name, status, cpucount, memorygb -autosize


To list details of a specific VM you can try the following

get-civm -name MYSERVER | fl


The cmdlets won’t give you the full feature set of the web portal.  Nevertheless, you’ll find that you can speed up a lot of the daily administrative tasks you do.  It’s also a great way of extracting information for reporting.
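As a quick reporting example, the same get commands pipe straight out to CSV.  A minimal sketch, with a placeholder output path:

# Dump a simple VM inventory out of vCD for reporting
get-civm | select Name, Status, CpuCount, MemoryGB | Export-Csv -Path 'c:\temp\civms.csv' -NoTypeInformation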

References

VMware Connect-CIServer

vCloud Director PowerCLI Community

vCD cmdlet reference

Dude, Where’s my mail statistics?

One thing that really bugged me when Exchange 2007 was released was the removal of the ability to view mailbox size and total items in the GUI.  I can’t figure out why.  Though I’m sure there is something written in 3pt text in an obscure TechNet article somewhere?!?!

In Exchange 2000/3 you would drill down to your Storage Group / Mailbox Store / Mailboxes.  In Exchange 2007/10 it just doesn’t exist.  So how do you get this information?  PowerShell to the rescue… again.  Just another clear indicator that Microsoft wants us using PowerShell more and more.  Using Get-MailboxStatistics we can retrieve this basic information and so much more.

Get-MailboxStatistics

Running the cmdlet by itself without any parameters will list DisplayName, ItemCount, and StorageLimitStatus for all mailboxes.

We can narrow it down to one user by using the following script.

Get-MailboxStatistics -Identity "test.user"

If we want to replicate something similar to what we got in Exchange 2000/3 we can use the following script

Get-MailboxStatistics | ft DisplayName, LastLoggedOnUserAccount, TotalItemSize, ItemCount, LastLogonTime

Using the Format-Table cmdlet we can add a number of different columns in addition to the five above.  Below are columns we can display and filter on.

AssociatedItemCount
Database
DatabaseName
DeletedItemCount
DisconnectDate
DisplayName
Identity
ItemCount
LastLoggedOnUserAccount
LastLogoffTime
LastLogonTime
LegacyDN
MailboxGuid
ServerName
StorageGroupName
StorageLimitStatus
TotalDeletedItemSize
TotalItemSize

There are a few things we can now do to clean up our report.  TotalItemSize is returned to us in bytes, which is not very user friendly, so we can convert it to MB.  We can also sort our results by this column.

Get-MailboxStatistics | sort totalitemsize -Descending | ft DisplayName, LastLoggedOnUserAccount, @{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}}, ItemCount, LastLogonTime

So now we have something very similar to what the old Exchange System Manager would have provided us on the screen.  In those days we would simply right click on our Mailbox container and select Export.  We can easily achieve the same thing with PowerShell.  Many people just redirect this output to a text file, which is fine.  The better way is to pipe it to a CSV file.

Get-MailboxStatistics | sort totalitemsize -Descending | select-object DisplayName, LastLoggedOnUserAccount, @{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}}, ItemCount, LastLogonTime | Export-CSV -Path 'c:\temp\Mailboxstats.csv'

Updated: The above line has been modified to include select-object instead of format-table due to using the Export-CSV cmdlet.

We now have a nice comma delimited output file already sorted by mailbox size.

Get-MailboxStatistics is a rather simple and easy to use cmdlet.  Once you become comfortable using the cmdlet, you can have a number of pre-defined scripts ready to run just how you like.
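One way to keep those pre-defined scripts handy is to wrap them in a function in your profile.  A minimal sketch built from the commands above; the function name is just my own:

# Hypothetical wrapper so the size report is one short command away
function Get-MailboxSizeReport {
 Get-MailboxStatistics | sort totalitemsize -Descending |
 select DisplayName, ItemCount, @{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}}
}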

If you have a lot of mailboxes and want to disregard mailboxes below a certain size you could filter on TotalItemSize.  The below example will only return mailboxes greater than 100MB.

Get-MailboxStatistics | where {$_.TotalItemSize -gt 100MB} | sort totalitemsize | select DisplayName, ItemCount, @{n="Total Size (MB)";e={$_.totalitemsize.value.toMB()}} | ft -AutoSize

At a later stage I’ll touch on how to get this report to run to a schedule and email the output to yourself or a manager.
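In the meantime, as a rough idea, emailing the CSV can be as simple as Send-MailMessage (PowerShell 2.0 or later); the addresses and SMTP server below are placeholders:

# Email the previously exported report (all values here are examples)
Send-MailMessage -To 'manager@mydomain.local' -From 'reports@mydomain.local' -Subject 'Mailbox Statistics' -Attachments 'c:\temp\Mailboxstats.csv' -SmtpServer 'smtp.mydomain.local'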


Dynamic Distribution Groups – dynamically disturbing

If you haven’t seen or used Dynamic Distribution Groups (DDGs) in Exchange before, which rock have you been hiding under?  DDGs have been around since Exchange 2007, and their precursor, Query-based Distribution Groups, way back since Exchange 2000.  QBDGs never seemed to get too much attention.  With the introduction of Exchange 2007 and PowerShell they’ve become a lot easier to use and manage.

DDGs work by querying Active Directory for object attributes, for example Company Name or Department.  So the foundation of fully functioning DDGs is an actively updated and maintained AD.  As with everything Exchange, you can manage DDGs in either the Exchange Management Console or via PowerShell.  The EMC is great for quick and dirty work but PowerShell is where you can do some cool stuff with DDGs.

DDGs have been on my mind recently, so this was a great opportunity to touch not only on how to create and manage a DDG but also on some of the issues around them.

Creating a DDG via the console wizard is very easy and self-explanatory, so I won’t delve too deep into it.

Steps to create one are as follows…

Open the EMC

Navigate to Recipient Configuration -> Distribution Groups

Using the Action pane select New Dynamic Distribution Group

The Dynamic Distribution Wizard will run.  Enter a name for the new distribution group and click next.

Select mailbox types to filter by (default is all) and click next.

Create your conditions and click next.

The final screen gives you a summary detailing your options.  Clicking New will create the DDG.

The console wizard has a number of limitations, namely you only have the option to query off three main attributes (State or Province, Department, and Company) plus a half dozen or so custom attributes (which I’ve never used).  For many organisations though this is more than sufficient for creating those generic company and department wide lists.  It’s only when you start working with PowerShell that you can really take advantage of DDGs, but be mindful of how elaborate (or complicated) you get.

The PowerShell command to create a DDG is New-DynamicDistributionGroup.  So as an example, say you set the Company attribute under the Organisation tab within all AD User objects.  We would execute the following PowerShell script to create our new company wide Distribution List.

New-DynamicDistributionGroup -Name "My Mailing List" -RecipientFilter "(Company -eq 'My Company Name')" -RecipientContainer 'mydomain.local'

Fairly straightforward, right?  We specify in the recipient filter to use the Company attribute and make sure it equals our company name.  If it does, the user will be part of our new distribution list called “My Mailing List”.

The important part to make note of here is the -RecipientContainer parameter.  By default, when you create a DDG in PowerShell it will only run its query against the default Users container.  By specifying mydomain.local we are telling the query to run from the root of our domain and include all sub OU containers.

Say you want to now modify your new DDG to query only users in a particular OU called Australia.

Set-DynamicDistributionGroup -Identity "My Mailing List" -RecipientContainer "OU=Australia,DC=mydomain,DC=local"

The command is similar to our initial one.  We’re telling it which DDG to modify, but this time we only need to supply the parameter we want to change.

To view who has been added to this DDG we would type the following

$Users = Get-DynamicDistributionGroup -Identity "My Mailing List"

Get-Recipient -RecipientPreviewFilter $Users.RecipientFilter

What about something we can’t do in the EMC with Dynamic Distribution Groups?  Okay, say we are setting the Manager attribute on User objects to specify who their manager / team leader is within AD.  We could create a team mailing list that adds users automatically as they move between managers / teams.

New-DynamicDistributionGroup -Name "Team Leader Mailing List" -RecipientFilter "((Manager -eq 'cn=Jane Doe,ou=Employees,ou=Australia,dc=mydomain,dc=local') -or (Name -eq 'Jane Doe'))" -RecipientContainer 'mydomain.local'

There are two parts to this command.  First we query any users whose manager is Jane Doe, and the only way to do this is to specify our manager’s full LDAP path.  The second part statically adds in the manager of the team, as they won’t be managing themselves and would most likely have a different manager specified.

You can even add members of security groups to a DDG.  I’d only recommend this in very specific circumstances, only because you’re basing a dynamic list off a static list.  You may have good reason to do this however (e.g. you may have to add additional recipients but not want them part of the security group).

New-DynamicDistributionGroup -Name "My Mailing List" -RecipientFilter "((MemberOfGroup -eq 'My Security Group') -or (Name -eq 'Jane Doe') -or (Name -eq 'John Doe'))" -RecipientContainer 'mydomain.local'

Don’t get carried away with Dynamic Distribution Groups just because you can manage to find a query to add anyone you want.  The above script is a perfect example.

Things to make note of…

Don’t implement a Dynamic Distribution Group if it adds users that haven’t requested to be part of a list.  These situations call for traditional distribution groups.  Forcing a DDG onto a team with superfluous users doesn’t achieve anything.

Be careful of users hijacking their way onto the back of a Dynamic Distribution Group.  DDGs are based on AD attributes, and depending on your environment, users may be able to update certain attributes on their AD object.  If a user is allowed to update their own Title and your DDGs are based off Titles, a user can move themselves in and out of distribution lists at will.

You will not be able to view DDG membership through Outlook.  This is by design, as you don’t want to overload AD with constant queries on distribution groups.  Keep this in mind when a user rings up asking why they are not part of a list.

Lastly, remember the -recipientcontainer parameter in your script.  I always forget this 🙂

EMC Message Tracking vs PowerShell Message Tracking

Okay, so technically it’s all PowerShell, right?!?  Well yes and no… more yes.  Sure, when you use the EMC Message Tracking tool, it creates the PowerShell command at the bottom of the window for you to copy and modify as you see fit.  For simple reports for one user on one Exchange server this is fine.  As soon as you need to run more complex reports for multiple users and servers you’ll quickly see that doing it directly from PowerShell is much more effective.

Recently I was asked to run a number of reports based off 100+ users on multiple Exchange servers.  The EMC Message Tracking GUI is limited to running reports on one user and one server at a time so this just wouldn’t cut it for my situation.

So how do you make get-messagetrackinglog retrieve reports on multiple users and servers in one script?  Unfortunately you can’t just put multiple email addresses or servers after a parameter.

The trick here is to use a foreach loop.  We can place all our users and servers into variables and then use foreach to loop through every combination of them.

In the below example I also use a variable for an output file and starting / ending date ranges.  In my situation it was about recycling the script for different reports as quickly as possible.

# Lists of servers and senders to report on (example values)
$servers = "server1","server2"
$senders = "user1@mydomain.com","user2@mydomain.com"
$outputfile = "C:\temp\output.csv"

# Date range for the report
$start = "DD/MM/YYYY HH:MM:SS"
$end = "DD/MM/YYYY HH:MM:SS"

# Loop over every sender / server combination and merge the results
$(foreach ($sender in $senders) {
$(foreach ($server in $servers) {

Get-MessageTrackingLog -Sender $sender -Start $start -End $end -EventID "SEND" -Server $server -ResultSize Unlimited | select Timestamp, @{Name="Recipients";expression={$_.Recipients}}, MessageSubject, Sender

})}) | sort-object -property timestamp | Export-CSV -Path $outputfile

We’ve now turned a basic command into quite a flexible script for retrieving message tracking logs out of Exchange.

If you’re going to work out of Excel (or something similar) the sort-object is probably superfluous.  Where it’s useful is if you choose to output to the screen to quickly check what’s being returned.

By replacing

Export-CSV -path $outputfile

With

Format-Table -AutoSize

You can quickly see what results are returned.  Just remember to increase your screen buffer through the PowerShell window properties, else some columns might not be returned.

WARNING: 2 columns do not fit into the display and were removed.

Finally, a note on the -ResultSize parameter.  Without it your results will be limited to 1000.