Configure An HTTP Internet Proxy In vRealize Network Insight

Well, we're up to vRealize Network Insight 3.7 and there is still no GUI way of setting an internet proxy.  Usually you would configure an HTTP Internet proxy via the VAMI, but that doesn't exist either, for reasons I don't quite understand.  Nevertheless, the process to configure an HTTP Internet proxy can be performed easily enough via the CLI.  Apart from the obvious reason of gaining internet access where none exists without a proxy, configuring one lets vRNI check for software updates and connect to support.

The command we use is set-web-proxy:

(cli) set-web-proxy -h
usage: set-web-proxy [-h] {show,enable,disable} …

set the web http proxy (for Internet access)

positional arguments:
  show        show current configured http proxy state
  enable      enable http proxy
  disable     disable http proxy

optional arguments:
  -h, --help  show this help message and exit

1. First thing we do is open an interactive console session or SSH into our vRNI appliances with the consoleuser account.  The default password for consoleuser, if you haven't changed it, is ark1nc0ns0l3

2. Type in set-web-proxy show

You will see something similar to below.

(cli) set-web-proxy show
Http proxy connection disabled

3. Next we set our HTTP proxy using set-web-proxy enable.  This will stop and start a few services but won't cause any disruption to the running of vRNI.  Below is an example with a proxy address and port.

(cli) set-web-proxy enable --ip-fqdn vrni-platform.mydomain.local --port 3128
Stopping services
* Stopping DNS forwarder and DHCP server dnsmasq [ OK ]
nginx stop/waiting
launcher-service stop/waiting
* Starting DNS forwarder and DHCP server dnsmasq [ OK ]
Enabling http proxy connections…
Http proxy connection enabled
Connected to http proxy vrni-platform.mydomain.local:3128
* Stopping DNS forwarder and DHCP server dnsmasq [ OK ]
Starting services
* Starting DNS forwarder and DHCP server dnsmasq [ OK ]
nginx start/running, process 5337
launcher-service start/running, process 5415

4. We run set-web-proxy show again.

(cli) set-web-proxy show
Http proxy connection enabled
Connected to http proxy vrni-platform.mydomain.local:3128

5. Finally, we can run show-connectivity-status.

A bunch of network information will be returned along with connectivity status of a few URLs.

Upgrade connectivity status: Passed
Support connectivity status: Disabled
Registration connectivity status: Passed

Web Proxy connectivity status: Passed

Over in the settings page of vRNI you should now see some green icons indicating Upgrade Server Reachable.
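
Should you later need to remove the proxy, the same command has a disable sub-command (it's listed in the help output above), which should revert the change:

(cli) set-web-proxy disable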


Offline Upgrade vRealize Network Insight

Earlier this week VMware released the latest update to vRealize Network Insight, version 3.7.  If you jumped on this new update as I did, you might have been caught out by a bad build.  Upgrading to the original build had a DNS issue that caused a communication problem between the Platform appliance and the Proxy appliance.  That build was quickly pulled and replaced a day later with a new and working one.  It's unlikely that you have the old build, but before upgrading it's best to check.

vRNI can be updated in two ways: an Online upgrade via the GUI and an Offline upgrade via the CLI.  There are a few reasons why you might want to perform an Offline upgrade.  Cluster upgrades can only be performed via an Offline upgrade.  Your vRNI appliance might not have internet access.  Or, like me, you have configured a proxy server on your vRNI appliances but, because vRNI wants to make your life difficult, it doesn't detect new updates.

The Offline upgrade can only be performed on version 3.5 or 3.6 and is very similar to previous upgrades.

1. Download the ZIP bundle from VMware.

2. Snapshot both your Platform and Proxy appliances or live life like a cowboy.

3. Copy (SCP) the ZIP bundle to both appliances (Platform & Proxy).

I had difficulties using WinSCP due to the limited console access given by the appliance, so I used pscp.exe, which comes with the PuTTY package.  The location you can copy the bundle to can also be a bit of a challenge.  I chose /home/consoleuser/downloads/ using the consoleuser user.

Below is the command I ran from a PowerShell prompt from my Windows box.

PS C:\Program Files (x86)\PuTTY> .\pscp.exe -scp E:\temp\VMware-vRealize-Network-Insight. consoleuser@<platform-ip>:~/home/consoleuser/downloads/

4. SSH over to the Platform appliance, which has to be upgraded first, with the user account consoleuser.  The default password for consoleuser in vRNI is ark1nc0ns0l3

5. Run the package-installer command to upgrade the appliance.

Below is an example of the command I ran.

package-installer upgrade --name /home/consoleuser/downloads/VMware-vRealize-Network-Insight.

The upgrade process can take a while so be patient.  A successful upgrade should look similar to below.

login as: consoleuser
consoleuser@<platform-ip>'s password:
vRealize Network Insight Command Line Interface
(cli) package-installer upgrade --name /home/consoleuser/downloads/VMware-vRealize-Network-Insight.
Do you want to continue with upgrade? (y/n) y
It will take some time…
Successfully upgraded

6. SSH over to the Proxy appliance now with the same user account consoleuser.

7. Run the same command as on the Platform appliance.

package-installer upgrade --name /home/consoleuser/downloads/VMware-vRealize-Network-Insight.

As with the Platform upgrade it will take some time, and the output after the upgrade should look the same.

8. (Optional) Run show-version and confirm you are running the latest version build on each appliance.
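
show-version takes no arguments; just run it from the same CLI session on each appliance and check that the reported version and build match the new release.

(cli) show-version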

That's all there is to it.  Stopping and starting services isn't necessary, and there's no need to reboot the appliances.

You can now open up a web browser and log in to your upgraded vRNI Platform appliance.  Check that everything looks good over in settings.


Melbourne VMUG 2017 Recap

Last week the Melbourne VMware User Group held its final vBeers event of the year, closing out another massive year for the group.  It's been a lot of hard work to bring the group to the end of the year, but it's been hugely rewarding to be part of.  Throughout the year I met a lot of people being introduced to VMUG for the first time.  Received some amazingly positive feedback from both new and old faces of the group.  And of course made some awesome new friends.  All of which really makes the hard work worthwhile.

As always, Melbourne VMUG kicked off its year with our annual UserCon in March, an all-day free conference run by the community for the community.  For an event held on the other side of the world to many, it's amazing to see our UserCon really start to make a name for itself.  This couldn't have been better demonstrated than by the calibre of international guests eager to come out and support us this year.  From our opening keynote speakers Duncan Epping and Amy Lewis, to our closing keynote speakers William Lam and Alan Renouf.  Not to mention Emad Younis, Josh Atwell, and Rebecca Fitzhugh.  An amazing group of people whom I have a new level of respect for.

Following on from our UserCon we held three quarterly meetings throughout the year.  A few notable sessions that really stood out for me were The Register's Simon Sharwood and VMware APJ CTO Bruce Davie.  Some very insightful sessions that really made you think about our industry.  New for this year, we introduced a closing survey at all our meetings.  The responses received from these surveys went into shaping subsequent meetings.  It's been great to see the community support this initiative and be more involved in shaping how future VMUGs run in Melbourne.  A big thank you to everyone that participated in these surveys!

Back in November a few of us on the committee travelled up to Sydney to help represent not just Melbourne but VMUG Australia at vForum Australia.  I've already spoken about vForum in a previous post so there's not much more I can add to that.  It was a great experience to be behind the booth promoting VMUG to a whole new audience from around the country.  I now have a new level of respect for the hard work vendors and their people put in behind the booth at these kinds of events.

Carrying on from previous years we continued to support and run separate vBeers events in between our UserCon and quarterly meetings.  It's been great to see Melbourne vBeers really coming into its own this year.  The more intimate setting allowed for some great conversations that aren't always possible at regular VMUG meetings.  As with our quarterly meetings, many new faces have started becoming regulars, and support from vendors has increased, allowing us to do more sponsored nights and drinks.  I look forward to meeting many new and old faces over vBeers next year.

It's hard to convey in a short post everything that Melbourne VMUG has achieved this year.  Hopefully some of these numbers will help speak to that.  1 UserCon, 3 quarterly meetings, 5 vBeers nights, 10 community presentations, 12 VMware presentations, and 6 vendor presentations.  Some amazing numbers that set a high benchmark for next year.  I'm extremely appreciative of all those that presented and helped to give back to the community this year.

As always, a huge thank you to the committee and supporters of Melbourne VMUG over 2017.  In particular Tyson Then, fellow co-leader.  I could not ask for a nicer guy to be a co-leader with.  Along with fellow committee members Andrew Dauncey, Brett Johnson, Damien Calvert, and recently departed Craig Waters.  As well as some of our regular VMware liaisons Mo Jamal, Kev Gorman, and Chris Garrett.  The list does go on, and I'm sure I've missed many; just know your time and effort has been greatly appreciated 🙂

I hope everyone has a great Christmas and an awesome New Year.  Attention now turns towards our next UserCon in March 2018.  Hope to see you all then!

vForum Australia 2017 Recap

Another year and another vForum has come and gone.  This has really become a standout event for me on the local calendar.  For the Australia region this is the closest we can get to VMworld without actually going to VMworld.  This year vForum returned to the Sydney Convention Centre, which has recently been rebuilt.  Unlike previous years, the event had moved from two days to one.

My frame of reference for vForum is fairly small, as this was only my second vForum Australia (actually there was a vForum Roadshow in Melbourne a few years back too).  Without a doubt the biggest improvement made was the location.  Bringing vForum to the centre of Sydney on Darling Harbour was a big win.  Hotels are plentiful and the views are amazing.

I arrived in Sydney from Melbourne the day before vForum.  My manager from Brisbane was down in Sydney on unrelated work, so this was a good opportunity to catch up in person for a few drinks on George St (I can't believe construction on the light rail is still happening on George St, which I also recall from last year's vForum).

While I had every intention of going, unfortunately I didn't make VMDownUnderground this year, an event organised and run by the Sydney VMUG crew the night before vForum.  Last year's VMDownUnderground was a great event, but this time I used the opportunity to have dinner with fellow work colleagues on Darling Harbour.  Being Melbourne based and having most of my team in Sydney, I don't get that opportunity often.

This year I was not only representing myself and my organisation but also VMUG as the Melbourne Leader.  With the help of VMUG HQ and the vForum event planners the local VMUG Australia chapters pooled our time and resources to run a booth.  There were ~40 vendors on the showroom floor this year.  VMware and the event planners did a great job with vendor layout with all locations being great.  We, VMUG, were lucky enough to secure a prime location across from VMware in the centre of the showroom right next to the VMware charity water challenge.

My day started off at 7 AM helping to set up and prepare the VMUG booth, but the official start of vForum Australia was the keynote at 9 AM, opened by VMware COO Sanjay Poonen.  The attendance for the keynote was huge.  The entire keynote hall was almost completely full, with a real buzz to it.  The keynote sessions ran till just after 11 AM, at which point a large proportion of keynote attendees left the event (or possibly went to the side events).  That didn't detract from the atmosphere during the remainder of vForum, though.

Foot traffic around and to the VMUG booth was nothing short of amazing this year.  Having a Claw Machine full of plush toys at our booth I’m sure didn’t hurt either.  This was a huge success in drawing attendees to our booth.  Not only attendees but vendors and VMware staff were lining up for a game.  One of our original goals, as VMUG Australia, was to promote the upcoming Sydney and Melbourne UserCons but we quickly switched to brand awareness for VMUG.  I was amazed to find out so many people still hadn’t heard of VMUG!

vForum Australia ended with the after party at the Hard Rock Cafe right next to the convention centre.  A great opportunity to wind down with friends and finally grab some food and drinks.  Compared to last year's vForum party, with Rogue Traders playing (whom I'm a big fan of), Hard Rock was a slightly more subdued affair.  It did lead to a more intimate setting where you could have more meaningful conversations with people, so in that regard a success.

I still had a little bit left in me after Hard Rock.  So before calling it a night I headed back to my hotel to drop off my swag and have a shower before heading out for a few drinks and cocktails with some vForum friends at Palmer and Co.  A small underground bar set in a 1920s speakeasy style.

I would have liked to see vForum remain a two-day event, particularly with the addition of the Transform Security and Empower Digital Workspace events running at the same time.  Whatever the format, though, VMware and vForum always put on a great event for attendees.  I'm already looking forward to catching up and meeting new people in the community next year.

HaveIBeenPwned PowerShell Module

If you haven’t heard of Have I Been Pwned, firstly what are you doing?  It’s a site created by fellow Aussie Troy Hunt.  Troy aggregates data breaches as they become public into a searchable database. One of the primary goals of Have I Been Pwned is to raise security awareness around data breaches to the public.

As a bit of a learning exercise to myself, I created a PowerShell Module that leverages the APIs.  The module contains five Functions, Get-PwnedAccount, Get-PwnedBreach, Get-PwnedDataClass, Get-PwnedPassword, and Get-PwnedPasteAccount. I like to think of the HaveIBeenPwned PowerShell Module as an Enabler. By itself it does nothing more than what the site does. But by leveraging the Power of PowerShell and returning the results in object format the data can be easily manipulated for many other purposes.

Installing and using the Module and Functions is very simple.  Ideally you will be running PowerShell 5 or above, which will allow you to easily download and install it from the PowerShell Gallery.  If you're not on PowerShell 5 I'd highly recommend you download WMF 5.1 (Windows Management Framework), which includes PowerShell 5.
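
If you're not sure which version of PowerShell you're currently running, the built-in $PSVersionTable variable will tell you.  You want a Major version of 5 or higher so that Install-Module is available out of the box.

PS F:\Code> $PSVersionTable.PSVersion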

Installing the module is simply a matter of typing the following.

PS F:\Code> Install-Module -Name HaveIBeenPwned

Once installed you can view all the Functions available with the following command.

PS F:\Code> Get-Command -Module haveibeenpwned 

CommandType     Name                                               Version    Source                                                                               
-----------     ----                                               -------    ------                                                                               
Function        Get-PwnedAccount                                   1.1        HaveIBeenPwned                                                                       
Function        Get-PwnedBreach                                    1.1        HaveIBeenPwned                                                                       
Function        Get-PwnedDataClass                                 1.1        HaveIBeenPwned                                                                       
Function        Get-PwnedPassword                                  1.1        HaveIBeenPwned                                                                       
Function        Get-PwnedPasteAccount                              1.1        HaveIBeenPwned      

The two main Functions are Get-PwnedAccount and Get-PwnedPassword.

The first, Get-PwnedAccount, will enumerate if an account, based off an email address, has been found in the Have I Been Pwned list of data breaches.

PS F:\Code> Get-PwnedAccount -EmailAddress email@example.com

In the above example all breaches are listed where the account used email@example.com as the email address.  The list returned for a well-used address can be huge, by the way.

The second and slightly more controversial, Get-PwnedPassword, will take a password and confirm if it has been identified in a data breach.  Get-PwnedPassword will accept a password in three different formats.  Plain text, Secure String, and SHA1 hash.

PS F:\Code> Get-PwnedPassword -SHA1 AB87D24BDC7452E55738DEB5F868E1F16DEA5ACE

In the above example a SHA1 hash was generated offline using Quick Hash GUI.  Get-PwnedPassword will then send that password or SHA1 hash in the body of an HTTPS request to Have I Been Pwned.  Now, obviously, what can be seen as the controversial part of this is that not only do you have to trust Have I Been Pwned but also this PowerShell Function.
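
If you'd rather not install a separate GUI tool, the SHA1 hash can also be generated locally in PowerShell using the .NET crypto classes.  Below is a quick sketch of that approach; the password string is obviously just a throwaway example.

# Hash an example password with SHA1 locally before sending anything over the wire
$plainText = 'Password123'
$sha1      = [System.Security.Cryptography.SHA1]::Create()
$bytes     = [System.Text.Encoding]::UTF8.GetBytes($plainText)
$hash      = ($sha1.ComputeHash($bytes) | ForEach-Object { $_.ToString('X2') }) -join ''
$hash

The resulting hex string can then be passed to Get-PwnedPassword with the -SHA1 parameter, as in the example above.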

All Functions come with help and examples, which can be viewed using Get-Help.  For example.

PS F:\Code> Get-Help Get-PwnedPassword -Examples

The Module and all Functions can be found in the PowerShell Gallery for download.  The Module can also be found in my public GitHub project.  All code can be viewed and sanity checked and is free to consume.


Lastly, I thought I might show how you can go one step further than simply enumerating an individual account.  Many organisations' IT departments create and manage accounts for their staff.  They also provide security awareness training in protecting online accounts.  An organisation could take a CSV list of their staff's email addresses, import that list into PowerShell, run it against the Get-PwnedAccount Function, and identify if any of their staff have been involved in a data breach.

In the below example I import a small CSV file I have created with a list of email addresses. Then using half a dozen lines of code I iterate through the CSV list of email addresses and identify all the accounts that have been involved in a data breach. Using this information I can pro-actively notify staff to review these accounts.

$emails = Import-Csv F:\email_list.csv
foreach ($email in $emails) {
    $email = $email.accounts
    $results = Get-PwnedAccount -EmailAddress $email
    if ($results.status -ne 'Good') {
        foreach ($result in $results) {
            $breach = $result.title
            Write-Output "Email address $email has been found in a $breach breach"
        }
    }
    # Pause between requests to stay under the Have I Been Pwned API rate limit
    Start-Sleep -Milliseconds 1500
}

And sample output after running the above code.

Email address user@example.com has been found in a Yahoo breach
Email address user@example.com has been found in a Youku breach
Email address user@example.com has been found in a Zomato breach
Email address user@example.com has been found in a 000webhost breach
Email address user@example.com has been found in a 17 breach
Email address user@example.com has been found in a Adobe breach
Email address user@example.com has been found in a Bell (2017 breach) breach

Recap: VCP-NV Certification (2V0-642)

Earlier this week I took and passed the VCP-NV (2V0-642) exam.  I do have to say it was a really good experience.  It’s one of the few exams I really did enjoy studying for and sitting.  So I thought I might use this as an opportunity to post a short recap of my experience and what I used to study and pass the exam.

First, let's get some of the technicalities out of the way, all of which can be found at VMware's VCP-NV landing page.  The 2V0-642 exam is VMware's updated version 2 of the original VCP-NV exam, which officially came out back in 2015.  Back then it was 120 questions and by all accounts much harder than this new revised version.  The revised exam, based on NSX 6.2, is 2 hours long and 77 questions, with a standard passing score of 300 out of 500.  If you currently hold a VCP, the process to certification is fairly straightforward: take and pass the 2V0-642 exam and earn the certification.  If you don't hold a VCP, you have a number of pre-requisites to meet.  Again, all of which can be found at the VCP-NV landing page.

So first, how was the exam?  As I mentioned above, a really good experience.  Gone are the days of having to take a pre-exam survey.  Just acknowledge the Terms and Conditions and the exam begins immediately.  Awesome.  The questions were well laid out, clear, and descriptive enough to understand.  Of course it wouldn't be a real exam without one or two confusing questions, and there were a few of them, but only a few.  The exam questions are all weighted, so at the end of the day it is a level playing field for everyone.

So what was my process for studying for this exam?

I guess firstly, I've attended many presentations and watched a number of high-level videos on NSX, but nothing really deep on the product, nothing really exam helpful.  A few months back (the week before VMworld) I attended the 5-day Install, Configure, Manage course on NSX 6.2.  This was a great course and a good primer into learning to use NSX.  Very helpful for grasping the fundamentals and getting started.  Well recommended for anyone new to the product.

Next came actually using the product in a real lab environment.  I think this is a requirement!  Bare minimum you should be using VMware’s Hands on Labs but even better is to have your own environment.  I’m lucky enough to be preparing for a production deployment and had a test lab to deploy and play with.  Having your own environment constantly available is hard to beat.

vBrownBag YouTube videos!  There is a VCP-NV series available on YouTube.  The videos are based on the original VCP-NV exam and are a few years old but still very relevant.  Actually, still extremely relevant.  There are eight videos to hunt around for, which cover the original Objectives with the exception of Troubleshooting.  The Objectives match up very closely.  The 2V0-642 exam has one main new Objective, which covers Cross-vCenter.

In terms of reading material I would highly recommend going through the official NSX online docs pages.  It's a lot of mindless reading, but you will find that exam questions come straight out of them, and truthfully you will learn a huge amount doing it.  Just remember to focus on version 6.2.  I'd also recommend the Cross-vCenter NSX Installation Guide PDF from VMware.  The same content is in the online docs, but I found the PDF easier to consume.  It was hugely informative, and the exam tested heavily on this area for me, so I was very thankful to have focused on this reading.

And that was basically it.  Practice hands on what you have learnt and read.  Troubleshoot in your lab as you are building it out.  A few solid study days on the weekend and you should be in a really good position to take and pass the exam.


HTTP Error 500 Post Upgrade to vCloud Director 9.0

This week I decided to jump on the upgrade bandwagon along with a number of other excited people in the vExpert Slack group.  While most, if not all, had success stories I unfortunately ran into some post upgrade portal issues.

The upgrade process to version 9.0 was no different from previous releases.  I followed my regular upgrade process which went off without issue.  When I went to log into the Administrator Portal I was faced with an HTTP Error 500 page.  Argh!


Problem accessing /cloud/saml/login/alias/vcd. Reason:

Server Error

Caused by:

javax.servlet.ServletException: org.opensaml.saml2.metadata.provider.MetadataProviderException: No IDP was configured, please update included metadata with at least one IDP
    at com.vmware.vcloud.web.NestedFilterChain.doFilter(…)
    at com.vmware.vcloud.web.UnfirewalledFilterChainProxy.doFilter(…)
    at …$VirtualFilterChain.doFilter(…)

To my surprise, tenant portals were fine and tenants were able to log in.  This was specific to the Admin Portal.

Checking the release notes, I knew there was a breaking change with Federation and SAML which required you to re-register your organization with your SAML IDP.  That's fine, I thought, we're not using SAML.  And besides, the notes seem to indicate you make the change post upgrade.

System administrators cannot use an existing vSphere SSO configuration to authenticate to vCloud Director.

Federation for the System organization has changed in this release. The System organization can now use any SAML IDP, not just the vSphere Single Sign-On service. Existing federation settings for the System organization are no longer valid and are deleted during the upgrade.

Workaround: Re-register your organization with your SAML IDP. See “Enable Your Organization to Use a SAML Identity Provider” in the vCloud Director Administrator’s Guide

Turns out, though, we were in fact using SAML, or at least had it enabled in a non-functioning state.  So despite the release notes stating that it would be deleted, it appeared to remain in a broken state post upgrade and was now preventing the Portal from loading at all.

The solution turned out to be relatively easy with VMware GSS help.  Log in to the Admin Portal by specifying the full URL to the login.jsp file, using your standard System Administrator account.

https://<your-vcd-address>/cloud/login.jsp

Navigate to the Administration Page and then to Federation.  Untick Use SAML Identity Provider and Apply.

The change should take effect immediately.  Logout and back in as you normally would to the portal without the trailing /cloud/login.jsp.

While I'm sure this was a corner case, please take note of your SAML settings.  If you don't use it, make sure you don't have it enabled.

VMware Update Manager (VUM) Fails To Load Within vSphere Web Client

I recently upgraded my lab VCSA from version 6.5 (Build Number 5705665) to version 6.5 U1 (Build Number 6671409).  After the upgrade I noticed that VMware Update Manager was no longer working correctly.  Navigating around the various VUM pages I received the same consistent error message.

interface com.vmware.vim.binding.integrity.VcIntegrity is not visible from class loader

VUM management page

VUM Tab within an ESXi host

Checking the vCenter services within Administration > System Configuration, they all appeared Up and Running.  Though all services were running, I nevertheless restarted the VMware Update Manager service, which unfortunately didn't help.  I also tried restarting a few other services without much success.  So rather than continuing to randomly restart services, I decided to take a tougher approach and restart all services from the CLI.

After stopping and starting all the vCenter services, which took a few minutes, VUM was back up and running again within the vSphere Web Client.  While this was a fairly drastic step to take, so would rebooting the vCenter Server have been, and I'm glad I managed to avoid that.

I've previously written about restarting vCenter services.  The process is quite simple.  First connect up to the CLI of the VCSA box.  Then run the below two commands.  Both the stopping and starting of services will take a few minutes each.  Once the services are restarted, the Web Client will take a further few minutes to fully start up and be accessible.  If all is successful, Update Manager should be accessible once again.

Command> service-control --stop --all

Command> service-control --start --all
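
To confirm everything has come back up cleanly, the same tool can also report service state.  I believe a plain status call (no --all switch needed here) will list which services are running and which are stopped.

Command> service-control --status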

Restarting all the vCenter services like this is obviously a disruptive action.  Connectivity to vCenter will be dropped while the services restart.  Usually restarting all services on vCenter via the CLI is my last-ditch attempt to resolve an issue before I attempt a reboot of the appliance.  While restarting the VCSA might have been the easiest thing to do to resolve this issue, it's not always necessary.

Could not establish trust relationship for the SSL/TLS Secure Channel – Invoke-WebRequest

I’ve recently been playing around with VMware’s REST APIs in VCSA 6.5 using PowerShell. I’ve been using a lot of Invoke-WebRequest and Invoke-RestMethod to do my work. Chris Wahl has a great primer on how to get started here.

One issue that I ran into very quickly working against my VCSA was a certificate trust relationship error.  I've run into this error numerous times in the past.

PS F:\Code> Invoke-WebRequest -Uri -Method Post -Headers $head
Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.
At line:1 char:1
+ Invoke-WebRequest -Uri ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand

The first time I ran into this error I was stumped for a while finding a solution.  Ultimately it comes down to using Self-Signed Certificates in vCenter, as most of us do.  In general, using Invoke-WebRequest or Invoke-RestMethod against a server using a Self-Signed Certificate will cause this error; it's not just related to vCenter.

The solution is quite simple.  I found a snippet of code some time back that I keep on hand for this situation.  It basically ignores certificate validation in PowerShell, allowing you to make a connection with Invoke-WebRequest.  All you have to do is paste this code into your PowerShell session before you run Invoke-WebRequest against a server with a Self-Signed Certificate.

if (-not ([System.Management.Automation.PSTypeName]'ServerCertificateValidationCallback').Type)
{
$certCallback = @"
    using System;
    using System.Net;
    using System.Net.Security;
    using System.Security.Cryptography.X509Certificates;
    public class ServerCertificateValidationCallback
    {
        public static void Ignore()
        {
            if(ServicePointManager.ServerCertificateValidationCallback == null)
            {
                ServicePointManager.ServerCertificateValidationCallback += 
                    delegate
                    (
                        Object obj, 
                        X509Certificate certificate, 
                        X509Chain chain, 
                        SslPolicyErrors errors
                    )
                    {
                        return true;
                    };
            }
        }
    }
"@
    Add-Type $certCallback
}
[ServerCertificateValidationCallback]::Ignore()

Once you run the code you will be able to successfully make a connection.

I've seen some simple one-liner solutions for Self-Signed Certificates, but none of them seemed to work for me, whereas the above snippet of code has always worked.  Obviously bypassing certificate validation is not something you want to do on a global scale in PowerShell, but this code works great for your current session only.

If there is a simpler way to bypass certificate validation I’d love to hear it.

Store Multiple Pure Storage Connections In A PowerShell Array

I've recently been playing around with the Pure Storage PowerShell modules.  I've found the Pure cmdlets to be quite extensive and easy to use.  Quite a nice change from the PowerShell cmdlets of other traditional storage vendors.  One thing, though, that I found a little annoying was that I had to store a connection for a Pure Array in a PowerShell object and constantly reference that object in each cmdlet I ran.  Not a big deal normally, but where I ran into an issue was wanting to connect to multiple Pure Arrays at the same time and run and iterate against them all together.  I quickly came to realise that the cmdlets themselves are designed to run against one Pure Array at a time.

Initially I thought I could store multiple connections in a variable using the += operator.  But this led to the following error.

C:\Code>   $arrays = New-PfaArray -EndPoint purearray1 -ApiToken 'b2342442-ebb2-5673-a452-c443f562cb7' -IgnoreCertificateError

C:\Code>   $arrays += New-PfaArray -EndPoint purearray2 -ApiToken '6523ff23-32ac-2890-9843-2e4e9543672' -IgnoreCertificateError
Method invocation failed because [PurePowerShell.PureArray] does not contain a method named 'op_Addition'.
At line:1 char:1
+ $array += New-PfaArray -EndPoint purearray2 -A ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (op_Addition:String) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound

A quick inspection of the data type of the variable created using GetType shows that it is a System.Object and not an Array. By default creating a connection to a Pure Array using New-PfaArray and storing that to a variable will cast it as an object.

C:\Code>   $arrays.GetType()

IsPublic IsSerial Name                                     BaseType          
-------- -------- ----                                     --------    
True     False    PureArray                                System.Object

This is easily fixed by setting the data type for our variable to [array] when we create it.

[array]$arrays = New-PfaArray -EndPoint purearray1 -ApiToken 'b2342442-ebb2-5673-a452-c443f562cb7' -IgnoreCertificateError
[array]$arrays += New-PfaArray -EndPoint purearray2 -ApiToken '6523ff23-32ac-2890-9843-2e4e9543672' -IgnoreCertificateError

Now when we check the data type we see it’s System.Array.

C:\Code>   $arrays.GetType()

IsPublic IsSerial Name                                     BaseType      
-------- -------- ----                                     --------       
True     True     Object[]                                 System.Array    

Checking the variable again we can see we have two records.

C:\Code>   $arrays

Disposed : False
EndPoint :
UserName :
ApiVersion : 1.7
Role : StorageAdmin
ApiToken : b2342442-ebb2-5673-a452-c443f562cb7

Disposed : False
EndPoint :
UserName :
ApiVersion : 1.7
Role : StorageAdmin
ApiToken : 6523ff23-32ac-2890-9843-2e4e9543672

Using this new variable with a Pure Storage cmdlet is just a matter of specifying the element of the array representing the Pure Storage Array we want, using square brackets.

C:\Code>   Get-PfaArrayId -Array $arrays[0]

version revision             array_name           id                                  
------- --------             ----------           --  
4.8.10 201705102013+977fb3c  purearray1           b2342442-ebb2-5673-a452-c443f562cb7b

Where this array we created really becomes handy is when using it with foreach loops.  We can now wrap our Cmdlets in a foreach loop and iterate through all our Pure Storage Arrays.

C:\Code>   $results = foreach ($array in $arrays) {
    Get-PfaArrayId -Array $array
}

C:\Code>   $results | ft

version revision             array_name           id                                  
------- --------             ----------           --    
4.8.10 201705102013+977fb3c  purearray2           6523ff23-32ac-2890-9843-2e4e9543672
4.8.10 201705102013+977fb3c  purearray1           b2342442-ebb2-5673-a452-c443f562cb7

This is just a simple example but now we can start enumerating across all our Pure Storage arrays and easily start manipulating objects returned.
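
As a quick sketch of where this can go, below is the same foreach pattern used to pull back every volume from every connected array.  Get-PfaVolumes is just an illustrative pick here; substitute whichever cmdlet returns the objects you're interested in.

C:\Code>   $allVolumes = foreach ($array in $arrays) {
    # Collect the volumes from each connected array into one combined result
    Get-PfaVolumes -Array $array
}

C:\Code>   $allVolumes | Format-Table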

I really like the Pure Storage PowerShell modules, but I hope a future update makes working with multiple Pure Arrays easier, ideally allowing their Cmdlets to work against multiple arrays at the same time.