Receiving alerts from Nutanix Prism / Prism Central using REST with PowerShell

Today I wanted to add another section to my health check scripts: getting all unresolved, unacknowledged alerts from Nutanix Prism Central.
Nutanix has a nice REST explorer; you can check and test everything from your Nutanix Prism website and then transfer that to a script. Since I already have Prism Central, I don't need to connect to each cluster's Prism, but you can do that with this script as well: just give the Prism IP/URL instead of the Prism Central one.
So below is a quick and dirty script that does the work:

#To generate the password file use:
#read-host -assecurestring | convertfrom-securestring | out-file C:\password.txt
$username = 'admin'
$PasswordFilePath = 'c:\password.txt'
$passwd = Get-Content $PasswordFilePath | ConvertTo-SecureString
#Decode the SecureString back to plain text for the Basic auth header
$password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($passwd))
$Header = @{"Authorization" = "Basic " + [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($username + ":" + $password))}
$PrismNode = ''   #Prism Central IP/FQDN goes here
$uri = "https://${PrismNode}:9440/PrismGateway/services/rest/v1/alerts/?resolved=false&acknowledged=false"

#Accept the self-signed certificate that Prism presents
Add-Type @"
   using System.Net;
   using System.Security.Cryptography.X509Certificates;
   public class TrustAllCertsPolicy : ICertificatePolicy {
       public bool CheckValidationResult(
           ServicePoint srvPoint, X509Certificate certificate,
           WebRequest request, int certificateProblem) {
           return true;
       }
   }
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy

$myTimeZone=[System.TimeZoneInfo]::FindSystemTimeZoneById("Central European Standard Time")

#We can check TimeZones with: [system.timezoneinfo]::GetSystemTimeZones()
#[System.TimeZoneInfo]::ConvertTimeFromUtc( date , timezone )  -> converts time to given timezone
#(New-Object -Type DateTime -ArgumentList 1970, 1, 1, 0, 0, 0, 0).AddSeconds([math]::Floor(1430056923792344/1000000)) -> Converts time from timestamp , time was in Useconds hence /1000000

$response = Invoke-RestMethod -Uri $uri -Headers $Header
foreach ($alarm in $response.entities) {
    #Create context hash for each alarm, binding contextTypes with contextValues
    $context = @{}
    $i = 0
    foreach ($name in $alarm.contextTypes) {
        #Watch out for empty entries (no idea why there are null entries for Type that correspond to a value)
        if ($name) { $context[$name] = $alarm.contextValues[$i] }
        $i++
    }
    #Swap each {var} placeholder with the value from our $context hash
    foreach ($ContextType in ([regex]::Matches($alarm.message,'{(.*?)}')).Value) {
        $ContextTypeName = $ContextType.Trim('{','}')
        $alarm.message = $alarm.message -replace [regex]::Escape($ContextType), $context[$ContextTypeName]
    }
    $createdTime = ([System.TimeZoneInfo]::ConvertTimeFromUtc((New-Object -Type DateTime -ArgumentList 1970, 1, 1, 0, 0, 0, 0).AddSeconds([math]::Floor($alarm.createdTimeStampInUsecs/1000000)),$myTimeZone)).ToString()
    $alarm | Select-Object -Property @{n='CreatedTime';e={$createdTime}}, severity, alertType, message
}

In response we should get all the alerts like on the screenshot below:

Or even from out-gridview:


I am not 100% sure I am doing this correctly, but I compared the text with Prism and what I generate looks OK. I suppose I should have spent more time googling the Nutanix REST API first 😉 but I only had a few hours. In the response we receive alert entities with a 'message' property; as you can see, it keeps placeholders for data like {ip_address} or {reboot_timestamp_str}, which we need to translate in order to see the full message.


I might be wrong here or doing something in a weird way, but I believe the outcome is accurate:
We take the {var} and replace it with a value.
We have the properties contextTypes and contextValues; each entry from contextTypes has its value in the corresponding row of contextValues. I saw that sometimes a contextTypes row is null, but the corresponding value still holds some string. It looks like a duplicate of text from another type that is covered there, so I assumed it is safe not to take it into the context hash.
That's how an example hash that I am building for an alert looks:
So the regex looks for {xxx} and then replaces it with the corresponding value from the context hash. For each alert I am building the context hash anew, as alerts can be different and have different contextTypes.
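As a purely illustrative sketch (these keys and values are made up, not taken from a real alert), the $context hash for a single alert could look like this:

```powershell
# Hypothetical contents of the $context hash for one alert;
# keys come from contextTypes, values from the matching contextValues rows
$context = @{
    'ip_address'           = '10.0.0.42'
    'service_vm_id'        = '5'
    'reboot_timestamp_str' = '2015-04-26 14:02:03'
}
# After the replacement, a message like
#   'CVM {ip_address} rebooted at {reboot_timestamp_str}'
# becomes
#   'CVM 10.0.0.42 rebooted at 2015-04-26 14:02:03'
```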
The time handling is also not entirely clear to me, as 'timestamp_str' is way different from createdTimeStampInUsecs. Anyway, I have the output done the same way Prism shows it on the website.
The time in createdTimeStampInUsecs is a Unix timestamp and has to be converted to a 'Windows datetime'; this timestamp is written in microseconds, hence we divide it by 1000000. After this is done, I convert that date to my time zone.
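As a worked example (using the sample timestamp from the comments earlier), the conversion goes step by step like this:

```powershell
# Unix epoch start
$epoch = New-Object DateTime 1970, 1, 1, 0, 0, 0, 0
# Sample createdTimeStampInUsecs value (microseconds since epoch)
$usecs = 1430056923792344
# Drop the microsecond fraction and add whole seconds to the epoch
$utc = $epoch.AddSeconds([math]::Floor($usecs / 1000000))
# Shift the UTC time into the chosen time zone
$tz = [System.TimeZoneInfo]::FindSystemTimeZoneById("Central European Standard Time")
$local = [System.TimeZoneInfo]::ConvertTimeFromUtc($utc, $tz)
$local.ToString()
```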
All alerts that I am getting are those which are neither acknowledged nor resolved. Authentication for the Nutanix REST API is described here for example:
So that concludes my first attempt to connect to Nutanix PRISM via REST using PowerShell 😉
I hope it will help you get started with other projects.

Custom report about Custom Fields in virtual infrastructure

This time I wanted to create a report on the information kept in the custom fields of my VMs. You can do it very easily using:

get-vm VM1 | get-annotation

What is more, you can specify which particular custom field you want to retrieve. For example, if you want a custom field called ContactPerson you could write:

get-vm VM1 | get-annotation -CustomAttribute "ContactPerson"

Now it is OK to use it like this, but running get-vm | % { get-annotation } would take you quite a while in a VI with > 100 VMs, and even longer with > 1000 VMs.
I have written a short one-liner to get all the data and put it into a CSV file, so I could manipulate the data using filters and conditional formatting.

$(foreach ($cluster in Get-Cluster) { Get-View -ViewType VirtualMachine -SearchRoot $cluster.Id | % { $vVM = $_; $_.Summary.CustomValue | select @{N="VM Name";E={$vVM.Name}},@{N="Cluster";E={$cluster.Name}},Value,Key,@{N="Key Name";E={$vKey=$_.Key;($vVM.AvailableField|?{$_.Key -eq $vKey}).Name}} } }) | Export-Csv c:\customfields.csv -NoTypeInformation

You will receive a file at c:\customfields.csv, which will include information as on the screenshot below:

The report is generated in about 8-10 seconds on a 1000+ VM infrastructure, so it is faster than with Get-Annotation. Feel free to modify it and add other custom columns so it meets your expectations.
I had to go to two places in order to complete this report: one is the Summary.CustomValue section of the Get-View virtualmachine object, and the second is the AvailableField section. Summary.CustomValue tells you which values are set on which keys (key number), and using AvailableField you match the KeyNo=KeyName pair. I am doing a cluster loop to get the information on which cluster a VM resides in. If needed you can change this to datacenters, or just use it without the cluster/datacenter container.
The above example was pulling only existing annotation data, so you would not see any VM there which did not have its custom fields filled out. I made a new script to pull data for each VM and each annotation, even if the VM does not have any annotation data. What's more, instead of having one row per "annotation entry" per VM, I have now created output where you have one VM per row, and all annotations in columns. I hope this one will be more useful for everybody.

$VMs = foreach ($cluster in Get-Cluster) {
    foreach ($vmview in (Get-View -ViewType VirtualMachine -SearchRoot $cluster.Id)) {
        $vm = New-Object PsObject
        Add-Member -InputObject $vm -MemberType NoteProperty -Name VMname -Value $vmview.Name
        Add-Member -InputObject $vm -MemberType NoteProperty -Name Cluster -Value $cluster.Name
        foreach ($CustomAttribute in $vmview.AvailableField) {
            Add-Member -InputObject $vm -MemberType NoteProperty -Name $CustomAttribute.Name -Value ($vmview.Summary.CustomValue | ? {$_.Key -eq $CustomAttribute.Key}).Value
        }
        $vm
    }
}
$VMs | Export-Csv c:\annotation-report.csv -NoTypeInformation

I decided to build my own object and then fill it with all the properties. It builds the property columns dynamically, so when you have 5 custom fields you get 5 columns, if 10 then 10, etc.
And, what is more important 😉 this one is easier to read!

Scripting games 2012 finished!

So the 2012 Scripting Games have finished and the final results were presented. I finished in 24th place out of 373 participants in the beginner section. Although I did not win (hahahahahaha 😉), I still think it was fun to participate in this event; you were given real-life challenges. While studying even those "pretty simple" tasks I learned a lot of new tricks that are really helpful. I will post my SG2012 solutions later. Everybody should try this event; I know for sure that next year I will try again, but I will attempt the advanced ones. The judges were OK most of the time: 70% of them were very helpful, while 30% of them should not participate again as judges, as they are destroying the fun. I will not say anything new here, as everybody says that judging a script without a comment is really... I learned a lot from the judges who pointed out my mistakes in a script. That is really helpful, because you will remember that mistake for a long time. Sometimes you just do not understand why your script is not rated 5 stars; that's really annoying when you do not know what you did wrong, or when one judge gives you a 3 and another gives you a 5 and says it's the fastest execution, like in my solution below.

Still, it was a great opportunity to learn PowerShell; I can already say that it has helped me a lot with the VMware cmdlets. So if you are feeling bored next year, make sure you enroll in this event!

Check vm vmnic vlan

I wanted to get a list of VMs that are using a specific VLAN, so I wrote this one:

foreach ($vm in (Get-VM)) {
    if (Get-NetworkAdapter -VM $vm | ? {$_.NetworkName -eq "network23" -or $_.NetworkName -eq "network65"}) { $vm.Name }
}

Ok, that was working but it was VERY VERY VERY VERY VERY slow…

So I wrote it using Get-View:

$vms = Get-View -ViewType VirtualMachine
foreach ($vmview in $vms) {
    $ntwkview = Get-View -Id $vmview.Network
    foreach ($ntwkentry in $ntwkview) {
        if ($ntwkentry.Name -eq "network23" -or $ntwkentry.Name -eq "network65") {
            $vmview.Name
        }
    }
}

This one needed only 3 minutes to complete on a ~1000 VM infrastructure. I stopped the first script after 30 minutes of running, as it would probably have finished the next day…

The script is not perfect… yet 😉 but it will do the job and, what's most important, quickly. I hope this will save you some time on reporting.

ESXi filesystem disk usage script

So what if you would like to get a report on filesystem disk usage on an ESXi box? For now, AFAIK, there is no cmdlet in PowerCLI that could do that for us. So in order to get this data you can use plink to invoke the command remotely via SSH.

First, download plink.exe from the PuTTY website.

In this example I put plink.exe on the c:\ drive.

#Output file location is just an example, change it to whatever suits you
$outfile = 'c:\df-report.csv'
$str1 = 'c:\plink.exe -v -pw "v53HYfsd#$%$%^^&*" -l root '
$str2 = ' df -h'
"Host Size Used Avail Use Mounted" >> $outfile
foreach ($esxentry in (Get-VMHost | ? {$_.PowerState -eq "PoweredOn"})) {
    $esxentry.Name >> $outfile
    $command = $str1 + $esxentry.Name + $str2
    $result = Invoke-Expression -Command $command
    #Skip line [0], the df header, as we already printed our own header line
    foreach ($resultline in 1..($result.Length - 1)) {
        $result[$resultline] >> $outfile
    }
}

So, assuming you have plink on c:\ and you are in a PowerCLI session connected to the VC server, you can just copy/paste those lines and you will receive a CSV-ready file in the $outfile location. After this, go to Excel and do Data -> Import from text -> select "space" as the delimiter, and you are done. After that you can also select the USE column and apply conditional formatting, for example.

If you want to make the report for specific hosts that are in some DC folder, use the -Location switch in this line:

foreach ($esxentry in (Get-VMHost -Location "specific DC" | ? {$_.PowerState -eq "PoweredOn"})) {

The script will only print information for hosts that are powered on and that are not in maintenance mode. You can use this solution in different ways as well: change the $str2 variable to whatever command you feel like, and format the output to adjust it to the new command. In $str1 you can also give a password with a $ sign, for example. In the last foreach loop I am not taking the first line [0], because it is the df header, which I have already included at the beginning; the CSV file is more readable because of this. This is a great base for everybody; modify it as you want and have fun 😉

PS. If you have established a connection to the ESXi hosts before, you can give plink the -batch option; if not, don't use it, and you will be asked the question about the SSH key fingerprint.
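For illustration, a single manual plink call with -batch could look like this (the host name is a made-up example):

```powershell
# Non-interactive call; -batch makes plink fail instead of prompting
c:\plink.exe -batch -v -pw "v53HYfsd#$%$%^^&*" -l root esx01.example.local df -h
```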

I will also write another version of this script that covers the case where you don't use a plain password with -pw but SSH keys instead. If you know a better way of getting the filesystem disk usage, leave a comment!

Managing a remote Windows box

I will keep this one post for all the useful commands.

Rebooting a remote Windows box

(gwmi win32_operatingsystem -Computername XXXXXXX -cred (get-credential)).Win32Shutdown(6)

Please also check the Win32Shutdown method documentation to see the possible flags you can set for the reboot.
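From the Win32_OperatingSystem.Win32Shutdown WMI documentation, the flag values are (adding 4 forces the action):

```powershell
# Win32Shutdown flags:
#  0 = Log Off      4 = Forced Log Off
#  1 = Shutdown     5 = Forced Shutdown
#  2 = Reboot       6 = Forced Reboot
#  8 = Power Off   12 = Forced Power Off
# e.g. a non-forced reboot of the same box:
(gwmi win32_operatingsystem -Computername XXXXXXX -cred (get-credential)).Win32Shutdown(2)
```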

Checking the uptime of a remote Windows box, for example after a reboot

[Management.ManagementDateTimeConverter]::ToDateTime((Get-WmiObject Win32_OperatingSystem -Computername XXXXXX -cred (get-credential)).LastBootUpTime)

Checking the network configuration of a remote Windows box

Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled=TRUE" -ComputerName XXXXXX -Cred (get-credential) | select IPSubnet,DefaultIPGateway,WINSPrimaryServer,WINSSecondaryServer,IPAddress,DNSServerSearchOrder

I will try to update this post each time I find something useful.