Wednesday, September 19, 2018

Coloring in an HTML table for readability.

When I first started doing health checks, I wanted the results to have some color.
I found a suggestion that the color codes could be added afterward. So I set up the testing script to put key words in the answers in the table, and then, after the table is converted to HTML, use a search and replace to substitute the HTML tags that change the table color.

The first part is code from Microsoft that gives the basic table design.
The second part creates some headers and footers, and then creates the HTML table block for the message.
The third part colors the table with HTML tags, using a replace command.
The final section assembles all the pieces into the final HTML document.



#HTML Header
$H = "<html>
    <style>
        BODY{background: #CAD3CC; font-family: Arial; font-size: 8pt;}
        H1{font-size: 16px;}
        H2{font-size: 10px;}
        H3{font-size: 12px;}
        H3.Pass{color: green}
        H3.Fail{color: red}
        TABLE{border: 1px solid black; border-collapse: collapse; font-size: 8pt;}
        TH{border: 1px solid black; background: #dddddd; padding: 5px; color: #000000;}
        TD{border: 1px solid black; padding: 5px; }
        td.Pass{background: #7FFF00;}
        td.Warn{background: #FFE600;}
        td.Fail{background: #FF0000; color: #ffffff;}
        </style>
    "


# Labels for HTML
$T = "<H1>Prod Application test</H1><BR>"
$B = $T + "<H2>Non-Prod Application test $Healhcheck " + $Warning + 
      "<H2><br> This is a test for Application servers <br><P> </H2> "
If ($E) { 
      $EReport = '<H1><font color="red">ERROR REPORT:</font></H1><P><BR><BR> ' 
        }
$PostContent = "$EReport $E <BR>This report was generated on server $ENV:COMPUTERNAME <p>"
$DataReport = $Report | ConvertTo-Html -Fragment
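
For context, $Report here is whatever objects your checks produce. A minimal hypothetical example, where each value leads with one of the key words the coloring step looks for:

#Hypothetical sample data - each cell value starts with a key word (Pass/Warn/Fail)
$Report = @(
    New-Object PSObject -Property @{ Server = 'APP01'; Ping = 'Pass 4ms'; Service = 'Fail stopped' }
    New-Object PSObject -Property @{ Server = 'APP02'; Ping = 'Pass 2ms'; Service = 'Warn starting' }
)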

#Crayon coloring code: color the table cells by swapping each key word for a class tag.
#e.g. "<td>Fail 00</td>" becomes '<td class="Fail">00</td>'
$B += (($DataReport -replace "<td>Fail ", '<td class="Fail">') `
          -replace "<td>Pass ", '<td class="Pass">') `
          -replace "<td>Warn ", '<td class="Warn">'


#Now form the code into a final web page for emailing.


$Body = ConvertTo-Html -Head $H -Title $T -Body $B -PostContent $PostContent
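
From there, emailing the report is one cmdlet. A sketch with hypothetical addresses and SMTP server:

#Hypothetical addresses and SMTP server - adjust to your environment
Send-MailMessage -To 'ops@corp.net' -From 'healthcheck@corp.net' `
    -Subject 'Prod Application test' -Body ($Body | Out-String) `
    -BodyAsHtml -SmtpServer 'smtp.corp.net'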



Start a scheduled task with a GUI menu.

Want to make a script run under other credentials?

You can create a scheduled task, which can then be fired off from a menu.
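
If the task doesn't exist yet, registering one that runs under another account only takes a few lines. A minimal sketch; the task name, script path, and service account below are hypothetical:

#Register a task that runs a script under other credentials.
#The task name, script path, and account are hypothetical - adjust to suit.
$Action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\HealthCheck.ps1'
$Cred = Get-Credential 'CORP\svc-healthcheck'
Register-ScheduledTask -TaskName 'HealthCheck' -TaskPath '\AdminTasks\' `
    -Action $Action -User $Cred.UserName `
    -Password $Cred.GetNetworkCredential().Password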

The menu script is below.
It takes advantage of the cool Out-GridView command.

$task = $true
While ($task)
{
    Write-Host "Looking up available Healthchecks" -ForegroundColor Green
    Clear-Variable task    # if nothing gets selected below, the loop ends
    Get-ScheduledTask -TaskPath "\AdminTasks\" |
    Select-Object TaskName, State, Description |
    Out-GridView -OutputMode Multiple |    # the GUI menu; allows multiple selections
    ForEach-Object {
        if ($_)
        {
            $_ |
            Get-ScheduledTask |
            Tee-Object -Variable task |    # remember the selection so the loop repeats
            Start-ScheduledTask
            $task |
            Get-ScheduledTask    # show the task's state after the kickoff
        }
    }
}


Wednesday, September 12, 2018

Parallel PowerShell - Part 3


Picking up where Part 2 left off: one of the things I was concerned about was what would happen if a server I tasked with part of the testing didn't respond for some reason. Would my health check crash? Return a big red error? Or, worse, just stop dead and hang the script with no return?
Fortunately, I haven't had a script zombie yet in the parallel version of the script, but it was a plague in the previous versions. (A zombie is a script that never stops, waiting indefinitely.)
I did make sure to incorporate timeout settings on the web calls to ensure the computer would not wait forever for a return. There were other checks that would get hung, so I made sure to verify the server was functioning properly before I ran those. That seems to be where my scripts get stuck.

The problem with running tasks in parallel became: what would happen if the server was off or nonresponsive? Fortunately, the parallel requests didn't hang like the previous versions. They did the opposite; they returned nothing.

So I had to devise a method to discover which tests had not returned. I winced and then tried for a Jimmy Neutron brain blast... that didn't work.
I experimented and came up with a check to see whether each item in the original list of servers to test was in the results.

One approach was to review the errors in the shell variable $Error, but that didn't work out in some cases.
To get the full picture, I found that comparing the list of servers submitted with the responses that came back did the job:


$Missing = $Servers | Where-Object { $Result.PSComputerName -notcontains $_ }


So this gives us the ability to create some placeholder entries to mark the missing responses.

$MissingServers = $Missing | foreach {
    $MissingServer = @{
        "URI"               = "https://$($_):443/App/Service.svc?wsdl";
        "StatusCode"        = "Fail 00";
        "StatusDescription" = "Fail N/A";
        "Result1"           = "N/A";
        "Result2"           = "N/A s"
        }
    New-Object -TypeName PSObject -Property $MissingServer
    } #Close foreach

So you can combine your results and get a complete table with:

$Result += $MissingServers

Where $Result is the results of the successful test.

Tuesday, May 8, 2018

Parallel PowerShell - Part 2

My next challenge was that I had to check multiple URLs for each server. Plus, I also needed to check whether a server was online, and do some other checks and validations to verify its status, before I went and tried to open a remote PowerShell (WinRM) session to it. I didn't want to check servers I was simply going to time out on. So I had two cases where I needed to work out a way to run multiple tasks at the same time within a single PowerShell script.

One way to do that is with Jobs.
Another is to use Windows Workflows. I went with workflows...

I like Workflows, as they are more like function calls, and they have a nice coding aesthetic. Plus, there are some other benefits at run time that they take advantage of, which I didn't know at the time; after the presentations at the PowerShell Summit this year, I have a new appreciation of how to make things go faster.

When I first investigated Workflows, I found them very hard to work with. There are a number of restricted commands and things that Workflows won't let you do. Then one day I did an experiment: I wanted to see if I could get a workflow to execute a function and not be so restrictive about what I could do. It worked, and it worked very well.

What I later learned is that function calls are compiled by the PowerShell interpreter when it processes the script. That makes functions run quickly. As it also turns out, Workflows get compiled too, so the combination becomes very efficient for the computer to run. Efficiency is an important component of speed. This turned out to be a lucky coincidence for my work, and I learned a lot by doing it. However, I lost that code in the jumble of activity and had to re-invent the wheel later on.

This method is in reverse order from what you would normally think of in a script:
you have to define a Function, and then a Workflow, before you can execute them.
However you structure your code, I think it's good to keep such blocks easily understood,
because the next person to maintain your code could be you in a year, and you are going to have to figure out what you did, with a vague memory.

Example code

function Test-webrequest { ... }

Workflow Get-VIPTest
{
    param ([array]$Servers)
    foreach -parallel ($Server in $Servers)
    { Test-webrequest -uri $Server }
} #end workflow

$VIPS = "https://outbound.corp.net/abc/isalive", "https://outbound.corp.net/abc/isalive"

Get-VIPTest $VIPS |
Select-Object Connect, Code, URL, Errormsg, Success |
Tee-Object -Variable VIPReport |
Out-host



First I create the function to do all the dirty work.
function Test-webrequest {  ...  }

Here I leave a lot of detail out, but it's a sophisticated check of the URL it is given.
One painful but important lesson I tested and learned: each workflow instance session is isolated from the other PowerShell session variables; $Using:variable and similar tricks are not functional, and you end up with null responses.
On the upside, because each workflow instance session is isolated, if one fails and crashes, it won't take the other parallel sessions down. Of course, there are reasonable limits to this; for example, if one task runs away with resource consumption and crashes the OS, your program has failed at a higher order.
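
For illustration, here is a minimal stand-in for that function. It is not the original, just a hypothetical sketch that returns the properties the Select-Object above expects:

#A hypothetical, minimal stand-in for Test-webrequest - not the original.
#It returns the Connect, Code, URL, Errormsg, and Success properties used above.
function Test-webrequest
{
    param ([string]$Uri)
    try
    {
        #-TimeoutSec keeps a dead endpoint from hanging the check forever
        $r = Invoke-WebRequest -Uri $Uri -UseBasicParsing -TimeoutSec 15
        New-Object PSObject -Property @{
            Connect = $true; Code = $r.StatusCode; URL = $Uri
            Errormsg = ''; Success = ($r.StatusCode -eq 200)
        }
    }
    catch
    {
        New-Object PSObject -Property @{
            Connect = $false; Code = 0; URL = $Uri
            Errormsg = $_.Exception.Message; Success = $false
        }
    }
}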

Then I create a Workflow and set it to call the function I created.
Workflow Get-VIPTest
  { param ([array]$Servers)
    foreach -parallel ($Server in $Servers)
    { Test-webrequest -uri $Server }
  } #end workflow

The key feature that enables multi-threading in the workflow is the -parallel switch on the foreach statement.


    foreach -parallel ($Server in $Servers)


I minimize the portion of the Workflow code and use the Function to do all the dirty work.
Some of the things you can't do in a workflow, such as output with Write-Host, are, as I have tested, silently ignored. In other cases, such as Read-Host, you will get failures if you stray too far from the limits of what workflows can accomplish. So it's best to heed the guidelines as well as you can in a workflow.

Next I define the list of URLs in my example to check. Of course, this would be different for whatever systems you are testing and whatever your function code might do.

$VIPS = "https://outbound.corp.net/abc/isalive", …

This could very well be imported from a CSV, JSON, text, or other file.
I have found that the .Trim() method is a great precaution when importing data, as an extra non-printing character such as a tab or space can cause some code to throw an error because of the invisible character. (This can be quite baffling.)
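
For example, a one-liner like this (the file name is hypothetical):

#Trim stray whitespace from each imported line; vips.txt is a hypothetical file
$VIPS = Get-Content -Path .\vips.txt | ForEach-Object { $_.Trim() }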

Then there is the Kickoff.
Get-VIPTest $VIPS 

This kicks off the Workflow with all the URLs in a list of strings.

Get-VIPTest $VIPS |
     Select-Object Connect, Code, URL, Errormsg, Success | 
     Tee-Object -Variable VIPReport | 
     Out-host

And then, in my own inside-out method, I catch the output with a Tee-Object in $VIPReport, and put the results to the screen as they come in so I can watch the action. I have had some fuss with certain cmdlets in the flow holding up the output; for example, Sort-Object will hold things up so it can process the whole stream to do its work.

It is kind of dull seeing no results for a large number of tasks, it's like watching a toaster...

In my previous studies, I found that the number of parallel processes was limited to 5 at a time by default. I will look for a link for that later.
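
If the default is too conservative, the foreach -parallel statement accepts a throttle limit (added in PowerShell 4.0, if I recall; check against your version):

    #Raise the cap on simultaneous iterations; 16 is an arbitrary example value
    foreach -parallel -throttlelimit 16 ($Server in $Servers)
    { Test-webrequest -uri $Server }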

This workflow approach is great for sets of tasks with a variety of completion times. In a long queue of tasks, one task could sit waiting for completion while another slot is freed up to start the next session. This is similar to a bank with a row of tellers all handling requests from a single feed line; as each teller becomes available, another customer can be served.
Of course, all the tasks need to be driven by the same Function call. You get the advantage of all of them running in parallel, and the benefit of averaging the completion time across the group as a bonus.

In my next installment, I will go over the problem of when one of your tasks never returns for whatever reason.

Part 3

Tuesday, May 1, 2018

Parallel PowerShell - Part 1

Speaking of making code go faster...


I learned at the PowerShell Summit this year, from Joshua King, that doing repetitive tasks or loops one way may be quite a bit slower than doing them another way. I learned that functions get compiled by the PowerShell engine and don't continue to get interpreted. I do recommend his presentation to understand how he did his research and validation: Whip Your Scripts into Shape: Optimizing PowerShell for Speed.

This I had done without realizing the performance improvement it provided; I had chosen it because it worked better with the Workflow process.

It appears that Workflows also get compiled by the interpreter and get the same speed-up, so these options are good methods to speed things up. Workflows give you the capability to run things in parallel. Using functions within the workflow is the method I used to get the parallel features of Workflow while making the code easier to write and less restrictive.


Friday, April 27, 2018

PowerShell escape characters.

A while back, I couldn't find a definitive list of the regular escape characters for PowerShell, so I wrote a program that would generate them all and give me a clue what I might get for each letter.

I have since lost the code, and the output was a little cryptic, particularly with nonprinting characters like Tab, space, Carriage return, Line feed, and Bell.
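
A rough reconstruction of such a generator (not the lost original) might look like this:

#A rough sketch (not the lost original): expand each backtick escape and
#report the ASCII code(s) of what comes back. Written for Windows PowerShell;
#newer versions treat some escapes like `u specially.
97..122 | ForEach-Object {
    $letter = [char]$_
    #Build the literal text "`x" and let the parser interpret it
    $result = Invoke-Expression ('"`' + $letter + '"')
    New-Object PSObject -Property @{
        Ascii  = $_
        Letter = $letter
        Escape = '`' + $letter
        Result = ($result.ToCharArray() | ForEach-Object { [int]$_ }) -join ','
    }
}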

The Bell character "`a" was interesting. But it doesn't always make it back to me from a session.

The first two rows are for the quoting characters. If you need to embed a quote in the string you are creating, this is one easy way to get it placed literally and not interpreted by the parser.

As was noted in the Pluralsight Gotcha video, these are specific to PowerShell. You will find that if you pass a string to .NET or some other software, it may have a different escape character and different behavior.


Notes:
  • Single quote ' means that what is enclosed is literal, not to be processed.
  • Double quote " means that what is enclosed is interpreted.
  • The backtick ` is not the single quote ' (apostrophe).
  • The backtick key is typically in the upper-left corner of the keyboard.

PowerShell escape character table:

ASCII | Letter | Esc | Name            | Result ASCII
------+--------+-----+-----------------+-------------
   34 |   "    | `"  | Double quote    | 34
   39 |   '    | `'  | Single quote    | 39
   48 |   0    | `0  | Null            | 00
   96 |   `    | ``  | Back-tick       | 96
   97 |   a    | `a  | Bell            | 07
   98 |   b    | `b  | Backspace       | 08
  102 |   f    | `f  | Form feed       | 12
  110 |   n    | `n  | New line        | 10
  114 |   r    | `r  | Carriage return | 13
  116 |   t    | `t  | Tab             | 09
  118 |   v    | `v  | Vertical tab    | 11

*   They are Case Sensitive...  "`n" -ne  "`N"  etc. 


Monday, April 23, 2018

Out-Internet_explorer

Output to a web browser? But why?
I was fishing around in the PowerShell Studio snippets and ran across one called Trace.

I wanted to use this snippet to upgrade the output of my troubleshooting tool, ServerPulse, an update of a script that I originally wrote to give text output. With this connection to Internet Explorer, I could create something that was a bit easier on the eyes and stop the junior engineers from puckering when they looked at it. ServerPulse was a text-based, self-updating dashboard, which I thought was rather functional; however, I could see the aesthetics of making it browser output. I think there are a lot of possibilities in formatting the output as HTML, since it could then be delivered by file, email, or to a web browser.

This update would give me time to rethink what and how I was gathering the data. Keeping the live data refreshing feature of the troubleshooting tool would be important.

I did some tweaking on the base snippet.

I played with this on my workstation, and when I tried it on a server I got some weird errors.

The problem is that this requires a certain DLL to function: Microsoft.mshtml.dll.
It may not be loaded on a server or workstation in some conditions; I have read that MS Office installs it. Here is a more detailed article on how to add it if you get the error "Property 'innerhtml' cannot be found on this object; make sure it exists and is settable."
Check out http://www.dlldownloader.com/microsoft-mshtml-dll/ for a detailed explanation of the problem.

After fixing that, I was able to use it on any computer that had the DLL. I read that the Microsoft Office suite adds this DLL, which explained the initial confusion with the server version: the server had no reason to have MS Office installed.

So I created the following function from the snippet to post things to the Web page. You can repeatedly call this function to update information if you want to refresh or expand on the output. The flexibility of the Web browser interface is almost without limit.
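
The original snippet isn't reproduced here, but a minimal sketch of such an Out-IE function might look like this. It keeps the COM object in $IE so later calls refresh the same window, which matters for the note below about resetting $IE between sessions:

#A minimal sketch, not the original snippet. The innerHTML line is the part
#that needs Microsoft.mshtml.dll to resolve.
function Out-IE
{
    param ([string]$Html)
    if (-not $Global:IE)
    {
        #First call: open a fresh IE window
        $Global:IE = New-Object -ComObject InternetExplorer.Application
        $Global:IE.Navigate('about:blank')
        while ($Global:IE.Busy) { Start-Sleep -Milliseconds 100 }
        $Global:IE.Visible = $true
    }
    #Later calls just swap in the new HTML, refreshing the display
    $Global:IE.Document.body.innerHTML = $Html
}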

But... you have to feed it HTML.
Besides hello world, I thought a basic clock would be in order: convert the clock output to HTML, make a little adjustment to the function, and then feed the clock code's output to the function. It pops up a window, like this:
" Out-IE ($report) "

[Screenshot: the IE window showing the raw clock output]

Hard to look at and kind of cryptic. But reusing an old MS script example, I can append a block of formatting to make it a little prettier.

* NOTE: If you're playing with this at the shell prompt, make sure you run $IE = "" somewhere between sessions, or you won't get a new window... which was confusing for me at first.
And this is a feature, for being able to update the data.

The Refined  IE output portion now looks like:




Now some more examples of use.

Building on that code, I added another table and gleaned some additional info from the local system. Then you can loop the code to keep updating the output in IE. The next step was to add some more useful info. An example finished script:

Output of this looks like:


And this version updates the info on the screen; you can see the values change and the clock ticking.

Wednesday, April 18, 2018

Formatting Output in KB, MB, GB, TB ....

One of the things that has bugged me about PowerShell since I first worked with it is its penchant for giving me file directory listings with the file length in bytes. DIR in the CMD shell did the same to me.

The textbook answer, of course, is to simply divide the output by 1KB, 1MB, or 1GB, as these constants are built into the language. The obvious problem with this is that it assumes you know what size you're going to get, or that you want all the answers in the same units. But what if you have limited space to display the info, and you don't want to put 23B in the same column as 1293834556?
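
For example, the usual one-liner looks like this, and it only works well when you already know the right scale (install.exe is just a sample file):

#Fine when you already know MB is the right scale for this file
(Get-Item .\install.exe).Length / 1MB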

So recently I set about finding a way to have a function that would auto-scale the byte prefix. I played with the various number types available and put them to the test to see how high I could go with the prefixes, using a list I got from Wikipedia.

Experimenting, I worked out an algorithm for this, and it should work cross-platform.
I will update this when I hear of it being tested in something other than Windows PowerShell (WPS). I created these two functions: one to work with 1024-based KB units and a second to work with standard SI 1000-based units. The second is just a bonus from figuring out the byte calculations.

I added a few features to the function to make it more flexible. The -digits parameter lets you control how many decimal places you get on the output, and the -Units parameter lets you change the default "B" to "Bytes" or whatever fits your need.

I then wracked my brain over what to call it.
Having recently watched SAO: Ordinal Scale on Blu-ray with my wife, I looked up "ordinal" and it seems to fit. "Prefix" would also work, but that may have other implications.

While I was tweaking it for this article, I thought I could make a minor change and have it do the same autoscaling for regular 1000-based items. So this could also be used for anything from dollars to miles, or anything else you have to describe in large numbers.

If you're new to functions: you can incorporate this code into your script, or you can load it into memory. The latter is called dot sourcing; Google 'dot source PowerShell function'.
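
For example, loading the file below into the current session looks like this:

#Dot source the file to load its functions into the current session
. .\Convert-kBOrdinal.ps1
Convert-kBOrdinal 123456789    # -> 117.74MB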

Convert-kBOrdinal.ps1

<#  
    .NOTES
    ===========================================================================
     Created with:  SAPIEN Technologies, Inc., PowerShell Studio 2018 v5.5.150
     Created by:    Rich Stoddart
     Filename:      Convert-kBOrdinal.ps1    
    ===========================================================================
    .DESCRIPTION
        Functions to format output with standard metric prefixes for binary (1024)
        and decimal (1000) units.
        Created to format data concisely on reports; this will process numbers
        as large as you can hold in a [decimal] type.
    .EXAMPLE
        gci | % { New-Object -TypeName PSObject -Property @{ Length = ($_.Length | Convert-kBOrdinal); Name = $_.Name } }
        (gi .\install.exe).Length | Convert-kBOrdinal
#>

function Convert-kBOrdinal
{
    param (
        [Parameter(ValueFromPipeline, Position = 0, Mandatory = $true)]
        [decimal]$Num,
        [Parameter(Position = 1, Mandatory = $false)]
        [int]$digits = 2,
        [Parameter(Position = 2, Mandatory = $false)]
        [string]$Units = "B"
    )
    process
    {
        #The process block lets the function handle every piped value, not just the last
        if ($Num -lt 1E+28)
        {
            #Every 3 digits in the number is one step up the prefix ladder
            [int]$Ordinal = [math]::Floor((($Num.ToString()).Length - 1) / 3)
            [string]$Digit = [math]::Round($Num / [math]::Pow(1024, $Ordinal), $digits)
            [string]$Suffix = "$(((' KMGTPEZYB').Substring($Ordinal, 1)).Trim())$Units"
            return "$Digit$Suffix"
        }
        Else { return "OverRange" }
    }
}

function Convert-kOrdinal
{
    param (
        [Parameter(ValueFromPipeline, Position = 0, Mandatory = $true)]
        [decimal]$Num,
        [Parameter(Position = 1, Mandatory = $false)]
        [int]$digits = 2,
        [Parameter(Position = 2, Mandatory = $false)]
        [string]$Units
    )
    process
    {
        if ($Num -lt 1E+28)
        {
            #Same idea as above, but scaled by 1000 for standard SI units
            [int]$Ordinal = [math]::Floor((($Num.ToString()).Length - 1) / 3)
            [string]$Digit = [math]::Round($Num / [math]::Pow(1000, $Ordinal), $digits)
            [string]$Suffix = "$(((' KMGTPEZYB').Substring($Ordinal, 1)).Trim())$Units"
            return "$Digit$Suffix"
        }
        Else { return "OverRange" }
    }
}
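
A quick usage example, a directory listing with human-readable sizes:

#Directory listing with auto-scaled sizes (-File skips folders, which have no Length)
Get-ChildItem -File |
    Select-Object Name, @{ Name = 'Size'; Expression = { $_.Length | Convert-kBOrdinal } }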