Amacc

Complex Powershell Functions

The other day I mentioned my distaste for using loops like foreach in
PowerShell. I feel that in most cases where a foreach loop is used,
there is a more elegant way to do the same work with the pipeline. Someone
referenced the Out-IniFile script that has been uploaded, and I think it is
actually a great example of a function that is a little more complicated
than it needs to be.

First, I would say the foreach should probably go. PowerShell hashtables
have a very handy method called GetEnumerator, which splits the key/value
pairs into a stream of objects with Name and Value properties. This works
great with the pipeline, because a downstream function can bind those
properties by name straight into its own typed parameters. That lets you
leverage the PowerShell type system to ensure invalid data won't be passed
down, which eliminates type checks like
!($($InputObject[$i].GetType().Name) -eq "Hashtable").
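
For example, here is a minimal sketch of that idea; the Show-Pair function
and the sample hashtable are made up purely for illustration:

function Show-Pair{
    param(
        # Bound by property name from the objects GetEnumerator emits
        [Parameter(ValueFromPipelineByPropertyName)][string] $Name,
        [Parameter(ValueFromPipelineByPropertyName)][int] $Value
    )
    process{ "Key '$Name' has value $Value" }
}

@{ a = 1; b = 2 }.GetEnumerator() | Show-Pair
# Key 'a' has value 1     (hashtable enumeration order is not guaranteed)
# Key 'b' has value 2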

Next, I would say the structure needs to be reworked to take full advantage
of the pipeline. In pursuit of that goal, and to keep things DRY, I try to
organize my scripts around common actions: if you're writing things to a
file, can you batch those writes in a single place and just take in a stream
of items? This Out-IniFile script actually lends itself pretty well to that,
since there are really only a few things it does:

  • Parse the input
  • Write verbose output
  • Write to the file

This translates easily into a couple of really simple functions.
The first function is going to take an input object, break it apart
into strings, and then write the content to a file.

function Out-IniFile{
    param(
        [Parameter(ValueFromPipeline)][hashtable] $InputObject,
        [string] $FilePath,
        [string] $Encoding = 'UTF8'
    )
    process{
        $InputObject.GetEnumerator() |  # Split the input item by name & value
            Out-INI                  |  # Get the strings to write
            Add-Content -Path $FilePath -Encoding $Encoding  # -Value is bound from the pipeline
    }
}
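
A quick usage sketch (the file path and settings here are invented purely
for illustration): each top-level key becomes an INI section, and each
nested pair becomes a key=value line.

@{
    database = @{ host = 'localhost'; port = 5432 }
    logging  = @{ level = 'verbose' }
} | Out-IniFile -FilePath .\settings.ini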

The second function just needs to accept a Name and a Value (using the
type system for validation) and pass the new strings downstream. Within a
function you can return data two ways: with the good old return statement,
or by emitting the data, simply placing an expression on its own line. The
first is useful for controlling code flow, say when you need to return an
error code and stop processing. The second is useful for sending multiple
items, one at a time, to a downstream pipeline processor (it works much
like the yield keyword in Python).
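
Here is a minimal sketch of the difference; both function names are hypothetical:

function Get-ErrorCode{ return 1 }        # return sends 1 downstream and stops the function
function Get-Numbers{ 1; 2; 3 }           # each bare expression is emitted, like yield

Get-Numbers | ForEach-Object { $_ * 10 }  # 10 20 30 -- items stream down one at a time

With that in mind, the second function looks like this: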

Function Out-INI{
    param(
        [Parameter(ValueFromPipelineByPropertyName)][string] $Name,
        [Parameter(ValueFromPipelineByPropertyName)][hashtable] $Value
    )
    process {
        "[$Name]"                            # Emit Header
        $Value.GetEnumerator() |
            %{ "$($_.Name)=$($_.Value)"}     # Emit key=value
        ""                                   # Emit empty line
    }
}

This lets us capture the section header, all of the keys inside it, and a
final empty line; the first function then writes all of those lines to the
file.
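
For instance, piping a single made-up section straight into Out-INI would emit:

[pscustomobject]@{ Name = 'logging'; Value = @{ level = 'verbose' } } | Out-INI
# [logging]
# level=verbose
# (a trailing empty line separates sections)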

This code can be found on the PowerShell Gallery at
https://www.powershellgallery.com/packages/PowershellTools/1.1.1

Top comments (1)

Michael C Cook Sr.

I wrote a response for you in regards to your suggestions here.

.GetEnumerator() is definitely a big help, and I implemented it into a new revision of the original function.

reddit.com/r/PowerShell/comments/e...

-MC