Logging PowerShell Scripts

As you progress from running commands at the console, to scripting tedious work, to automating entire processes, you still need to keep an eye on what's running. Nobody likes figuring out what went wrong in a process by debugging a script after the fact. You should integrate logging into any script that runs unattended: it saves time and surfaces errors. Luckily, saving to a log file in PowerShell is simple.

Out-File

Out-File is the simplest way to write output to a file. Just add Out-File to the end of your pipeline to save the pipeline's output to a file.

$File = "~\Desktop\Out-File.log"
'[Error]: Service failed to restart' | Out-File -FilePath $File -Append
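
In practice, you'll also want to know when each entry was written. A minimal sketch that prefixes each line with a sortable timestamp (the message itself is just a placeholder):

"$( Get-Date -Format s ) [Error]: Service failed to restart" | Out-File -FilePath $File -Append

Get-Date -Format s produces an ISO 8601 timestamp, so entries line up chronologically when you read the log back.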

You can also catch errors and write the contents of the error record to a file.

$File = "~\Desktop\Out-File.log"

try {
    Invoke-RestMethod -Uri 'https://doesnotexist' -ErrorAction Stop
} catch {
    $ErrorOutput = $_
    $ErrorOutput | Out-File -FilePath $File -Append
}

Tee-Object

The Tee-Object cmdlet allows you to split the direction of the pipeline. If you have a Linux background, this will sound familiar. The pipeline output still continues on, but you can also save it off to a file along the way. This makes it easy to see what's going on when you're at the console: the output still appears on the screen, but you keep a copy for later viewing.

$File = "~\Desktop\Tee-Object.log"
$Result = Get-Process -Name PowerShell | 
    Tee-Object -FilePath $File -Append 

$Result | 
    Select-Object -Property Name,CPU,ID | 
    Tee-Object -FilePath $File -Append

You can see that the first command saves the output of Get-Process both to the file on your desktop and to the $Result variable. Then we pipe the previous result out, reshape it with Select-Object, and call Tee-Object again. This time, the output goes to the console window as well as the log file.

Using Tee-Object is a great way to keep pipeline output in the console and in a permanent log. As with Out-File, remember to use the -Append parameter. You probably don’t want to overwrite the log file with each write.

Write to the Event Log

On Windows, you can also write to the event log, which timestamps entries for you and makes them visible in Event Viewer. First register a custom log and event source with New-EventLog, then write entries with Write-EventLog:

$Logname = 'Getchell.org'
$Source = 'Automated Script'

$NewLog = @{
    Logname = $Logname
    Source = $Source
}

try {
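    # Creating the log and registering the source requires an elevated
    # session and only needs to happen once per machine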
    New-EventLog @NewLog -ErrorAction Stop
} catch [System.InvalidOperationException] {
    # Ignore if source already exists
}

$InformationalLog = @{
    Logname = $Logname
    Source = $Source
    EntryType = 'Information'
    Message = 'Verbose informational message'
    EventId = 19
}

Write-EventLog @InformationalLog
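
To confirm the entry was written, you can read it back. A quick check with Get-EventLog, using the log name we registered above:

Get-EventLog -LogName $Logname -Newest 5

Note that these *-EventLog cmdlets ship with Windows PowerShell; they were removed in PowerShell 6 and later.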

Other Considerations

Object Serialization

Working with objects can be a pain when formatting for logging purposes. If you rely on the default pipeline view, you may lose data to the PowerShell formatting system. A quick way around this is to pipe to ConvertTo-Json before writing the pipeline to a log. This serializes the object so it can be written to disk without data loss. ConvertTo-Json will also expand any nested properties, something Select-Object -Property * and the Format-* cmdlets won't do. Keep in mind that ConvertTo-Json only serializes two levels deep by default; pass a larger -Depth if you need more.

$File = "~\Desktop\object-json.log"

Get-Service -Name BITS | 
    ConvertTo-Json | 
    Tee-Object -FilePath $File -Append
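
One wrinkle with appending JSON: multi-line documents are hard to parse back out of a shared log file. A sketch of a workaround is to compress each entry onto a single line, so the log can be read back entry by entry:

Get-Service -Name BITS |
    ConvertTo-Json -Compress |
    Tee-Object -FilePath $File -Append

# Each line is now a standalone JSON document
Get-Content -Path $File | ForEach-Object { $_ | ConvertFrom-Json }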

Log Filenames

Another thing to consider is naming your logs with timestamp data. If you ever have to debug your automated script, you'll thank yourself later for keeping each run of your process in a separate file. I usually use the following convention:

$Log = ".\log\process-name-$( Get-Date -Format FileDateTime ).log"

Get-Date has a built-in format that keeps invalid characters out of a date/time string so it can be used in filenames. I reference this constantly.
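
For example, the FileDateTime format yields a local-time stamp along these lines (your digits will differ):

Get-Date -Format FileDateTime
# 20190627T0840107271

Because the stamp sorts lexically, a directory listing of the log folder doubles as a chronological history of runs.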

Viewing Logs in Real-Time

I prefer to keep logs as files, as I have more experience with grep than with the Event Viewer. For long-running processes, I tail the open log files to see progress in real time. This is great for debugging scheduled tasks when you aren't at the console.
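
PowerShell can handle the tailing itself. Get-Content with -Wait streams new lines as they are written; a minimal sketch, assuming the log path from earlier:

Get-Content -Path "~\Desktop\Out-File.log" -Wait -Tail 10

-Tail 10 starts with the last ten lines, and -Wait keeps watching the file until you press Ctrl+C.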