
Powershell Log collection


I have been doing a bit of PowerShell to configure and interact with various Windows versions.  I built up some core scripts to use as my own kind of workshop for system review and administration.  I wanted to drop an example script to chat about.

One of the things I struggled to understand starting out was string substitution: being able to define a variable that would also consistently expand into a file path of my choosing.  TL;DR on that: wrap the variable you are referencing (for example, inside file paths) in a $() subexpression.  As seen below, I capture the COMPUTERNAME environment variable so it can be used in the output file names and logs.

# getEventLogs: Maintenance collection script.

$boxName = $env:COMPUTERNAME
$outEvt01 = ".\$($boxName)_EventLog_Apps.csv"
$outEvt02 = ".\$($boxName)_EventLog_System.csv"
$outSvc01 = ".\$($boxName)_Service-RunStates.log"
$outPorts01 = ".\$($boxName)_Network-Ports.log"
$outTask01 = ".\$($boxName)_Tasklist.log"
$outSchTsk01 = ".\$($boxName)_Scheduled-Tasks.log"
Filter timestamp {"Logs collected at $(Get-Date -Format "yyyy-MM-dd HH mm ss")"}

# Application and System event logs: most recent 100 entries each.
Get-EventLog application -newest 100 | Export-Csv $outEvt01
timestamp | Out-File -Append $outEvt01 -Encoding ASCII
Get-EventLog system -newest 100 | Export-Csv $outEvt02
timestamp | Out-File -Append $outEvt02 -Encoding ASCII

# Collect service list and current state of each.
Get-Service | Sort-Object status | Format-Table -AutoSize | Out-File $outSvc01
timestamp | Out-File -Append $outSvc01

# Get process list with relevant details at time of script exec.
cmd /c netstat -aon > $outPorts01
timestamp | Out-File -Append $outPorts01

cmd /c tasklist > $outTask01
timestamp | Out-File -Append $outTask01

Get-ScheduledTask | Select TaskName, State, TaskPath | Sort-Object -Property TaskPath | Format-table -wrap | Out-File $outSchTsk01
timestamp | Out-File -Append $outSchTsk01

# TODO: Merge all of these outputs into a single appended file.

# TODO: Stamp the date and time into said merged file.

Starting out at the top, I define a variable for the PowerShell equivalent of OS environment variables like %COMPUTERNAME%.  Trust me here, you don't want to try calling a %variable% directly in a PowerShell script.  That's what the first line of the script is for.
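A minimal sketch of why the $() subexpression matters in those path strings (variable names are just for illustration):

$name = $env:COMPUTERNAME
".\$name_EventLog.csv"      # breaks: PowerShell looks for a variable named 'name_EventLog', which is undefined
".\$($name)_EventLog.csv"   # works: $() ends the variable reference cleanly before the underscore

Without the $(), the underscore and following text get swallowed into the variable name, so the path expands to almost nothing.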

Each of the following variables defines an output path for one of the collections.  I use .csv exports for the larger data sets, since the default table output can heavily truncate data to fit the terminal width.
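A quick side-by-side of that truncation point (the .csv path here is just an example):

Get-Process | Format-Table                                # columns squeezed and truncated to fit the console width
Get-Process | Export-Csv .\processes.csv -NoTypeInformation   # full property values preserved in the file

-NoTypeInformation skips the type header line Export-Csv would otherwise write at the top of the file.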

A brief OCD DBA note.  Being a fan of databases and Microsoft SQL, I really value a good | (pipe) to run:

| Select *

after a command.  You can filter that raw output down to just the fields you want by writing a custom Select pipe.  There is an example of that for the scheduled tasks; I just wanted to spell out the logic, since it took me some time to figure out that this is how you can see what your options are for selecting output fields.
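That discovery workflow can be sketched in two steps:

Get-ScheduledTask | Select-Object * -First 1                  # dump every property of one task to see your options
Get-ScheduledTask | Select-Object TaskName, State, TaskPath   # then keep just the fields you want

The first line is the "Select *" from the database analogy; -First 1 limits it to a single object so the property dump stays readable.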

The other variables for file paths exist so I do not have to type the same string twice or more.  As you can see in the actual commands, I add an Out-File -Append to insert the date string into each file.

Filter timestamp is my means of defining the date output string.  That timestamp is evaluated as each file is written, so every output carries a matching collection time.  Think of filter in this context as a lightweight function.
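For comparison, the same thing written both ways (a filter is roughly a function whose body runs as a process block):

filter timestamp { "Logs collected at $(Get-Date -Format 'yyyy-MM-dd HH mm ss')" }

# Roughly equivalent function form:
function timestamp { process { "Logs collected at $(Get-Date -Format 'yyyy-MM-dd HH mm ss')" } }

Either one can then be piped into Out-File -Append exactly as the script does.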

The rest of the script uses either PowerShell cmdlets or OS-level commands to obtain the data I am looking for and save it to the output files.  I experimented both ways to see which output best matched the task at hand.

The event log exports are pretty simple: call the 100 most recent events, save them to a .csv, then add the date string to the end of the file.

The service list is sorted and exported to a .log file with the date string appended (as it is for the other four output files as well).

' cmd /c ' runs a Windows command while keeping PowerShell's own keywords and operators out of the way on that line.  That is hugely helpful to know when processing content with an OS-level command; otherwise you will hit really esoteric failures whose causes you would rather not have to reverse-engineer.  cmd /c is quite nice.  FYI.
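A small illustration of the difference.  cmd's own syntax, like & as a command separator and %VAR% expansion, only works when cmd itself parses the line:

cmd /c "echo first & echo second"    # cmd runs both echoes; quoting hands the whole line to cmd
cmd /c "echo %COMPUTERNAME%"         # cmd expands the %variable%

Run directly in PowerShell, echo %COMPUTERNAME% would just print the literal text %COMPUTERNAME%, since PowerShell does not expand that syntax.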

Neat.  We are at the part I rambled about above in relation to databases and filtering content.  I did not need many of the details in the raw output showing every parameter of that PowerShell cmdlet.  After selecting the relevant fields, I sort on the TaskPath field (to put the non-OS tasks first in the list), apply -Wrap to the Format-Table output, then write the data to a local file.

I have done some scripts with loops and condition evaluations, but I will stop here for the moment.  If you want to gather some information about an environment, hopefully this example points you in the right direction for your data collections.

Let me end with a link to a great resource: SS64 has good references and examples.  They have been very helpful in conjunction with the Windows PowerShell manuals.

Edited by Pic0o
Markup and Select instead of Where in 2nd code example.

This thread is more like my personal notes than a guide, especially since the consistent file-path variable was something I only recently got my head around.  Hopefully it made sense in its current format.

I wanted to mention that you will get limited results if you do not run PowerShell from an elevated (administrator) prompt.  Otherwise cmdlets like Get-ScheduledTask will not show all jobs on the machine.
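If you want the script to check for that up front, this is a common elevation-test sketch (the warning text is just an example):

$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Warning "Not elevated - some collections (e.g. Get-ScheduledTask) may be incomplete."
}

Dropping that at the top of the collection script makes the partial-results case visible instead of silent.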
