- Tom Kopchak
- May 04, 2020
- Tested on Splunk Version: 7.3 and 8.0
Are you looking to bring Windows PowerShell logs into Splunk? This tutorial will walk you through the configuration process to get these transcription logs into Splunk.
Over the past few months, I’ve talked to a number of clients who were interested in bringing Windows PowerShell transcription logs into Splunk. One challenge with these logs is that they’re not the easiest to enable, especially since transcription isn’t turned on by default in Windows. They also aren’t the easiest to work with: much of the information I found while working through this project turned out to be inaccurate or incomplete.
My goal in this tutorial is to simplify the process of enabling PowerShell transcription logging and demonstrate how to get this data into Splunk. I can’t say this process is perfect, and I know that there will be areas of improvement, but I hope it will be helpful.
Last year when planning the National Collegiate Penetration Testing Competition (@NationalCPTC), I posed a challenge to our research and monitoring team–we needed a scripted mechanism for enabling several types of logging, including PowerShell transcription logs. Tim Ip took on this challenge and produced several of the scripts that we used in the final environment build. He has made these tools available on GitHub.
This script is a standalone method for enabling this logging and writing the files to C:\pstrans\. If you need to write the output elsewhere, the script can be modified to accommodate that. If you’re not comfortable downloading random scripts from the Internet and running them on your computers and servers (a healthy instinct), feel free to deconstruct this script and modify it to your specifications and needs.
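If you'd like to see the moving parts before running anything, enabling transcription ultimately comes down to a few registry values under the PowerShell Transcription policy key. The sketch below is not Tim's script verbatim, but it uses the standard Group Policy-backed registry path and value names for this setting:

```powershell
# Sketch: enable PowerShell transcription via the Group Policy-backed
# registry values. Run from an elevated PowerShell prompt.
$path = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription'

# Create the Transcription policy key if it doesn't exist yet
New-Item -Path $path -Force | Out-Null

# Turn transcription on, include invocation headers, and set the output directory
Set-ItemProperty -Path $path -Name EnableTranscripting    -Value 1 -Type DWord
Set-ItemProperty -Path $path -Name EnableInvocationHeader -Value 1 -Type DWord
Set-ItemProperty -Path $path -Name OutputDirectory        -Value 'C:\pstrans\' -Type String
```

Deconstructing the full script is still worthwhile, since it may handle additional logging types beyond transcription.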
For our audio-visual learners, I've created a video demo walking you through how to get everything set up in Windows to write log files. You may view the demo below.
This PowerShell script makes registry changes and must be run as an administrator. Start by searching for PowerShell in the Start menu, right-clicking the icon, and choosing “Run as administrator” (feel free to use another method to accomplish this if you prefer).
Next, you’ll probably be tempted to navigate to the location of the script and run it. However, if you simply try to run the script on a default Windows deployment, it will be blocked:
Instead, try running the script with an execution policy specified:
powershell.exe -ExecutionPolicy Bypass -File .\powershell_logging.ps1
Important security note: There are many ways to change the PowerShell execution policy to allow scripts to run. I chose this method because the bypass applies only to this single invocation, so there’s no permanent permissions change that you may forget to unset after completing this task.
Next, reboot your system. Once the system reboots, run a PowerShell session and you’ll be able to see a file created for each session:
Sample PowerShell Transcript:
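The original screenshot isn't reproduced here, but a transcript from a Windows PowerShell 5.1 interactive session looks roughly like the following (the header is abridged and all values are illustrative):

```
**********************
Windows PowerShell transcript start
Start time: 20200504101500
Username: CONTOSO\alice
RunAs User: CONTOSO\alice
Machine: WORKSTATION1 (Microsoft Windows NT 10.0.17763.0)
Host Application: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Process ID: 4321
PSVersion: 5.1.17763.1007
**********************
PS C:\Users\alice> whoami
contoso\alice
```

Note that everything after the header banner is free-form session output, which is what makes line breaking in Splunk tricky later on.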
Now that we have this data, let’s get it into Splunk!
For getting this data into Splunk, I’m going to assume you already have a working Splunk deployment, with a Universal Forwarder installed on each Windows machine you want to collect from. With that in place, let’s get started with collecting this data!
I’ve created a Splunk Add-on, now available on Splunkbase, to support the line-breaking and parsing of the PowerShell logs. This add-on should be installed on your search heads and indexers, as it contains both search and index time operations. If your Universal Forwarders send data to Splunk via Heavy Forwarders, you’ll want this app to be installed on the Heavy Forwarders in place of the indexers.
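To give a sense of what the index-time side of the add-on has to handle, line breaking for transcript files needs to key off the header banner rather than blank lines. A hypothetical props.conf stanza along these lines illustrates the idea (this is a sketch, not necessarily what the published add-on ships):

```
[powershell:transcript]
# Break a new event only at the transcript start banner
BREAK_ONLY_BEFORE = \*{20,}\r?\nWindows PowerShell transcript start
# Parse the timestamp from the "Start time:" header line
TIME_PREFIX = Start time:\s+
TIME_FORMAT = %Y%m%d%H%M%S
MAX_TIMESTAMP_LOOKAHEAD = 20
```

Because a single interactive session can be long, the real configuration also has to account for event size limits, which the published add-on addresses.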
We’ve also made this app available on GitHub to allow for additional eventtypes and tags to be added. If you identify any improvements, please let me know.
Once we have the app deployed to our Splunk infrastructure, we will also need to create a new inputs app which we will deploy to any Windows machines where we want to collect PowerShell logs. For this example, we’ll call this app uf_powershell_inputs.
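On disk, a minimal inputs app like this is just a folder containing a default/inputs.conf (the name uf_powershell_inputs is our convention here, not a requirement):

```
uf_powershell_inputs/
└── default/
    └── inputs.conf
```

Deploy it to your Windows Universal Forwarders via your deployment server or whatever configuration management you normally use.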
This app contains an inputs.conf file, with the following configuration:
```
# Monitor PowerShell transcript logs
[monitor://C:\pstrans\*\*.txt]
sourcetype = powershell:transcript
index = powershell
disabled = 0
multiline_event_extra_waittime = true
time_before_close = 300

# Monitor PowerShell Windows Event Logs
[WinEventLog://Microsoft-Windows-PowerShell/Operational]
disabled = 0
renderXml = 1
index = powershell
source = XmlWinEventLog:Microsoft-Windows-PowerShell/Operational
sourcetype = XmlWinEventLog
```
A note on the time_before_close configuration: this is needed so that interactive PowerShell sessions are broken into events correctly. The value of 300 seconds (5 minutes) is a trade-off between not splitting sessions apart and avoiding excessive event indexing delays. However, with this configuration an interactive session where the user pauses for more than 5 minutes will end up broken into separate events, and the later events will not include the header information–including the user who executed the command.
While it might be tempting to avoid these event breaking issues by increasing the timeout to a larger value, doing so delays data ingestion: the Universal Forwarder waits at least this long after the last write before sending an event to Splunk. With this example configuration, events show up after 5 minutes.
What value you choose will depend on your environment and your use case for this data.
The Splunk app has field extractions and tagged event types to provide a starting point for using this data. Since these logs are inherently free-form (other than the header), there aren’t many field extractions available. However, I’ve created a number of tags to identify some common uses of PowerShell that may warrant further investigation.
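As a starting point, a simple search like the following will surface transcript activity. The field names here (Username, Machine) assume extractions from the transcript header, so adjust them to whatever your deployment actually extracts:

```
index=powershell sourcetype=powershell:transcript
| stats count AS sessions BY Username, Machine
```

From there, the tags shipped with the app can be layered on to narrow results to potentially suspicious PowerShell usage.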
I’ve also created a tutorial on using these logs, based on data collected during the 2019 National CPTC. This data was collected before we worked out some of the line breaking issues, so it’s not perfect, but it provides an adequate example of how the event breaking and tagging work. You may view the demo below.
Hopefully this tutorial is helpful for both configuring Windows to create PowerShell transcription logs and making sense of the results in Splunk. These aren’t the easiest logs to work with, and there will be additional attacker techniques that aren’t covered in the initial release of this app. I’m excited to hear about additional use cases that we can get added to this app–with your help.
This project would not have been possible without the help of many others, including: