As a server administrator, one of my duties is to review log files for anomalous behavior, errors, or other issues. For me, combining all the daily logs into one big file makes that review much easier.
Yes, I review the logs daily, which is a tedious job in itself. My monthly task involves combining all the daily logs for the month into one file, then ingesting the data into a spreadsheet. From there, I can make all kinds of pretty graphs, charts, and tables that reflect traffic trends over the month.
But the first step is compiling the data for easy ingestion into my spreadsheet. My web host runs Windows, so it uses Internet Information Services (IIS), which is configured to create daily log files. Importing each of those daily files into a spreadsheet one at a time is tedious.
I used the following PowerShell script to combine all the log files into one handy file:
# Folder that holds the daily log files, and the combined file to create
$WorkPath = "c:\temp\logfiles\"
$OutputFile = $WorkPath + "JoinedFile.log"

# Gather every .log file in the working folder
$LogFiles = Get-ChildItem -Filter *.log -Path $WorkPath

ForEach ($LogFile in $LogFiles)
{
    # Display the file currently being processed
    $CurrentFile = $LogFile.FullName
    $CurrentFile

    # Append the file's contents to the combined output file
    Get-Content $CurrentFile | Out-File -Append $OutputFile -Encoding ascii
}
What it Does:
The first two assignments simply set a couple of values: one for the location of the log files ($WorkPath), and one for the name of the combined output file ($OutputFile).
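As an aside, the string concatenation works fine, but if you would rather not worry about whether $WorkPath ends with a backslash, Join-Path builds the same path. This is an optional variation, not something the original script needs:

# Builds c:\temp\logfiles\JoinedFile.log whether or not $WorkPath ends with a backslash
$OutputFile = Join-Path $WorkPath "JoinedFile.log"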
Next, a list of the files is gathered up into $LogFiles. For this exercise, we are only interested in files with the .log extension, so that extension is specified with the -Filter parameter.
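If you only want a single month's worth of logs, or want to guarantee they land in the combined file in date order, the list can also be narrowed and sorted up front. The sketch below assumes IIS's default daily naming of u_exYYMMDD.log, and the month shown is purely a made-up example:

# Hypothetical example: only September 2024 logs, sorted by name (and therefore by date)
$LogFiles = Get-ChildItem -Filter "u_ex2409*.log" -Path $WorkPath | Sort-Object Name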
Inside the ForEach loop, the full name of the current file is stored in $CurrentFile and then displayed on screen. It’s always good to have a visible indication of progress, so the user can see what is going on.
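If you would rather have a proper progress bar than file names scrolling by, Write-Progress can provide one inside the same loop. This is just a sketch of that variation, not part of the original script:

$Count = 0
ForEach ($LogFile in $LogFiles)
{
    # Update a progress bar with the current file name and percentage complete
    $Count++
    Write-Progress -Activity "Combining log files" -Status $LogFile.Name -PercentComplete (($Count / $LogFiles.Count) * 100)

    Get-Content $LogFile.FullName | Out-File -Append $OutputFile -Encoding ascii
}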
Back in the original script, the last line of the loop grabs the content of $CurrentFile and pipes it out to $OutputFile in append mode.
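One caveat: JoinedFile.log sits in the same folder and also matches *.log, so running the script a second time would sweep the previous combined file up along with the dailies. A small guard, my own addition rather than part of the original, keeps it out of the list:

# Skip the combined output file if it is already present in the folder
$LogFiles = Get-ChildItem -Filter *.log -Path $WorkPath | Where-Object { $_.FullName -ne $OutputFile }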
Simple, straightforward, and quick. Got a better way? Show us in the comments.