Every now and again I stumble across a tool that does exactly what I am looking for: no ifs or buts, just exactly what I need. It is a rare moment and something to savour.
Today it happened, and the tool is Microsoft's Log Parser. This is a tool that knows how to read IIS and Windows event logs out of the box, lets you specify filters and columns using a SQL-like syntax, and dumps the results straight into a SQL Server database. How freakin' amazing is that! To build this yourself would involve writing scripts that use WMI or .NET to interrogate the logs, producing text files in a BCP-friendly format, then shelling out to the BCP utility to import them into a database. In other words: lots of work. This can now all be achieved in one line of code:
LogParser -i:EVT -o:NAT "SELECT * INTO c:\Output.txt FROM System"
The example above takes all the fields from the Windows System event log and places them into a text file called Output.txt. Once you have installed Log Parser you can copy that line into a command prompt on any Windows machine and, provided you have the necessary permissions, dump the entire System event log to a text file. It's quick too: I processed ~2,600 log entries into a text file in 0.7 seconds on my laptop.
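The SQL-like syntax also supports column selection and WHERE clauses. As a sketch (the field names come from the EVT input format; run LogParser -h -i:EVT to list them all), this pulls just the error events:

LogParser -i:EVT -o:NAT "SELECT TimeGenerated, SourceName, Message FROM System WHERE EventTypeName = 'Error event'"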
I am still investigating all the features. If you specify the SQL output format (-o:SQL) you have either to set up a DSN to your database or supply the SQL Server and database name (if using SQL authentication you also need a username and password). After the INTO you specify the table name; when the command executes, if the table does not exist it will be created for you using the columns in the input format. If you want to import into an existing table, the column names, data types and order of columns must match. Read the excellent help file that comes with the install. See the example below.
Finally, there is one more cool feature I would like to mention. If you wish to continually upload log entries from a log file as it grows, but want to ignore records you have already processed, you are in luck! Use the -iCheckPoint parameter and Log Parser will create a file where it stores the position it last read up to in the log file you are processing. This removes the need to clear out the database and reload everything, which means a lot less horsepower is required for processing your logs. You can then reduce the interval between log uploads to keep your database more current.
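For example (a sketch only: the checkpoint file name System.lpc and the EventLog table are my own choices), a scheduled batch file could upload just the new System event log entries each run:

LogParser -i:EVT -iCheckPoint:System.lpc -o:SQL "SELECT * INTO EventLog FROM System" -server:Laptop\SqlExpress -database:Keeper -driver:"SQL Server"

On the first run the whole log is uploaded and the checkpoint file is created; subsequent runs pick up from where the last one left off.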
It also has a COM API, so you can call it directly from C# or VB. I can't believe it took me this long to discover such a useful tool.
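As a quick sketch of the scripting side (the MSUtil.LogQuery ProgIDs below are the ones listed in the Log Parser help file; treat the details as a starting point rather than gospel), this VBScript echoes the source of the five most recent System events:

' Query the System event log via Log Parser's COM API
Set oLogQuery = CreateObject("MSUtil.LogQuery")
Set oInput = CreateObject("MSUtil.LogQuery.EventLogInputFormat")
Set oRecords = oLogQuery.Execute("SELECT TOP 5 TimeGenerated, SourceName FROM System", oInput)
Do While Not oRecords.atEnd
    WScript.Echo oRecords.getRecord().getValue("SourceName")
    oRecords.moveNext
Loop
oRecords.close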
SQL Server Example:
LogParser -i:EVT -o:SQL "SELECT * INTO EventLog FROM System" -server:Laptop\SqlExpress -database:Keeper -driver:"SQL Server" -createTable:ON
If you have IIS set to create daily log files (the default) then a bit of code is needed in your batch file to determine what the current filename will be. The following example works on an XP/2003 server that has its local date format set to the UK. If you are deploying to a server with the US date format you will need to change the code that sets the CurrentLogFile variable.
REM Use on servers set to UK date (assumes %date% expands to dd/mm/yyyy)
REM SET the path to the IIS Log Files folder. NOTE: If your IIS server has multiple websites you may have to change the path.
SET LogFilesFolder=C:\WINDOWS\system32\LogFiles\W3SVC1\
REM Daily W3C log files are named exyymmdd.log, so build today's name from the date
SET CurrentLogFile=ex%date:~8,2%%date:~3,2%%date:~0,2%.log
ECHO Attempting to process IIS Log File: %LogFilesFolder%%CurrentLogFile%
LogParser -i:IISW3C -o:SQL "SELECT * INTO IISLog FROM %LogFilesFolder%%CurrentLogFile%" -server:Laptop\SqlExpress -database:Keeper -driver:"SQL Server" -createTable:ON
Update: Found a Gotcha
I discovered a gotcha when using the -createTable:ON flag to import IIS logs into SQL Server: the created columns are capped at a length of 255 characters. Columns like csReferrer, csUriQuery, csUriStem and csUserAgent will often exceed 255 characters. To fix this problem, simply increase the length of those columns.
I used csCookie: varchar(5000), csUriQueryString: varchar(5000), csUriStem: varchar(1000), csUserAgent: varchar(1000)
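Assuming the table was created as IISLog (as in the example above), the widening can be done in T-SQL along these lines; do check the column names that -createTable actually generated in your database first, as mine are from the author's list above:

ALTER TABLE IISLog ALTER COLUMN csCookie varchar(5000)
ALTER TABLE IISLog ALTER COLUMN csUriQueryString varchar(5000)
ALTER TABLE IISLog ALTER COLUMN csUriStem varchar(1000)
ALTER TABLE IISLog ALTER COLUMN csUserAgent varchar(1000)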