When parsing IIS7 logs in W3C format, developers often encounter conflicting information about the time-taken
field's unit of measurement. Microsoft's official documentation states it's measured in milliseconds, while older W3C specifications suggest seconds.
"The length of time that the action took, in milliseconds."
(Source: Microsoft TechNet)

"Time taken for transaction to complete in seconds"
(Source: W3C Logfile Format Working Draft, 1995)
To resolve this, I analyzed actual IIS7 log entries:
```
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status time-taken
2023-01-15 08:23:45 192.168.1.1 GET /default.aspx - 80 - 10.0.0.1 Mozilla/5.0 200 0 0 46
```
The value 46 only makes sense as milliseconds (46 ms); read as seconds, a 46-second response for a simple default.aspx request would be implausibly slow.
When processing logs programmatically, always treat time-taken
as milliseconds. Here's a C# parsing example:
```csharp
using System;
using System.Globalization;

public class LogEntry
{
    public DateTime Timestamp { get; set; }
    public int TimeTakenMs { get; set; }
    // Other properties...

    // Parses one data line. Callers should skip directive lines starting with '#'.
    // Assumes the #Fields layout shown above, where time-taken is the last field.
    public static LogEntry Parse(string line)
    {
        var parts = line.Split(' ');
        return new LogEntry
        {
            // W3C log timestamps are written as "yyyy-MM-dd HH:mm:ss"
            Timestamp = DateTime.ParseExact(parts[0] + " " + parts[1],
                "yyyy-MM-dd HH:mm:ss", CultureInfo.InvariantCulture),
            TimeTakenMs = int.Parse(parts[parts.Length - 1], CultureInfo.InvariantCulture)
            // Other parsing...
        };
    }
}
```
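As a usage sketch (the file name is illustrative, and it assumes the #Fields layout above with time-taken as the last column), the parser can be run over a whole log while skipping the `#` directive lines:

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical usage: read one IIS log, skip '#' directive lines,
// and parse each data line with the LogEntry class above.
// Assumes the file contains at least one data line.
var entries = File.ReadLines("u_ex230115.log")
    .Where(line => !line.StartsWith("#"))
    .Select(LogEntry.Parse)
    .ToList();

Console.WriteLine($"Parsed {entries.Count} requests, " +
    $"average time-taken: {entries.Average(e => e.TimeTakenMs):F1} ms");
```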
The discrepancy stems from the evolution of the standards. Early web servers logged in seconds for simplicity, while modern systems such as IIS switched to milliseconds for finer-grained measurement. The W3C spec referenced here is the 1995 working draft (WD-logfile.html), whereas Microsoft's implementation reflects current practice.
- Always verify with actual log data
- Document your assumption (milliseconds for IIS7+)
- Add unit conversion comments in your code
- Test edge cases (very large values may indicate a unit mismatch or a hung request; see the sketch below)
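As a rough sketch of the last point, using the entries list from the example above (the 30-second threshold is an arbitrary assumption, not an IIS default):

```csharp
using System;
using System.Linq;

// Flag entries whose time-taken looks suspiciously large; such values can
// point at a unit mismatch or a genuinely hung request. The threshold is
// an arbitrary assumption - tune it to your own traffic profile.
const int ThresholdMs = 30_000; // 30 seconds
var suspicious = entries.Where(e => e.TimeTakenMs >= ThresholdMs).ToList();

foreach (var entry in suspicious)
{
    Console.WriteLine($"{entry.Timestamp:o}  {entry.TimeTakenMs} ms - review this request");
}
```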
```powershell
# Average time-taken (ms) across IIS W3C logs; skips '#' directive lines
# and assumes time-taken is the last field, as in the #Fields line above.
$values = Get-Content -Path "ex*.log" | Where-Object { $_ -notmatch '^#' } |
    ForEach-Object { [int]($_ -split ' ')[-1] }
$avgTime = ($values | Measure-Object -Average).Average
Write-Host "Average response time: $avgTime ms"
```
When working with logs from different web servers, note that the unit differs:

| Server | Duration field | Unit |
|---|---|---|
| IIS7+ | time-taken | Milliseconds |
| Apache | %D | Microseconds |
| Nginx | $request_time | Seconds (millisecond resolution) |
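Because of those differences, it helps to normalize everything to a single unit before comparing servers. The sketch below is a minimal illustration; the enum members and conversion factors are assumptions that mirror the table above (Apache's %D logs microseconds, Nginx's $request_time logs seconds):

```csharp
using System;

// Normalize "time taken" values from different servers to milliseconds so
// they can be compared directly. Enum members and factors mirror the table
// above; they are illustrative, not exhaustive.
public enum ServerKind { Iis, ApacheMicroseconds, NginxSeconds }

public static class TimeTakenNormalizer
{
    public static double ToMilliseconds(double rawValue, ServerKind kind) => kind switch
    {
        ServerKind.Iis                => rawValue,          // IIS7+: already milliseconds
        ServerKind.ApacheMicroseconds => rawValue / 1000.0, // Apache %D: microseconds
        ServerKind.NginxSeconds       => rawValue * 1000.0, // Nginx $request_time: seconds
        _ => throw new ArgumentOutOfRangeException(nameof(kind))
    };
}
```

For example, a raw 1245 from IIS stays 1245 ms, while the same number read from Apache's %D field would normalize to roughly 1.2 ms.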
When parsing IIS7 logs in W3C Extended Format, developers often encounter confusion regarding the time-taken
field's measurement unit. This discrepancy stems from conflicting documentation sources.
Microsoft's official documentation clearly states:
The length of time that the action took, in milliseconds.
(Source: Microsoft TechNet)
Whereas the W3C consortium's original specification indicates:
Time taken for transaction to complete in seconds
(Source: W3C Logfile Format Working Draft)
Through empirical testing with IIS7.5, we can confirm Microsoft's implementation uses milliseconds. Here's a sample log entry:
```
2023-05-15 14:22:10 W3SVC1 GET /test.aspx 200 0 0 1245
# time-taken value: 1245 ms (1.245 seconds)
```
When processing these logs programmatically, always treat values as milliseconds. Here's a C# parsing example:
```csharp
// Returns time-taken in seconds. Index 8 matches the sample entry above;
// in general, the field's position depends on the log's #Fields directive.
public double ParseTimeTaken(string logLine)
{
    var parts = logLine.Split(' ');
    if (parts.Length > 8)
    {
        // Raw value is milliseconds; divide by 1000 to report seconds.
        return double.Parse(parts[8], System.Globalization.CultureInfo.InvariantCulture) / 1000.0;
    }
    throw new FormatException("Invalid W3C log format");
}
```
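A quick usage sketch against the sample entry shown earlier (the variable name is illustrative):

```csharp
// Hypothetical call using the sample log line from above.
var line = "2023-05-15 14:22:10 W3SVC1 GET /test.aspx 200 0 0 1245";
double seconds = ParseTimeTaken(line);
Console.WriteLine($"{seconds:F3} s"); // 1.245 s (formatting follows the current culture)
```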
For accurate performance metrics:
- When comparing with other systems using seconds, convert values
- For millisecond precision calculations, use raw values
- Aggregate values should stay in one consistent unit (the sketch below aggregates in milliseconds)
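As a sketch of the last point, where timesMs stands for any collection of raw time-taken values already parsed as milliseconds (the sample numbers are illustrative):

```csharp
using System;
using System.Linq;

// Keep aggregates in milliseconds and convert only for display, so averages
// and percentiles stay directly comparable. Sample values are illustrative.
double[] timesMs = { 46, 120, 380, 1245, 2875 };

double averageMs = timesMs.Average();
double p95Ms = timesMs.OrderBy(v => v)
                      .ElementAt((int)Math.Ceiling(timesMs.Length * 0.95) - 1); // nearest-rank p95

Console.WriteLine($"avg = {averageMs:F1} ms ({averageMs / 1000.0:F3} s), p95 = {p95Ms:F0} ms");
```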
Some log analysis tools can do the conversion directly in the query. For example, with Microsoft Log Parser (run as `LogParser.exe -i:IISW3C "<query>"`; note that Log Parser's SQL dialect uses the DIV() function rather than the / operator):

```sql
SELECT
    TO_TIMESTAMP(date, time) AS timestamp,
    cs-uri-stem AS request,
    DIV(TO_REAL(time-taken), 1000.0) AS seconds
FROM ex*.log
```