Inaccurate performance counter timer values in Windows Performance Monitor

Posted by krisg on Stack Overflow
Published on 2010-03-17T02:38:30Z Indexed on 2010/03/17 2:41 UTC
Hit count: 685

I am implementing instrumentation within an application and have hit an issue where the value displayed in Windows Performance Monitor for a PerformanceCounter does not match the value that was recorded.

I am using a Stopwatch to record the duration of a method execution. First I record the total milliseconds as a double, and then I pass the Stopwatch's TimeSpan.Ticks to the PerformanceCounter to be recorded in Performance Monitor.

Creating the Performance Counters in perfmon:

var datas = new CounterCreationDataCollection();
datas.Add(new CounterCreationData 
{
    CounterName = name, 
    CounterType = PerformanceCounterType.AverageTimer32
});

datas.Add(new CounterCreationData 
{
    CounterName = namebase, 
    CounterType = PerformanceCounterType.AverageBase
});

PerformanceCounterCategory.Create("Category", "performance data",
    PerformanceCounterCategoryType.SingleInstance, datas);

Then, to record a timing, I retrieve a pre-initialized counter from a collection and increment it:

_counters[counter].IncrementBy(timing);
_counters[counterbase].Increment();

...where "timing" is the Stopwatch's TimeSpan.Ticks value.
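The measurement path described above can be sketched roughly as follows (the `_counters` dictionary, `_durationsMs` list, and `DoWork` are assumptions pieced together from the snippets, not the full source):

```csharp
// Sketch of the measurement path described in the question.
// _counters, _durationsMs, counter, counterbase and DoWork are assumed names.
var sw = Stopwatch.StartNew();
DoWork();                                       // the instrumented method (placeholder)
sw.Stop();

_durationsMs.Add(sw.Elapsed.TotalMilliseconds); // recorded as a double

// TimeSpan.Ticks are 100 ns units, which is not necessarily the same unit
// as Stopwatch.ElapsedTicks (raw high-resolution counter ticks).
_counters[counter].IncrementBy(sw.Elapsed.Ticks);
_counters[counterbase].Increment();
```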

When this runs, the collection of doubles (the Stopwatch's elapsed milliseconds) shows one set of values, but what appears in PerfMon is a different set of values.

For example, two values recorded in the list of milliseconds are:

23322.675, 14230.614

And what appears in PerfMon graph are:

15.546, 9.930
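For context, the documented formula for AverageTimer32 is ((N1 - N0) / F) / (B1 - B0), where F is the system performance-counter frequency, so the counter interprets its increments as raw Stopwatch ticks rather than TimeSpan.Ticks (fixed 100 ns units). If that unit mismatch is the cause here (an assumption, not verified against this machine), converting before incrementing would look like:

```csharp
// Assumed fix sketch: AverageTimer32 divides the accumulated value by
// Stopwatch.Frequency, so it expects raw high-resolution ticks.
// TimeSpan.TicksPerSecond is always 10,000,000; Stopwatch.Frequency varies.
long rawTicks = sw.Elapsed.Ticks * Stopwatch.Frequency / TimeSpan.TicksPerSecond;
_counters[counter].IncrementBy(rawTicks);
_counters[counterbase].Increment();

// Equivalently, sw.ElapsedTicks could be passed directly, since it is
// already in raw performance-counter units.
```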

Can someone explain this please?
