How do you measure latency in low-latency environments?

Posted by Ajaxx on Stack Overflow
Published on 2009-08-05T21:28:01Z

Here's the setup: your system receives a stream of data containing discrete messages (usually 32-128 bytes each). As part of your processing pipeline, each message passes through two physically separate applications that exchange the data over a low-latency transport (such as messaging over UDP, or RDMA), and the result is finally delivered to a client via the same mechanism.

Assuming you can inject yourself at any level, including wire-protocol analysis, what tools and/or techniques would you use to measure the latency of your system? As part of this, I'm assuming that every message delivered to the system results in a corresponding (though not equivalent) message being pushed through the system and delivered to the client.
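Since you can inject yourself at any level, one instrumentation-based approach is to timestamp each message at every measurement point and correlate by message ID afterward. Here's a minimal sketch of that idea (the `LatencyTracker` class and point names are hypothetical, not anything from the post):

```python
import time
import statistics
from collections import defaultdict

class LatencyTracker:
    """Hypothetical correlator: record a timestamp for each message ID at
    each measurement point, then compute latency between two points."""

    def __init__(self):
        self.timestamps = defaultdict(dict)  # msg_id -> {point: t_ns}

    def record(self, msg_id, point):
        # monotonic_ns avoids wall-clock adjustments; note this is only
        # meaningful for points on the same host (cross-host requires
        # synchronized clocks, e.g. PTP, or hardware capture timestamps)
        self.timestamps[msg_id][point] = time.monotonic_ns()

    def latencies_us(self, start_point, end_point):
        """Microsecond latencies for every message seen at both points."""
        out = []
        for points in self.timestamps.values():
            if start_point in points and end_point in points:
                out.append((points[end_point] - points[start_point]) / 1000.0)
        return out

tracker = LatencyTracker()
tracker.record("msg-1", "ingress")
# ... message passes through the pipeline ...
tracker.record("msg-1", "egress")

lat = tracker.latencies_us("ingress", "egress")
print(f"p50 = {statistics.median(lat):.1f} us over {len(lat)} messages")
```

In a real low-latency system you would keep the `record` path allocation-free and defer the correlation to an offline step, since the measurement itself must not perturb the latencies being measured.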

The only tool I've seen on the market for this is TS-Associates' TipOff. I'm sure that with the right access you could measure the same information using a wire-analysis tool (à la Wireshark) and the right dissectors, but is that the right approach, or are there commodity solutions I can use?
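For the wire-analysis route, the raw material is just the capture timestamps: if you capture at taps on both sides of the system and can match the same logical message in both captures (say, by a sequence number in the payload), latency is the timestamp difference. As a sketch of how little machinery that needs, here is a minimal reader for classic microsecond-resolution libpcap files (assumed format; nanosecond and pcapng variants would need more handling):

```python
import struct

def pcap_packet_timestamps(path):
    """Yield (timestamp_seconds, payload_bytes) for each packet in a
    classic libpcap file with microsecond timestamp resolution."""
    with open(path, "rb") as f:
        header = f.read(24)  # global header: magic + version + link info
        magic = struct.unpack("<I", header[:4])[0]
        # magic read back as 0xA1B2C3D4 means the file matches our
        # byte order; otherwise the fields are byte-swapped
        endian = "<" if magic == 0xA1B2C3D4 else ">"
        while True:
            ph = f.read(16)  # per-packet header
            if len(ph) < 16:
                break
            ts_sec, ts_usec, incl_len, _orig_len = struct.unpack(
                endian + "IIII", ph)
            payload = f.read(incl_len)
            yield ts_sec + ts_usec / 1e6, payload
```

The catch, as the question implies, is the correlation step: the output message is not byte-identical to the input, so you need a dissector that extracts a stable key from each side. Also note that software capture timestamps include kernel/driver jitter; for serious measurement you'd want hardware-timestamping NICs or a dedicated capture appliance, which is essentially what the commercial tools package up.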

