What characteristic of networking/TCP causes a linear relation between TCP activity and latency?

Posted by DeLongey on Server Fault. Published on 2012-06-03.

The core of the problem: our application uses websockets for its real-time interfaces. We are testing the app in a new environment and, strangely, we are seeing the delay in delivering TCP websocket packets grow as websocket activity increases.

For example, if a single websocket event occurs with no other activity in a one-minute window, the server's response arrives effectively instantly. However, as we gradually increase client activity, the latency of server responses grows with a roughly linear relationship: the more activity there is, the longer each packet takes to reach the client.
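
To reproduce the pattern outside the application itself, a rough client-side probe along these lines could ramp the message rate and record round-trip times. This is only a sketch: the URI is a hypothetical echo endpoint, and the rates and durations are illustrative, not taken from our setup.

```python
# Rough latency probe: ramp the message rate and record round-trip time.
# Assumes a hypothetical echo endpoint; uses the third-party "websockets"
# library (pip install websockets).
import asyncio
import time
import websockets

async def probe(uri, rate_hz, duration_s):
    """Send small messages at rate_hz for duration_s; return mean RTT in seconds."""
    samples = []
    async with websockets.connect(uri) as ws:
        deadline = time.perf_counter() + duration_s
        while time.perf_counter() < deadline:
            t0 = time.perf_counter()
            await ws.send("ping")
            await ws.recv()                      # assumes the server echoes a reply
            samples.append(time.perf_counter() - t0)
            await asyncio.sleep(1.0 / rate_hz)   # pace the load
    return sum(samples) / len(samples)

async def main():
    # If the mean RTT grows roughly in step with the rate, the linear
    # relationship described above is reproduced outside the application.
    for rate in (0.5, 1, 5, 10, 25, 50):
        rtt = await probe("ws://example-backend:8080/ws", rate, 10.0)
        print(f"{rate:>5} msg/s -> mean RTT {rtt * 1000:.1f} ms")

asyncio.run(main())
```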

For those wondering, this is NOT app-related: our logs show the server is healthy and responding to requests in under 100 ms, as expected. The delay appears only after the server has processed the request, built the TCP packet, and sent it toward the client (not on the way in).
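
One way to pin the delay to the send-to-receive leg is to stamp each outgoing message with the server's wall-clock send time and subtract on the client. A minimal sketch follows; it assumes both hosts are NTP-synced, otherwise only the trend as activity increases is meaningful.

```python
# Sketch: stamp outgoing messages with the server's send time so the client can
# measure the send-to-receive gap directly (assumes NTP-synced clocks; if the
# clocks drift, compare only the trend as activity rises).
import json
import time

def wrap_payload(body):
    """Server side: attach a wall-clock send timestamp just before the socket write."""
    return json.dumps({"sent_at": time.time(), "body": body})

def transit_ms(raw):
    """Client side: milliseconds the message spent between server send and client receive."""
    msg = json.loads(raw)
    return (time.time() - msg["sent_at"]) * 1000.0
```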

Architecture

This new environment runs behind a virtual IP address and uses keepalived on a load balancer to distribute traffic between instances. Two boxes sit behind the balancer, and all traffic runs through it. Our hosting provider manages the balancer, and we do not have control over that part of the architecture.
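
Since we cannot inspect the balancer, a quick check is to run the same timed round trip against the virtual IP and against each box directly. The sketch below assumes direct access to the backends is possible; the addresses and port are placeholders, not our real ones.

```python
# Sketch: compare a timed round trip via the load balancer's virtual IP with
# direct connections to each backend. Addresses and port are placeholders.
import asyncio
import time
import websockets  # pip install websockets

async def one_rtt(uri):
    async with websockets.connect(uri) as ws:
        t0 = time.perf_counter()
        await ws.send("ping")
        await ws.recv()              # assumes the endpoint echoes a reply
        return time.perf_counter() - t0

async def main():
    targets = {
        "via VIP   ": "ws://203.0.113.10:8080/ws",  # balanced path (keepalived VIP)
        "backend #1": "ws://10.0.0.11:8080/ws",     # direct, bypassing the balancer
        "backend #2": "ws://10.0.0.12:8080/ws",
    }
    for name, uri in targets.items():
        print(name, f"{await one_rtt(uri) * 1000:.1f} ms")

asyncio.run(main())
```

If the extra delay only appears under load when going through the VIP, the buffering is happening on the balancer rather than on our boxes.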

Theory

Could this somehow be related to something buffering the packets in the new environment?
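
One kernel-level buffer that is cheap to rule out on our own boxes is Nagle's algorithm, which coalesces small writes and can add delay that grows with the number of small packets in flight. A minimal sketch of disabling it on a plain server socket is below; how to reach the underlying socket from a websocket framework is framework-specific and not shown.

```python
# Sketch: disable Nagle's algorithm (TCP_NODELAY) on accepted connections so
# small segments are sent immediately instead of being coalesced. Reaching the
# underlying socket object from a websocket framework is assumed, not shown.
import socket

def disable_nagle(sock):
    """Tell the kernel not to buffer small writes while waiting for more data."""
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 8080))
srv.listen()
conn, addr = srv.accept()
disable_nagle(conn)   # apply per accepted connection, not the listening socket
```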

Thanks for your help.
