TCP/IP RST being sent differently in different browsers.

Posted by Brian on Stack Overflow
Published on 2010-03-19T08:10:08Z


On Mac OS X (10.6), if I start a YouTube video download and pull the Ethernet cable for 5 or so seconds, then plug it back in, I get varying results depending on the browser. With Opera and Chrome, after I plug the cable back in the video continues to load. But with Safari and Firefox, it never does.

Using Wireshark to look at the traffic, I found that Opera and Chrome simply ACK the first packet from YouTube after the cable has been plugged back in, but Safari and Firefox set the RST flag (0x4) in the TCP header and no more traffic follows.
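For reference, 0x4 is indeed the RST bit in the TCP flags byte (alongside FIN, SYN, PSH, and ACK). A minimal sketch of decoding that byte, the way Wireshark does when it labels a packet — the function name is mine, not anything from the capture:

```python
# TCP flag bits from the low byte of the flags field (RFC 793)
FIN, SYN, RST, PSH, ACK = 0x01, 0x02, 0x04, 0x08, 0x10

def flags_set(flags_byte):
    """Return the names of the flags set in a TCP flags byte."""
    names = [("FIN", FIN), ("SYN", SYN), ("RST", RST), ("PSH", PSH), ("ACK", ACK)]
    return [name for name, bit in names if flags_byte & bit]

print(flags_set(0x04))  # -> ['RST']        (a bare reset, as Safari/Firefox sent)
print(flags_set(0x14))  # -> ['RST', 'ACK'] (a reset acknowledging prior data)
print(flags_set(0x10))  # -> ['ACK']        (what Opera/Chrome sent instead)
```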

If I put a hub between the machine and the internet connection, the problem goes away: all four browsers continue loading the video when the cable is plugged back into the hub. Again, looking at the Wireshark logs, it's evident that in this case the machine never sees the Multicast connection close; there is simply a delay in the packets flowing through.

So it seems that if Safari and Firefox see a Multicast connection close, and then later see data on that same connection, they send an RST.
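As an illustration of what "sending an RST" looks like at the socket level (my own sketch, not necessarily how the browsers implement it): on most TCP stacks, closing a socket with `SO_LINGER` set to on with a zero timeout performs an abortive close, so the kernel emits an RST instead of the normal FIN handshake, and the peer's next read fails with a connection-reset error:

```python
import socket
import struct

# Loopback server/client pair to observe the reset locally
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)

client = socket.create_connection(server.getsockname())
conn, _ = server.accept()

# l_onoff=1, l_linger=0: abortive close -- the kernel sends RST, not FIN
client.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER, struct.pack("ii", 1, 0))
client.close()

try:
    conn.recv(1024)          # FIN would yield b"" here; RST raises instead
    peer_saw_reset = False
except ConnectionResetError:
    peer_saw_reset = True

print("peer saw RST:", peer_saw_reset)
conn.close()
server.close()
```

In a Wireshark capture of this exchange, the close shows up as a single segment with the RST flag (0x4) set, matching what Safari and Firefox were observed sending.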

My question is: why? What is the correct course of action, and why do two of the four browsers do it one way while the other two do it another? Is there somewhere in the source, of Firefox for instance, where I can see this happening?

Thank you very much.

© Stack Overflow or respective owner
