Search Results

Search found 301 results on 13 pages for 'fiddler'.


  • Is there a way to configure Fiddler to intercept HTTP calls from a Windows service?

    - by Mike Chess
    We're in the process of replacing an old (5+ years) Windows service application built with VS2005 that makes an HTTP GET call. There are several things that make this difficult (such as the web server is on the customer's network and we can't connect directly to it) and, unfortunately, we'd prefer not to take down the running system to replace it with a WinForm version that can be monitored by Fiddler. The new code appears to be doing everything correctly, but, alas, it is failing to authenticate. Is there a way to configure Fiddler (2.2.9.1) to intercept HTTP calls from a Windows service?
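    One approach worth sketching (assuming the service is .NET-based, since it was built with VS2005): Windows services run under a different account and ignore the logged-in user's WinINET proxy settings that Fiddler normally hooks, so the service has to be pointed at Fiddler explicitly. A minimal, hypothetical C# sketch of the idea; the URL is a placeholder:

        // Route the service's outbound HTTP through Fiddler explicitly.
        // Assumes Fiddler is listening on its default port 8888 on the same machine.
        using System;
        using System.IO;
        using System.Net;

        class ProxiedCall
        {
            static void Main()
            {
                // Services don't inherit the interactive user's proxy, so set one in code.
                // (The same thing can be done declaratively via <system.net><defaultProxy>
                // in the service's .config file.)
                WebRequest.DefaultWebProxy = new WebProxy("127.0.0.1", 8888);

                var request = (HttpWebRequest)WebRequest.Create("http://example.com/endpoint"); // placeholder
                request.Credentials = CredentialCache.DefaultCredentials; // keep whatever auth the service uses

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    Console.WriteLine(reader.ReadToEnd());
                }
            }
        }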

    Read the article

  • Using PHP cURL with an HTTP Debugging Proxy

    - by Kane
    I'm using the app "Fiddler" to debug a GET attempt to a website via PHP cURL. In order to see the cURL traffic I had to specify that the cURL connection use the Fiddler proxy (see code below). $ch = curl_init(); curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1); curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:8888'); curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); curl_setopt($ch, CURLOPT_TIMEOUT, 10); curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'read_header'); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_USERAGENT, $user_agent); curl_setopt($ch, CURLOPT_REFERER, "http://domain.com"); curl_setopt($ch, CURLOPT_HTTPHEADER, $headers); curl_setopt($ch, CURLOPT_COOKIEJAR, "my_cookies.txt"); curl_setopt($ch, CURLOPT_COOKIEFILE, "my_cookies.txt"); curl_setopt($ch, CURLOPT_URL, "http://domain.com"); $response = curl_exec($ch); But the problem is that in Fiddler I can only see this: Request (domain.com is just an alias): CONNECT domain.com:80 HTTP/1.1 Response: HTTP/1.1 200 Blind-Connection Established If I manually load the website in a browser Fiddler gives me WAY more information. I can see the cookies, the header information, and what I'm receiving via the GET. Any ideas why Fiddler can't see more useful information from PHP cURL? Edit: I tried turning on the "Enable HTTPS Decryption" option inside Tools / Fiddler Options / HTTPS (which I'm not sure why I'd need to use as I didn't tell cURL to use HTTPS). Unfortunately, by changing this setting I now get a Response of: HTTP/1.1 502 Connection failed Edit: If it helps, the app "Charles" shows me WAY more information than Fiddler, but I really want to figure out Fiddler since I like it better.

    Read the article

  • Looking for Fiddler2 help. Connection to gateway refused? Just got rid of a virus

    - by John Mackey
    I use Fiddler2 for Facebook game items, and it's been a great success. I accessed a website to download some .dat files I needed; I think it was eshare, ziddu or megaupload, one of those. Anyway, even before the .rar file had downloaded, I got this weird green shield in the bottom right-hand corner of my computer. It said a Trojan was trying to access my computer, or something to that extent, and it prompted me to click the shield to begin anti-virus scanning. It turns out this rogue program is called Antivirus System Pro and is pretty hard to get rid of.

    After discovering the rogue program, I tried using Fiddler and got the following error:

        [Fiddler] Connection to Gateway failed.
        Exception Text: No connection could be made because the target machine actively refused it 127.0.0.1:5555

    I ended up purchasing SpyDoctor + Antivirus, which I'm told is designed specifically for getting rid of these types of programs. Anyway, I did a quick scan last night with SpyDoctor and Malwarebytes. Malwarebytes picked up 2 files, and SpyDoctor found 4. Most were insignificant, but it did find a worm called Worm.Alcra.F, which was labeled high-priority. I don't know if that's the Antivirus System Pro or not, but SpyDoctor said it got rid of all of those successfully. I tried to run Fiddler again before leaving home, but was still getting the "gateway failed" error.

    I'm using the newest version of Firefox. When I initially set up Fiddler 2.2.8.6, I couldn't get it to run at first, so I found an FAQ on the internet that said I needed to go through Tools > Options > Settings and set up an HTTP proxy of 127.0.0.1 with the port set to 8888. Once I set that up and downloaded the Fiddler helper as a Firefox add-on, it worked fine. When I turn on Fiddler, it automatically switches my proxy setting from "no proxy" (the default) to 127.0.0.1 with port 8888. It worked fine until my computer detected this virus.

    Anyway, hopefully I've given you sufficient information to offer me your best advice here. Like I said, SpyDoctor says the bad stuff is gone, so maybe the rogue program made some type of change in my Fiddler settings that I could just reset or uncheck, or something like that? Or will I need to completely remove Fiddler along with those .dat and .rar files I downloaded? Any help would be greatly appreciated. Thanks for your time.

    Read the article

  • Having trouble getting "RTSP over HTTP"

    - by Muhammad Adeel Zahid
    There is an Axis camera that is connected to our site (camba.tv) through the Axis One-Click Connection Component (which acts as a proxy). We can communicate with this camera only over HTTP, by setting the proxy to our OCCC server's address. If we want to get RTSP streams (H.264) we are only left with the "RTSP over HTTP" option, for which I have followed section 3.3 of the Axis VAPIX 3 documentation. I issue requests through Fiddler but don't get any response. But when I put the URL (axrtsphttp://1.00408CBEA38B/axis-media/media.amp) in Windows Media Player (with the proxy set to the OCCC server 212.78.237.156:3128), the player is able to get the RTSP stream over HTTP after logging in. I captured a trace of the communication between the camera and Windows Media Player in Wireshark, and the request that brings the stream looks like this:

        http://1.00408cbea38b/axis-media/media.amp HTTP/1.1
        x-sessioncookie: 619
        User-Agent: Axis AMC
        Host: 1.00408CBEA38B
        Proxy-Connection: Keep-Alive
        Pragma: no-cache
        Authorization: Digest username="root", realm="AXIS_00408CBEA38B", nonce="000a8b40Y0100409c13ac7e6cceb069289041d8feb1691", uri="/axis-media/media.amp", cnonce="9946e2582bd590418c9b70e1b17956c7", nc=00000001, response="f3cab86fc84bfe33719675848e7fdc0a", qop="auth"

        HTTP/1.0 200 OK
        Content-Type: application/x-rtsp-tunnelled
        Date: Tue, 02 Nov 2010 11:45:23 GMT

        RTSP/1.0 200 OK
        CSeq: 1
        Content-Type: application/sdp
        Content-Base: rtsp://1.00408CBEA38B/axis-media/media.amp/
        Date: Tue, 02 Nov 2010 11:45:23 GMT
        Content-Length: 410

        v=0
        o=- 1288698323798001 1288698323798001 IN IP4 1.00408CBEA38B
        s=Media Presentation
        e=NONE
        c=IN IP4 0.0.0.0
        b=AS:50000
        t=0 0
        a=control:*
        a=range:npt=0.000000-
        m=video 0 RTP/AVP 96
        b=AS:50000
        a=framerate:30.0
        a=transform:1,0,0;0,1,0;0,0,1
        a=control:trackID=1
        a=rtpmap:96 H264/90000
        a=fmtp:96 packetization-mode=1; profile-level-id=420029; sprop-parameter-sets=Z0IAKeNQFAe2AtwEBAaQeJEV,aM48gA==

        RTSP/1.0 200 OK
        CSeq: 2
        Session: 3F4763D8; timeout=60
        Transport: RTP/AVP/TCP;unicast;interleaved=0-1;ssrc=060922C6;mode="PLAY"
        Date: Tue, 02 Nov 2010 11:45:24 GMT

        RTSP/1.0 200 OK
        CSeq: 3
        Session: 3F4763D8
        Range: npt=0-
        RTP-Info: url=rtsp://1.00408CBEA38B/axis-media/media.amp/trackID=1;seq=7392;rtptime=4190934902
        Date: Tue, 02 Nov 2010 11:45:24 GMT

        [Binary Stream Content]

    But when I copy this request to Fiddler, I only get a 200 status code with Content-Type set to application/x-rtsp-tunnelled, and there is no stream data. The only thing I do differently is to use Basic in the Authorization header instead of Digest, and I do not get a 401 (Unauthorized) status code. Can anyone explain what's happening here? How can I write request sequences to get the stream in Fiddler? If it is needed, I can upload the Wireshark request dump somewhere.
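    A rough, untested C# sketch of the HTTP GET leg of the tunnel, for comparison with the Fiddler attempt. The main difference from composing the request by hand is that CredentialCache performs the full Digest challenge/response, whereas a hand-written Basic header never matches what the camera accepted in the Wireshark trace. The password and the buffer handling are placeholders:

        // GET leg of an RTSP-over-HTTP tunnel (sketch; values taken from the question).
        using System;
        using System.IO;
        using System.Net;

        class RtspHttpGet
        {
            static void Main()
            {
                var uri = new Uri("http://1.00408CBEA38B/axis-media/media.amp");

                // Digest auth: .NET answers the camera's 401 challenge automatically.
                var creds = new CredentialCache();
                creds.Add(uri, "Digest", new NetworkCredential("root", "password")); // placeholder password

                var request = (HttpWebRequest)WebRequest.Create(uri);
                request.Credentials = creds;
                request.Proxy = new WebProxy("212.78.237.156", 3128); // the OCCC proxy
                request.Headers.Add("x-sessioncookie", "619");        // ties GET and POST legs together
                request.KeepAlive = true;

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var stream = response.GetResponseStream())
                {
                    // The tunnelled body never really ends: read RTSP/RTP bytes as they arrive.
                    var buffer = new byte[4096];
                    int read;
                    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                        Console.WriteLine("received {0} bytes", read);
                }
            }
        }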

    Read the article

  • How to reliably categorize HTTP sessions in proxy to corresponding browser' windows/tabs user is viewing?

    - by Jehonathan
    I was using the FiddlerCore .NET library as a local proxy to record user activity on the web. However, I ended up with a problem which seems dirty to solve. Take a web browser, say Google Chrome, where the user has opened 10 different tabs, each with a different URL. The proxy records all the HTTP sessions initiated by each page separately, leaving me to figure out with heuristics which tab the corresponding HTTP session belongs to. I understand that this is because of the stateless nature of the HTTP protocol. However, I am just wondering: is there an easy way to do this? I ended up with the C# code below; it is still not a reliable solution due to the heuristics. This is a modification of the sample project bundled with FiddlerCore for .NET 4. Basically what it does is filter HTTP sessions initiated in the last few seconds to find the first request, or a switch to another page, made by the same browser tab. It almost works, but it doesn't seem to be a universal solution.

        Fiddler.FiddlerApplication.AfterSessionComplete += delegate(Fiddler.Session oS)
        {
            // exclude other HTTP methods
            if (oS.oRequest.headers.HTTPMethod == "GET" || oS.oRequest.headers.HTTPMethod == "POST")
                // exclude other HTTP status codes
                if (oS.oResponse.headers.HTTPResponseStatus == "200 OK" ||
                    oS.oResponse.headers.HTTPResponseStatus == "304 Not Modified")
                {
                    // exclude other MIME responses (allow only text/html)
                    var accept = oS.oRequest.headers.FindAll("Accept");
                    if (accept != null)
                        if (accept.Count > 0)
                            if (accept[0].Value.Contains("text/html"))
                            {
                                // exclude AJAX
                                if (!oS.oRequest.headers.Exists("X-Requested-With"))
                                {
                                    // find the referer for this request
                                    var referer = oS.oRequest.headers.FindAll("Referer");
                                    if (referer != null)
                                    {
                                        if (referer.Count > 0)
                                        {
                                            // lock the sessions
                                            Monitor.Enter(oAllSessions);
                                            // filter further using the response
                                            if (oS.oResponse.MIMEType == string.Empty || oS.oResponse.MIMEType == "text/html")
                                                // get all previous sessions with the same process ID as this session
                                                if (oAllSessions.FindAll(a => a.LocalProcessID == oS.LocalProcessID)
                                                    // ...within the last second (assuming a new tab initiated
                                                    // multiple sessions besides the parent)
                                                    .FindAll(z => z.Timers.ClientBeginRequest > oS.Timers.ClientBeginRequest.AddSeconds(-1))
                                                    // ...on the same port and client IP as the current session
                                                    .FindAll(b => b.port == oS.port)
                                                    .FindAll(c => c.clientIP == oS.clientIP)
                                                    // ...with the same referrer URL as the current session
                                                    .FindAll(y => referer[0].Value.Equals(y.fullUrl))
                                                    // ...with the same host name as the current session
                                                    .FindAll(m => m.hostname == oS.hostname).Count == 0)
                                                    // if count == 0, this is the parent request
                                                    Console.WriteLine(oS.fullUrl);
                                            // unlock sessions
                                            Monitor.Exit(oAllSessions);
                                        }
                                        else
                                            // no referer: assume this is a new request and display it
                                            Console.WriteLine(oS.fullUrl);
                                    }
                                    else
                                        Console.WriteLine(oS.fullUrl);
                                    Console.WriteLine();
                                }
                            }
                }
        };

    Read the article

  • IE6 secure and nonsecure error - nothing in Fiddler/no iframes

    - by seengee
    Hi, I've got the issue of IE6 showing the "secure and nonsecure items" warning on an SSL page. Looking into it, though, none of the usual causes seem to apply: there are no calls to http://, there are no iframes in the page, and Fiddler and HttpFox both show only requests to https://. What else can I check? In Firefox and IE there is nothing to suggest there is mixed content at all.

    Read the article

  • Untraceable malicious browser calls

    - by MaximusOMaximus
    I installed Fiddler 4 Beta to do some HTTP tracing, and I found a lot of calls being made to sites like Facebook, CollegeHumor and a bunch of other sites I've never visited. I could not trace what or who is initiating these calls, as I do not see any corresponding Windows processes. No one else is connected to my network. I use both Google Chrome and IE10 on a Windows 7 box. Please help me trace and remove these malicious HTTP calls.

    Read the article

  • How can I monitor ports on Windows?

    - by Olav
    What is the simplest way on "local" (1*) Windows, for known ports, to:

    1. Find out if the port is in use.
    2. Find out as much as possible about what is behind the port.
    3. Find out as much as possible about traffic through the port.
    4. Find out if something else is interfering with the port and traffic to it.
    5. I think there is a tool closely integrated with Windows. Which one?

    I have used Fiddler in the past, but I think that's mostly HTTP? I don't know if Wireshark does more. I am looking at Nmap, but it seems to be more a suite of tools, with a high entry level.

    1*: Primarily this is about what happens inside my Windows machine, but if necessary I can, for example, use a VM or the wireless connection.
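    For question 1 (and partly 4), a minimal C# sketch using the built-in .NET NetworkInformation API, which reports roughly what netstat -an shows (netstat -ano adds the owning process ID, which this API does not expose):

        using System;
        using System.Net.NetworkInformation;

        class PortList
        {
            static void Main()
            {
                var props = IPGlobalProperties.GetIPGlobalProperties();

                Console.WriteLine("-- TCP listeners --");
                foreach (var ep in props.GetActiveTcpListeners())
                    Console.WriteLine(ep);

                Console.WriteLine("-- Active TCP connections --");
                foreach (var conn in props.GetActiveTcpConnections())
                    Console.WriteLine("{0} -> {1}  {2}",
                        conn.LocalEndPoint, conn.RemoteEndPoint, conn.State);

                Console.WriteLine("-- UDP listeners --");
                foreach (var ep in props.GetActiveUdpListeners())
                    Console.WriteLine(ep);
            }
        }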

    Read the article

  • How can I verify that javascript and images are being cached?

    - by BestPractices
    I want to verify that the images, CSS, and JavaScript files that are part of my page are being cached by my browser. I've used Fiddler and Google Page Speed, and it's unclear whether either is giving me the information I need. Fiddler shows the HTTP 304 response for images, CSS, and JavaScript, which should tell the browser to use the cached copy. Google Page Speed shows the 304 response but doesn't show a transfer size of zero; instead it shows the full file size of the resource. Note also that I have seen Google Page Speed report a 200 response but then put the word "(cache)" next to the 200 (so the status is "200 (cache)"), which doesn't make a lot of sense. Any other suggestions as to how I can verify whether the server is re-sending images, CSS, and JavaScript after they've been retrieved and cached by a previous page hit?
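    One browser-independent way to check this is to replay a conditional GET by hand and confirm the second response is a 304 with no body. A hedged C# sketch (the resource URL is a placeholder):

        using System;
        using System.Net;

        class CacheCheck
        {
            static void Main()
            {
                var url = "http://example.com/site.js"; // placeholder resource

                // First request: capture the validators the server hands out.
                string etag = null, lastModified = null;
                var first = (HttpWebRequest)WebRequest.Create(url);
                using (var response = (HttpWebResponse)first.GetResponse())
                {
                    etag = response.Headers[HttpResponseHeader.ETag];
                    lastModified = response.Headers[HttpResponseHeader.LastModified];
                    Console.WriteLine("first: {0}, Cache-Control: {1}",
                        (int)response.StatusCode,
                        response.Headers[HttpResponseHeader.CacheControl]);
                }

                // Second request: send the validators back as conditional headers.
                var second = (HttpWebRequest)WebRequest.Create(url);
                if (etag != null)
                    second.Headers[HttpRequestHeader.IfNoneMatch] = etag;
                if (lastModified != null)
                    second.IfModifiedSince = DateTime.Parse(lastModified);

                try
                {
                    using (var response = (HttpWebResponse)second.GetResponse())
                        Console.WriteLine("second: {0} (no 304, so a full re-download)",
                            (int)response.StatusCode);
                }
                catch (WebException ex) // HttpWebRequest surfaces 304 as an exception
                {
                    var response = (HttpWebResponse)ex.Response;
                    Console.WriteLine("second: {0} {1}",
                        (int)response.StatusCode, response.StatusDescription);
                }
            }
        }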

    Read the article

  • Fiddler2 to debug SL

    - by Nair
    I have a SL3 + RIA Services application, and I want to trace the calls made between the client and the server. Since I am debugging the application on localhost, I do not see any traffic in Fiddler. I tried http://localhost.:port/websitename/page.aspx and got "The requested URL could not be retrieved". If I remove the '.' between localhost and the port, my page shows up but there is no Fiddler capture. How would one go about seeing/capturing all the calls made between the client and the service on localhost? Thanks,

    Read the article

  • How Can I Determine if HTTP Requests/Responses are compressed in IE7?

    - by DTS
    I'm trying to use Fiddler (v2.2.2.0) to see if HTTP traffic through IE7 is being compressed. I'm not seeing Accept-Encoding or Content-Encoding request/response headers being sent/returned, and I do not need to decode the response data once it arrives, which leads me to believe that the responses are NOT coming back compressed. However, when making the same requests using Firefox 3.5.7, I could see through Firebug that Firefox was sending Accept-Encoding, and YSlow at least thought my data was coming back compressed. A comment in this question: http://stackoverflow.com/questions/897989/using-fiddler-to-check-iis-compression suggested that a proxy server may be to blame for stripping out headers and decompressing the content for security reasons. I use Verizon FiOS for my broadband at home and am now wondering if Verizon is proxying my HTTP traffic. In short, how can I positively confirm or deny that responses are coming back compressed through IE? Thanks.
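    One way to take IE and any proxy guesswork out of the equation: send the Accept-Encoding header yourself and look at what comes back. A small hedged C# sketch (the URL is a placeholder); if Content-Encoding is empty here too, the server or an intermediary really is serving the resource uncompressed:

        using System;
        using System.Net;

        class CompressionProbe
        {
            static void Main()
            {
                var request = (HttpWebRequest)WebRequest.Create("http://example.com/"); // placeholder

                // Advertise compression support exactly like a browser would.
                request.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate";

                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    // Non-empty ("gzip"/"deflate") means the wire format was compressed.
                    Console.WriteLine("Content-Encoding: '{0}'", response.ContentEncoding);
                    Console.WriteLine("Content-Length:   {0}", response.ContentLength);
                    // Note: in this setup the body stream is still the compressed bytes.
                }
            }
        }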

    Read the article

  • Iframe form not submitting in IE but working in Firefox

    - by Younes
    I have got a form that posts values to a page in a wizard. When I load this form in an iframe, everything works fine in Firefox: it takes me to the second step of the wizard and maintains the values I filled in. When I test this in Internet Explorer I do not get to the second step; instead it returns me to the first step of the wizard with all fields blank. When I check this in Fiddler I see that I'm getting a different response when posting the form in the iframe from Firefox compared to Internet Explorer. How can I make this work for all browsers? What am I doing wrong? This is what I get back from Fiddler (all four responses share Caching: no-store, no-cache, must-revalidate, post-check=0, pre-check=0 / Expires: Thu, 19 Nov 1981 08:52:00 GMT and Content-Type: text/html; charset=UTF-8):

        Firefox POST:  Result 302 | HTTP | www.dmg.eu | /brugman/budgetplanner/aanmelden.php | Body 0      | Process firefox:6116
        Firefox GET:   Result 200 | HTTP | www.dmg.eu | /brugman/budgetplanner/              | Body 40.677 | Process firefox:6116
        IE POST:       Result 302 | HTTP | www.dmg.eu | /brugman/budgetplanner/aanmelden.php | Body 0      | Process iexplore:536
        IE GET:        Result 302 | HTTP | www.dmg.eu | /brugman/budgetplanner/              | Body 0      | Process iexplore:536

    Hope someone knows what the difference is :).

    Read the article

  • WCF client throws SocketException if I'm also running a Fiddler

    - by user437291
    Hi, I'm trying to use Fiddler2 to inspect the SOAP messages exchanged between a WCF client and a WCF service (both client and service are running on the same machine). The problem is, whenever I use Fiddler2, the WCF client reports:

        EndpointNotFoundException: There was no endpoint listening at http://a-PC:8100 that could accept the message.
        --> System.Net.WebException: Unable to connect to the remote server
        --> System.Net.Sockets.SocketException: An attempt was made to access a socket in a way forbidden by its access permissions 127.0.0.1:8888

    Thank you
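    A common workaround (hedged sketch, with placeholder service names): give the client's binding an explicit proxy pointing at Fiddler, and avoid "localhost" in the endpoint address, since requests to localhost tend to bypass the proxy entirely:

        using System;
        using System.ServiceModel;

        class Client
        {
            static void Main()
            {
                var binding = new BasicHttpBinding();
                binding.UseDefaultWebProxy = false;                      // ignore IE/WinINET settings
                binding.ProxyAddress = new Uri("http://127.0.0.1:8888"); // Fiddler's default port
                binding.BypassProxyOnLocal = false;                      // proxy "local" addresses too

                // Machine name rather than localhost, so traffic goes through the proxy.
                var address = new EndpointAddress("http://a-PC:8100/MyService"); // placeholder path

                // var client = new MyServiceClient(binding, address); // your generated proxy class
                // ... call the service and inspect the SOAP in Fiddler ...
            }
        }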

    Read the article

  • How to view ASMX SOAP using Fiddler2?

    - by outer join
    Does anyone know if Fiddler can display the raw SOAP messages for ASMX web services? I'm testing a simple web service using both Fiddler2 and Storm, and the results vary (Fiddler shows plain XML while Storm shows the SOAP messages). See the sample requests/responses below.

    Fiddler2 Request:

        POST /webservice1.asmx/Test HTTP/1.1
        Accept: */*
        Referer: http://localhost.:4164/webservice1.asmx?op=Test
        Accept-Language: en-us
        User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2; MS-RTC LM 8)
        Content-Type: application/x-www-form-urlencoded
        Accept-Encoding: gzip, deflate
        Host: localhost.:4164
        Content-Length: 0
        Connection: Keep-Alive
        Pragma: no-cache

    Fiddler2 Response:

        HTTP/1.1 200 OK
        Server: ASP.NET Development Server/9.0.0.0
        Date: Thu, 21 Jan 2010 14:21:50 GMT
        X-AspNet-Version: 2.0.50727
        Cache-Control: private, max-age=0
        Content-Type: text/xml; charset=utf-8
        Content-Length: 96
        Connection: Close

        <?xml version="1.0" encoding="utf-8"?>
        <string xmlns="http://tempuri.org/">Hello World</string>

    Storm Request (body only):

        <?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                       xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                       xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
          <soap:Body>
            <Test xmlns="http://tempuri.org/" />
          </soap:Body>
        </soap:Envelope>

    Storm Response:

        Status Code: 200
        Content Length: 339
        Content Type: text/xml; charset=utf-8
        Server: ASP.NET Development Server/9.0.0.0
        Status Description: OK

        <?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                       xmlns:xsd="http://www.w3.org/2001/XMLSchema">
          <soap:Body>
            <TestResponse xmlns="http://tempuri.org/">
              <TestResult>Hello World</TestResult>
            </TestResponse>
          </soap:Body>
        </soap:Envelope>

    Thanks for any help.
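    The Fiddler capture above explains the difference: invoking the method from the browser test page uses the HTTP-POST binding (Content-Type application/x-www-form-urlencoded, empty body), so there is no SOAP on the wire for Fiddler to show. One way to put actual SOAP through Fiddler is to post an envelope by hand; a hedged C# sketch matching the capture's endpoint:

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class SoapPost
        {
            static void Main()
            {
                const string envelope =
                    "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
                    "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
                    "<soap:Body><Test xmlns=\"http://tempuri.org/\" /></soap:Body>" +
                    "</soap:Envelope>";

                var request = (HttpWebRequest)WebRequest.Create("http://localhost.:4164/webservice1.asmx");
                request.Method = "POST";
                request.ContentType = "text/xml; charset=utf-8";
                request.Headers.Add("SOAPAction", "\"http://tempuri.org/Test\""); // SOAP 1.1 action

                byte[] body = Encoding.UTF8.GetBytes(envelope);
                request.ContentLength = body.Length;
                using (var stream = request.GetRequestStream())
                    stream.Write(body, 0, body.Length);

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                    Console.WriteLine(reader.ReadToEnd()); // the SOAP response envelope
            }
        }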

    Read the article

  • Chrome sending packets to random destinations upon reconnect/disconnect

    - by f0x
    I noticed an interesting thing whilst debugging one of my WebSocket applications: Google Chrome will push out three HTTP requests when the network connection status changes (screenshots of the disconnect and reconnect captures omitted here). It's quite disconcerting, and looks almost as if some malware is checking in with a random server. I don't quite understand why, though, since they all return a 502 or have no response code at all, as the destinations do not exist. I guess the main question is: is this normal, and what is it for? How come they don't do a DNS lookup for something that actually exists?

    Read the article

  • Does Visual Studio Localhost ASP.NET debugging allow caching?

    - by Joel
    The title pretty much says it all, but here's the issue. I have a generic handler that returns JavaScript; the only caching-related code I put in is the following:

        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.Now.AddYears(1));
        context.Response.ContentType = "text/javascript";
        context.Response.Write("result");

    I'm debugging on localhost out of Visual Studio 2008 (that's "localhost." so Fiddler picks it up), and Fiddler2 sees the expires date but says that the cache header is set to private, and the page isn't being cached. Can anybody see what's going wrong here?
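    For comparison, a hedged variant of the handler that also emits an explicit max-age. ASP.NET's output cache policy can be downgraded (to private/no-cache) by other code in the pipeline but never upgraded, so if this still shows private, something else in the request path is resetting it:

        using System;
        using System.Web;

        public class ScriptHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                context.Response.Cache.SetCacheability(HttpCacheability.Public);
                context.Response.Cache.SetExpires(DateTime.Now.AddYears(1));
                context.Response.Cache.SetMaxAge(TimeSpan.FromDays(365)); // explicit Cache-Control: max-age
                context.Response.Cache.SetValidUntilExpires(true);        // ignore client revalidation headers
                context.Response.ContentType = "text/javascript";
                context.Response.Write("result");
            }
        }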

    Read the article

  • ASP.NET 4 UpdatePanel and IIS7 Problem

    - by rwponu
    I have an ASP.NET 4 web page that contains an UpdatePanel, which just allows me to add a few items to a drop-down list without reloading the entire page. The page works fine on the Visual Studio 2010 ASP.NET Development Server: it performs the async call and the page is properly laid out. However, when I deploy the page to IIS7, the async call no longer works (the page is completely reloaded) and the layout of some items on the page is incorrect. I used Fiddler to look at what's happening, and it looks like there are 404s when the page tries to access ScriptResource.axd, with everything else working correctly. I think that has to do with the JavaScript required for the call, but I'm not sure how to fix it. Any suggestions?

    Read the article

  • calling a wcf/soap method as an http get

    - by gleasonomicon
    Is there any way to enforce that a method call in SOAP-based WCF is made as an HTTP GET? I'm not sure if this would be handled on the client or the server side. We wanted to have the WCF call processed as a GET rather than a POST for cacheability, etc. I'm also not sure how to monitor a WCF service to determine whether calls are doing GETs or POSTs (or if it always does one or the other); can I use Fiddler for this? I would imagine I could use a RESTful WCF service to wrap the call, but I wasn't sure if there was a way to do it straight in a SOAP-based service.
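    Straight SOAP over basicHttpBinding always travels as a POST, so on the service side the usual route to a GET is WCF's web programming model. A hedged sketch with made-up service names (and yes, Fiddler shows the verb of every call, so it answers the monitoring question too):

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [ServiceContract]
        public interface IQuoteService
        {
            [OperationContract]
            [WebGet(UriTemplate = "quote/{symbol}")] // exposed as GET /quote/{symbol}
            string GetQuote(string symbol);
        }

        public class QuoteService : IQuoteService
        {
            public string GetQuote(string symbol) { return symbol + ": 42.00"; }
        }

        class Host
        {
            static void Main()
            {
                // WebServiceHost wires up webHttpBinding and the webHttp behavior.
                using (var host = new WebServiceHost(typeof(QuoteService),
                    new Uri("http://localhost:8000/")))
                {
                    host.Open();
                    Console.WriteLine("Try: GET http://localhost:8000/quote/MSFT (watch it in Fiddler)");
                    Console.ReadLine();
                }
            }
        }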

    Read the article

  • IE 6&7 Hanging When Opening New Window

    - by user310490
    I've got a real interesting situation. I have an existing web app that runs fine on a number of desktops. On a few desktops I see the following behavior: upon clicking a link that opens a new window (to another URL in the same domain), the IE window freezes and IE needs to be killed. This happens on IE 6 & 7. When using Fiddler I see NO traffic when clicking the link. When using IE HttpAnalyzer I see a request register but no response. If I change the MaxConnectionsPerServer registry setting to a higher value, e.g. 10, the problem goes away. Looking at netstat I don't see any abnormal connections. So I'm totally confused; the issue seems to be on the client side and related to IE not being able to make an additional socket connection to the server, but netstat doesn't show that. Ideas?

    Read the article

  • Linux-alternative to Fiddler2

    - by Epcylon
    I have used Fiddler2 with great results on Windows before, but now I have moved to using Linux for development. The problem is that I have not been able to find a decent replacement for Fiddler2 that will run on Linux. I have tried Wireshark, but it is perhaps too generic in what it does, and I can never really make any sense of its output. What tools do you use on Linux to debug/inspect web traffic during development?

    Read the article

  • West Wind WebSurge - an easy way to Load Test Web Applications

    - by Rick Strahl
    A few months ago on a project the subject of load testing came up. We were having some serious issues with a Web application that would start spewing SQL lock errors under somewhat heavy load. These sorts of errors can be tough to catch, precisely because they only occur under load and not during typical development testing. To replicate this error more reliably we needed to put a load on the application and run it for a while before these SQL errors would flare up.

    It's been a while since I'd looked at load testing tools, so I spent a bit of time looking at different tools and frankly didn't really find anything that was a good fit. A lot of tools were either a pain to use, didn't have the basic features I needed, or were extravagantly expensive. In the end I got frustrated enough to build an initially small custom load test solution that then morphed into a more generic library, then gained a console front end, and eventually turned into a full-blown Web load testing tool that is now called West Wind WebSurge.

    I got seriously frustrated looking for tools every time I needed some quick and dirty load testing for an application. If my aim is just to put an application under heavy enough load to find a scalability problem in code, or to simply try to push an application to its limits on the hardware it's running on, I shouldn't have to struggle to set up tests. It should be easy enough to get going in a few minutes, so that testing can be done on a regular basis without a lot of hassle. And that was the goal when I started to build out my initial custom load tester into a more widely usable tool.

    If you're in a hurry and you want to check it out, you can find more information and download links here:

        West Wind WebSurge Product Page
        Walk through Video
        Download link (zip)
        Install from Chocolatey
        Source on GitHub

    For a more detailed discussion of the whys and hows and some background, continue reading.

    How did I get here?

    When I started out on this path, I wasn't planning on building a tool like this myself – but I got frustrated enough looking at what's out there to think that I can do better than what's available for the most common simple load testing scenarios. When we ran into the SQL lock problems I mentioned, I started looking around at what's available for Web load testing solutions that would work for our whole team, which consisted of a few developers and a couple of IT guys, all of whom needed to be able to run the tests. It had been a while since I looked at tools and I figured that by now there should be some good solutions out there, but as it turns out I didn't really find anything that fit our relatively simple needs without costing an arm and a leg…

    I spent the better part of a day installing and trying various load testing tools, and to be frank most of them were either terrible at what they do, incredibly unfriendly to use, used some terminology I couldn't even parse, or were extremely expensive (and I mean in the 'sell your liver' range of expensive). Pick your poison.

    There are also a number of online solutions for load testing, and they actually looked more promising, but those wouldn't work well for our scenario, as the application is running inside of a private VPN with no outside access into the VPN. Most of those online solutions also ended up being very pricey – presumably because the bandwidth required to test over the open Web can be enormous.
    When I asked around on Twitter about what people were using, I got mostly… crickets. Several people mentioned Visual Studio Load Test, and most other suggestions pointed to online solutions. I did get a bunch of responses, though, with people asking to let them know what I found – apparently I'm not alone when it comes to finding load testing tools that are effective and easy to use.

    As to Visual Studio, the higher-end SKUs of Visual Studio and the test edition include a Web load testing tool, which is quite powerful, but there are a number of issues with that: First, it's tied to Visual Studio, so it's not very portable – you need a VS install. I also find the test setup and terminology used by the VS test runner extremely confusing. Heck, it's complicated enough that there's even a Pluralsight course on using the Visual Studio Web test from Steve Smith. And of course you need to have one of the high-end Visual Studio SKUs, and those are mucho dinero ($$$) – just for load testing that's rarely an option.

    Some of the tools are ultra extensive and let you run analysis tools on the target servers, which is useful, but in most cases just plain overkill that only distracts from what I tend to be ultimately interested in: reproducing problems that occur at high load, and finding the upper limits and 'what if' scenarios as load is ramped up increasingly against a site. Yes, it's useful to have Web app instrumentation, but often that's not what you're interested in.

    I still fondly remember the early days of Web testing when Microsoft had the WAST (Web Application Stress Tool), which was rather simple – and also somewhat limited – but easily allowed you to create stress tests very quickly. It had some serious limitations (mainly that it didn't work with SSL), but the idea behind it was excellent: create tests quickly and easily, and provide a decent engine to run them locally with minimal setup. You could get set up and run tests within a few minutes. Unfortunately, that tool died a quiet death, as have so many of Microsoft's tools that probably were built by an intern and then abandoned, even though there was a lot of potential and it was actually fairly widely used. Eventually the tool was no longer downloadable, and now it simply doesn't work anymore on higher-end hardware.

    West Wind WebSurge – Making Load Testing Quick and Easy

    So I ended up creating West Wind WebSurge out of rebellious frustration… The goal of WebSurge is to make it drop dead simple to create load tests. It's super easy to capture sessions, either using the built-in capture tool (big props to Eric Lawrence, Telerik and FiddlerCore, which made that piece a snap), using the full version of Fiddler and exporting sessions, or by manually or programmatically creating text files based on plain HTTP headers to create requests.

    I've been using this tool for 4 months now on a regular basis on various projects as a reality check for performance and scalability, and it's worked extremely well for finding small performance issues. I also use it regularly as a simple URL tester, as it allows me to quickly enter a URL plus headers and content, test that URL and its results, and easily save one or more of those URLs.

    A few weeks back I made a walk-through video that goes over most of the features of WebSurge in some detail (see the Walk through Video link above). Note that the UI has slightly changed since then, so there are some UI improvements.
    Most notably, the test results screen has been updated recently to a different layout and to provide more information about each URL in a session at a glance. The video and the main WebSurge site have a lot of info on basic operations. For the rest of this post I'll talk about a few deeper aspects that may be of interest, while also giving a glance at how WebSurge works.

    Session Capturing

    As you would expect, WebSurge works with Sessions of URLs that are played back under load. You can create session entries manually, by individually adding URLs to test (on the Request tab on the right) and saving them, or you can capture output from Web browsers, Windows desktop applications that call services, or your own applications, using the built-in Capture tool. With this tool you can capture anything HTTP – SSL requests and content from Web pages, AJAX calls, SOAP or REST services – again, anything that uses Windows or .NET HTTP APIs. Behind the scenes the capture tool uses FiddlerCore, so basically anything you can capture with Fiddler you can also capture with the WebSurge Session capture tool.

    Alternately you can use Fiddler itself, and then export the captured Fiddler trace to a file, which can then be imported into WebSurge. This is a nice way to let somebody capture a session without having to actually install WebSurge, or for your customers to provide an exact playback scenario for a given set of URLs that cause a problem, perhaps. Note that not all applications work with Fiddler's proxy unless you configure a proxy. For example, .NET Web applications that make HTTP calls usually don't show up in Fiddler by default. For those .NET applications you can explicitly override proxy settings to capture those requests to service calls.

    The capture tool also has handy optional filters that allow you to filter by domain, to help block out noise that you typically don't want to include in your requests. For example, if your pages include links to CDNs, Google Analytics or social links, you typically don't want to include those in your load test, so by capturing just from a specific domain you are guaranteed content from only that one domain. Additionally you can provide URL filters in the configuration file – filter strings that, if contained in a URL, will cause requests to be ignored. Again, this is useful if you don't filter by domain but you want to filter out things like static image, CSS and script files. Often you're not interested in the load characteristics of these static, usually cached resources, as they just add noise to tests and often skew the overall URL performance results. In my testing I tend to care only about my dynamic requests.

    SSL Captures require Fiddler

    Note that in order to capture SSL requests you'll have to install Fiddler's SSL certificate. The easiest way to do this is to install Fiddler and use its SSL configuration options to get the certificate into the local certificate store. There's a document on the Telerik site that provides the exact steps to get SSL captures to work with Fiddler, and therefore with WebSurge.

    Session Storage

    A group of URLs entered or captured makes up a Session. Sessions can be saved and restored easily, as they use a very simple text format that is simply stored on disk. The format is slightly customized HTTP header traces separated by a separator line.
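    As a rough illustration, here is a sketch of a reader for a file in that shape. The post doesn't spell out the separator text, so the "----" here is a guess, as is the file name; the first line of each block is assumed to carry the full URL, as described next:

        using System;
        using System.Collections.Generic;
        using System.IO;

        class SessionReader
        {
            // Yields each request as its block of header lines.
            static IEnumerable<string[]> ReadBlocks(string path, string separator)
            {
                var block = new List<string>();
                foreach (var line in File.ReadAllLines(path))
                {
                    if (line.StartsWith(separator))
                    {
                        if (block.Count > 0) yield return block.ToArray();
                        block.Clear();
                    }
                    else if (line.Length > 0 || block.Count > 0)
                        block.Add(line);
                }
                if (block.Count > 0) yield return block.ToArray();
            }

            static void Main()
            {
                foreach (var request in ReadBlocks("session.websurge", "----")) // names are guesses
                {
                    var parts = request[0].Split(' '); // e.g. "GET http://host/path HTTP/1.1"
                    Console.WriteLine("{0} {1}", parts[0], parts[1]);
                }
            }
        }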
    The headers are standard HTTP headers, except that the full URL instead of just the domain-relative path is stored as part of the first HTTP header line, for easier parsing. Because it's just text and uses the same format that Fiddler uses for exports, it's super easy to create Sessions by hand or under program control by writing out a simple text file. The raw captured format shown in the Capture window figure above is also what's stored to disk and what WebSurge parses from. The only 'custom' part of these headers is that the first line contains the full URL instead of the domain-relative path and Host: header; the rest of each block is plain standard HTTP headers, with each individual URL isolated by a separator line. Since the format matches what Fiddler produces for exports, it's easy to exchange or view data either in Fiddler or WebSurge. URLs can also be edited interactively, so you can modify the headers easily as well; again, it's just plain HTTP headers, so anything you can do with HTTP can be added here.

    Use it for single URL Testing

    Incidentally, I've also found this form an excellent way to test and replay individual URLs for simple non-load-testing purposes. Because you can capture a single URL or many URLs and store them on disk, this also provides a nice HTTP playground where you can record URLs with their headers, fire them one at a time or as a session, and see results immediately. It's actually handy for REST presentations, and I find the simple UI flow easier than using Fiddler natively. Finally, you can save one or more URLs as a session for later retrieval. I'm using this more and more for simple URL checks.

    Overriding Cookies and Domains

    Speaking of HTTP headers – you can also overwrite cookies as part of the options. One thing that happens with modern Web applications is that you have session cookies in use for authorization. These cookies tend to expire at some point, which would invalidate a test. Using the Options dialog you can override the cookie, which replaces the cookie for all requests with the value specified there. You can capture a valid cookie from a manual HTTP request in your browser and then paste it into the cookie field, to replace the existing cookie with a new one that is now valid. Likewise you can easily replace the domain, so if you captured URLs on west-wind.com and now you want to test on localhost, you can do that easily as well. You could even do something like capture on store.west-wind.com and then test on localhost/store, which would also work.

    Running Load Tests

    Once you've created a Session you can specify the length of the test in seconds, and specify the number of simultaneous threads to run each session on. Sessions run through each of the URLs in the session sequentially by default. One option in the options list above is that you can also randomize the URLs, so each thread runs requests in a different order. This avoids bunching up URLs initially when tests start, as all threads would otherwise run the same requests simultaneously, which can sometimes skew the results of the first few minutes of a test.

    While sessions run, some progress information is displayed. By default there's a live view of requests displayed in a console-like window, and at the bottom of the window there's a running summary that shows where you're at in the test, how many requests have been processed, and what the current requests-per-second count is for all requests.
    Note that for tests that run over a thousand requests a second, it's a good idea to turn off the console display. While the console display is nice to see that something is happening, and also gives you a slight idea of what's happening with actual requests, once a lot of requests are processed this UI updating actually adds a lot of CPU overhead to the application, which may cause the actual load generated to be reduced. If you are running a thousand requests a second there's not much to see anyway, as requests roll by way too fast to see individual lines. If you look on the options panel, there is a NoProgressEvents option that disables the console display. Note that the summary display is still updated approximately once a second, so you can always tell that the test is still running.

    Test Results

    When the test is done you get a simple results display. On the right you get an overall summary as well as a breakdown by each URL in the session. Both successes and failures are highlighted, so it's easy to see what's breaking in your load test. The report can be printed, or you can open the HTML document in your default Web browser for printing to PDF or saving the HTML document to disk.

    The list on the right shows you a partial list of the URLs that were fired, so you can look in detail at the request and response data. The list can be filtered by success and failure requests. Each list is partial only (at the moment) and limited to a max of 1000 items in order to render reasonably quickly. Each item in the list can be clicked to see the full request and response data. This is particularly useful for errors, so you can quickly see and copy what request data was used, and in the case of a GET request you can also just click the link to quickly jump to the page. For non-GET requests you can find the URL in the Session list and use the context menu to test the URL as configured, including any HTTP content data to send. You get to see the full HTTP request and response, as well as a link in the request header to go visit the actual page. Not so useful for a POST as above, but definitely useful for GET requests.

    Finally you can also get a few charts. The most useful one is probably the Requests per Second chart, which can be accessed from the Charts menu or shortcut. Results can also be exported to JSON, XML and HTML. Keep in mind that these files can get very large rather quickly, so exports can end up taking a while to complete.

    Command Line Interface

    WebSurge runs with a small core load engine, and this engine is plugged into the front-end application I've shown so far. There's also a command line interface available to run WebSurge from the Windows command prompt. Using the command line you can run tests for either an individual URL (similar to ab.exe, for example) or a full Session file. By default, when it runs, WebSurgeCli shows progress every second, displaying total request count, failures and the requests per second for the entire test. A silent option can turn off this progress display and show only the results.

    The command line interface can be useful for build integration, which allows checking for failures, or perhaps hitting a specific requests-per-second count. It's also nice to use as a quick and dirty URL test facility, similar to the way you'd use Apache Bench (ab.exe). Unlike ab.exe though, WebSurgeCli supports SSL and makes it much easier to create multi-URL tests, using either manual editing or the WebSurge UI.
    Current Status

    Currently West Wind WebSurge is still in beta status. I'm still adding small new features and tweaking the UI in an attempt to make it as easy and self-explanatory as possible to run. Documentation for the UI and specialty features is also still a work in progress.

    I plan on open-sourcing this product, but it won't be free. There's a free version available that provides a limited number of threads and request URLs to run. A relatively low-cost license removes the thread and request limitations. Pricing info can be found on the Web site – there's an introductory price, which is $99 at the moment, which I think is reasonable compared to most other for-pay solutions out there that are exorbitant by comparison…

    The reason the code is not available yet is – well, the UI portion of the app is a bit embarrassing in its current monolithic state. The UI started as a very simple interface originally that later got a lot more complex – yeah, that never happens, right? Unless there's a lot of interest I don't foresee rewriting the UI entirely (which would be ideal), but in the meantime at least some cleanup is required before I dare to publish it :-). The code will likely be released with version 1.0.

    I'm very interested in feedback. Do you think this could be useful to you, and provide value over other tools you may or may not have used before? I hope so – it has already provided a ton of value for me and the work I do, which made the development worthwhile at this point. You can leave a comment below, or for more extensive discussions you can post a message on the West Wind Message Board in the WebSurge section.

    Microsoft MVPs and Insiders get a free License

    If you're a Microsoft MVP or a Microsoft Insider you can get a full license for free. Send me a link to your current, official Microsoft profile and I'll send you a not-for-resale license. Send any messages to [email protected].

    Resources

    For more info on WebSurge, and to download it to try it out, use the following links.

        West Wind WebSurge Home
        Download West Wind WebSurge
        Getting Started with West Wind WebSurge Video

    © Rick Strahl, West Wind Technologies, 2005-2014
    Posted in ASP.NET

    Read the article

  • How to write a program that mimics Fiddler by using tcpdump or from scratch?

    - by ????
    Since Fiddler is not available on Mac OS X or Ubuntu, and if we don't install/use Wireshark or any other heavier-duty tools, what is a way to use tcpdump so that:

    1. It can print out

        GET /foo/bar HTTP/1.1
        [request content in RAW text]
        [response content in RAW text]
        POST /foo/... HTTP/1.1

    This should be doable with tcpdump alone, or by using tcpdump in a short shell script or Ruby/Python/Perl script.

    2. Actually, it would be neat if a script could output HTML, with

        GET /foo/bar HTTP/1.1
        POST /foo/... HTTP/1.1

    on the page, for any browser to display, and then when any of those lines is clicked, it expands to show the RAW content like (1) above does. Click again and it hides the details. The expansion UI can be done using jQuery or any JS library. The script may be short... possibly less than 20 lines? Does anybody know how to do it, either for (1) or (2)?
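    For (1), a hedged sketch of one approach, written in C# to match the other examples on this page (runnable via Mono on OS X/Ubuntu; the same pipeline works equally well from a shell or Ruby/Python/Perl script): run tcpdump in ASCII mode and echo the HTTP request lines as they appear. This only surfaces the request lines; grouping the raw request/response bodies between matches would take more work:

        // Needs root. tcpdump flags used: -l line-buffered, -A print payload
        // as ASCII, -s 0 capture full packets.
        using System;
        using System.Diagnostics;
        using System.Text.RegularExpressions;

        class HttpSniff
        {
            static void Main()
            {
                var psi = new ProcessStartInfo("tcpdump", "-l -A -s 0 \"tcp port 80\"")
                {
                    RedirectStandardOutput = true,
                    UseShellExecute = false
                };

                var requestLine = new Regex(@"^(GET|POST|PUT|DELETE|HEAD) \S+ HTTP/1\.[01]");

                using (var tcpdump = Process.Start(psi))
                {
                    string line;
                    while ((line = tcpdump.StandardOutput.ReadLine()) != null)
                        if (requestLine.IsMatch(line))
                            Console.WriteLine(line);
                }
            }
        }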

    Read the article
