Small performance test on a web service

Posted by vtortola on Stack Overflow
Published on 2010-04-09T19:05:53Z


I'm trying to develop a small application that tests how many requests per second my service can support, but I think I'm doing something wrong. The service is at an early development stage, but I'd like to have this test handy so that I can check from time to time that I'm not doing something that decreases performance. The problem is that I cannot get the web server or the database server to reach 100% CPU.

I'm using three different computers: one runs the web server (Windows Server 2008 Standard x64, IIS 7), another runs the database (Windows 2000, SQL Server 2005), and the last is my computer (Windows 7 x64 Ultimate), where I run the test. The computers are connected through a 100 Mbps Ethernet switch. The POST request body is 9 bytes and the response is 842 bytes.
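As a sanity check on those numbers (assuming the switch really is 100 Mbit/s, and ignoring TCP/IP and HTTP header overhead), a back-of-the-envelope calculation shows the wire itself is nowhere near the bottleneck at these payload sizes:

```csharp
using System;

class WireLimit
{
    static void Main()
    {
        // Payload sizes from the test; protocol header overhead is ignored,
        // so the real limit is somewhat lower than this estimate.
        double requestBytes = 9;
        double responseBytes = 842;
        double linkBitsPerSecond = 100000000.0; // assumed 100 Mbit/s switch

        // Upper bound on requests/second if the link were the only limit.
        double bitsPerExchange = (requestBytes + responseBytes) * 8;
        double maxRequestsPerSecond = linkBitsPerSecond / bitsPerExchange;
        Console.WriteLine("Theoretical wire limit: ~{0:F0} req/s", maxRequestsPerSecond);
    }
}
```

That works out to roughly 14,000-15,000 requests per second, so any plateau well below that is caused by something other than network bandwidth.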

The test launches several threads, each with a "while" loop; in each iteration it creates a WebRequest object, performs a call, increments a shared counter, waits between 1 and 5 milliseconds, and then does it all again:

    using System;
    using System.IO;
    using System.Net;
    using System.Text;
    using System.Threading;

    class Program
    {
        static Int32 counter = 0;

        static void Main(string[] args)
        {
            // Allow many concurrent connections per host; the default (2)
            // would throttle the test long before the servers are saturated.
            ServicePointManager.DefaultConnectionLimit = 250;

            Console.WriteLine("Ready. Press any key...");
            Console.ReadKey();

            String localhost = "localhost";
            String linuxmono = "";
            String server = "";

            DateTime start = DateTime.Now;

            Random r = new Random(DateTime.Now.Millisecond);
            for (int i = 0; i < 50; i++)
            {
                new Thread(new ParameterizedThreadStart(Test)).Start(server);
                Thread.Sleep(r.Next(1, 3));
            }

            while (true)
                Console.WriteLine("Requests per second: " + counter / DateTime.Now.Subtract(start).TotalSeconds);
        }

        public static void Test(Object ip)
        {
            Guid guid = Guid.NewGuid();

            Random r = new Random(DateTime.Now.Millisecond);
            while (true)
            {
                String test = "<lalala/>";
                WebRequest req = WebRequest.Create("http://" + (String)ip + "/WebApp/" + guid.ToString() + "/Data/Tables=whatever");
                req.Method = "POST";
                req.ContentType = "application/xml";
                req.Credentials = new NetworkCredential("aaa", "aaa", "domain");
                Byte[] array = Encoding.UTF8.GetBytes(test);
                req.ContentLength = array.Length;
                using (Stream reqStream = req.GetRequestStream())
                {
                    reqStream.Write(array, 0, array.Length);
                }

                using (Stream responseStream = req.GetResponse().GetResponseStream())
                {
                    String response = new StreamReader(responseStream).ReadToEnd();
                    if (response.Length != 842)
                        Console.Write(" EEEE ");
                }

                Interlocked.Increment(ref counter);

                // Wait between 1 and 5 milliseconds before the next request.
                Thread.Sleep(r.Next(1, 6));
            }
        }
    }

If I run the test, neither of the computers shows excessive CPU usage. Let's say I get X requests per second; if I run the console application twice at the same time, I get X/2 requests per second in each one... but the web server is still at 30% CPU, and the database server at 25%...

I've tried removing the Thread.Sleep in the loop, but it doesn't make a big difference.
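One detail worth checking here (an observation about Windows timer behavior, not something from the post itself): on a default Windows timer resolution of roughly 15.6 ms, Thread.Sleep(1) actually sleeps for a full timer tick, which by itself caps each thread at far fewer requests per second than a 1-5 ms pause suggests. A small sketch to measure the real sleep granularity on a given machine:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class SleepResolution
{
    static void Main()
    {
        // Measure how long 20 calls to Thread.Sleep(1) really take.
        // On a default Windows timer (~15.6 ms resolution) this is often
        // closer to 300 ms than to the 20 ms the arguments suggest.
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < 20; i++)
            Thread.Sleep(1);
        sw.Stop();
        Console.WriteLine("20 x Thread.Sleep(1) took {0} ms", sw.ElapsedMilliseconds);
    }
}
```

If the measured time is much larger than 20 ms, the per-thread pacing of the test is coarser than intended, which would also explain why removing the sleep changes little: 50 threads blocked on synchronous WebRequest calls are limited by latency and connection reuse, not by the sleep.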

I'd like to push the machines to the maximum, to check how many requests per second they can provide. I guessed that I could do it this way... but apparently I'm missing something here... What is the problem?

Kind regards.

