Why are 32-bit application pools more efficient in IIS? [closed]

Posted by mhenry1384 on Server Fault. Published on 2013-04-01T14:00:48Z.

I've been running load tests against two different ASP.NET web applications in IIS. The tests are run with 5, 10, 25, and 250 user agents, on a box with 8 GB RAM running Windows 7 Ultimate x64. The same box runs both IIS and the load test project.

I did many runs, and the data is very consistent. At every load level, I see a lower "Avg. Page Time (sec)" and a lower "Avg. Response Time (sec)" when "Enable 32-bit Applications" is set to True on the application pool. The difference becomes more pronounced as the load increases. At very high loads, the web applications start returning errors (503) when the application pools are 64-bit, but they can keep up when set to 32-bit.
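For reference, the setting I'm toggling can also be flipped from the command line with appcmd (the pool name "MyAppPool" below is just a placeholder for whichever pool you're testing):

```shell
# Enable 32-bit (WOW64) mode for an application pool named "MyAppPool".
# Set /enable32BitAppOnWin64:false to switch it back to 64-bit.
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /enable32BitAppOnWin64:true
```

This maps to the same `enable32BitAppOnWin64` property shown as "Enable 32-bit Applications" in the IIS Manager UI; recycle the pool after changing it so worker processes pick up the new bitness.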

Why are 32-bit app pools so much more efficient? Why isn't the default for application pools 32-bit?

