- As seen on Pro Webmasters:
 I have a question about how to write robots.txt files for many domains and subdomains with redirects in place.
We have a hosting account with a primary domain plus add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories…
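One pattern that fits this kind of setup (a sketch, assuming Apache with mod_rewrite in the account's root .htaccess; the host and directory names are hypothetical) is to answer robots.txt requests per host before any 301 rules fire, so each add-on domain publishes its own crawl rules:

```apache
RewriteEngine On

# Serve a per-host robots.txt *before* the 301 redirect rules,
# so each add-on domain exposes its own file.
# "addon-example.com" and its subdirectory are hypothetical names.
RewriteCond %{HTTP_HOST} ^(www\.)?addon-example\.com$ [NC]
RewriteRule ^robots\.txt$ /addon-example.com/robots.txt [L]
```

The [L] flag stops rule processing for that request, so it never reaches the redirect rules below it.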
 
- As seen on Server Fault:
 I have an Ubuntu 10.04 server where I installed mod_evasive using apt-get install libapache2-mod-evasive.
I have already tried several configurations; the result stays the same.
The blocking does work, but only randomly.
I tried low limits with long blocking periods as well as short ones.
The behaviour…
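For reference, a minimal mod_evasive configuration looks like the following (the values are illustrative, not a recommendation). One quirk that is commonly reported: mod_evasive keeps its hit table per Apache child process, so with a prefork MPM and many children each counter fills only sporadically, which can make blocking appear random.

```apache
# /etc/apache2/mods-available/mod-evasive.conf -- illustrative values only
<IfModule mod_evasive20.c>
    # Requests allowed for the same URI per page interval (seconds)
    DOSPageCount        5
    DOSPageInterval     1
    # Requests allowed for the whole site per site interval (seconds)
    DOSSiteCount        50
    DOSSiteInterval     1
    # How long an offending IP stays blocked, in seconds
    DOSBlockingPeriod   60
    DOSHashTableSize    3097
    DOSLogDir           /var/log/mod_evasive
</IfModule>
```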
 
- As seen on Stack Overflow:
 On the page http://qxlapps.dk/test.htm I am trying to perform an Ajax load from another domain, qxlapp.dk. I am using James Padolsey's xdomainajax.js plugin from:
http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
When I open my test page, I get no output, but Firebug shows…
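The plugin exists to work around the browser's same-origin policy. If you control the other domain, a simpler alternative (a sketch, assuming Apache with mod_headers enabled; the origin value is illustrative) is to send a CORS header so a plain jQuery $.ajax call is allowed to read the cross-domain response:

```apache
# On the other domain's server: let pages served from qxlapps.dk
# read responses. Requires mod_headers (a2enmod headers);
# the origin value below is illustrative.
<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "http://qxlapps.dk"
</IfModule>
```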
 
- As seen on Stack Overflow:
 Wondering if the following will work for Google in robots.txt:
Disallow: /*.action
I need to exclude all URLs ending with .action.
Is this correct?
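Google's crawler does support two extensions to the original robots.txt syntax: * matches any sequence of characters, and $ anchors the pattern to the end of the URL. Without the anchor, /*.action also matches URLs that merely contain .action somewhere after the prefix, so an anchored version is usually what is wanted:

```
User-agent: Googlebot
# * matches any characters; $ anchors the match at the end of the URL.
# Without the $, a URL like /foo.action?id=1 would also be blocked.
Disallow: /*.action$
```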
 
- As seen on Stack Overflow:
 Hi there,
I recently edited the robots.txt file on my site using a WordPress plugin. However, since I did this, Google seems to have removed my site from its search results. I'd appreciate an expert opinion on why this is so, and a possible solution. I'd initially done it to increase…
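Without seeing the edited file this is only a guess, but the classic cause of a site vanishing after a robots.txt edit is a blanket disallow, which some plugins write by default:

```
# This pair tells every crawler to fetch nothing at all:
User-agent: *
Disallow: /
```

An empty Disallow: value (or removing the rule entirely) allows full crawling again; Google Search Console's robots.txt report shows which rule blocks a given URL.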