How should I handle pages that move to a new URL with regard to search engines?
Posted by Anders Juul on Stack Overflow, published 2010-05-07.
Hi all,
I have done some refactoring on an ASP.NET MVC application that is already deployed to a live web site. Part of the refactoring involved moving functionality to a new controller, which caused some URLs to change. Shortly afterwards, the various search engine robots started hammering the old URLs.
What is the right way to handle this in general?
- Ignore it? In time the search engines should find out that they get nothing but 404s from the old URLs.
- Block the old URLs with robots.txt?
- Continue to catch the old URLs, then redirect to the new ones? Users navigating the site would never hit the redirects, as the URLs are updated throughout the new version of the site. I see it as garbage code - unless it could be handled by some fancy routing (see the sketch after this list)?
- Other?
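For the redirect option, one low-friction approach is to keep a thin action (or a dedicated legacy controller) at each old URL that issues a permanent (301) redirect to the new location; a 301 tells crawlers to replace the old URL with the new one in their index instead of retrying it. Below is a minimal sketch, assuming a hypothetical action that moved from an OrdersController to a new AccountController - the names and routes are placeholders, not taken from the original application:

    using System.Web.Mvc;

    // Old controller, kept only so the old URLs keep resolving.
    public class OrdersController : Controller
    {
        // Old URL: /Orders/History/42  ->  301 redirect to /Account/OrderHistory/42
        public ActionResult History(int id)
        {
            // A permanent redirect lets search engines transfer the old
            // URL's standing to the new address and drop the old entry.
            return RedirectToActionPermanent("OrderHistory", "Account", new { id });
        }
    }

This keeps the redirect plumbing isolated from the new controllers, and RedirectToActionPermanent (available from ASP.NET MVC 2 onwards) saves you from building the 301 response by hand.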
 
As always, all comments welcome...
Thanks, Anders, Denmark