How to get search engines to properly index an AJAX-driven search page

Posted by Redtopia on Pro Webmasters, 2012-10-22.
I have an ajax-driven search page that will allow users to search through a large collection of records. Each search result points to index.php?id=xyz (where xyz is the id of the record). The initial view does not have any records listed, and there is no interface that allows you to browse through all records. You can only conduct a search.

How do I build the page so that spiders can crawl each record? Or is there another way (outside of this specific search page) to point spiders to a list of all the records?

FYI, the collection is rather large, so dumping links to every record in a single request is not workable; the listing will have to be split across multiple requests.

Each record can be viewed via a single page (e.g. "record.php?id=xyz"). I would like all of the records indexed, but without indexing anything from the sitemap-style pages that link to them, which would look something like this:

<a href="/result.php?id=record1">Record 1</a>
<a href="/result.php?id=record2">Record 2</a>
<a href="/result.php?id=record3">Record 3</a>

<a href="/seo.php?page=2">next</a>

Assuming this is the correct approach, I have these questions:

  1. How would the search engines find the crawl page? (See the sitemap sketch after this list.)

  2. Is it possible to prevent search engines from indexing the words "Record 1", etc., and "next"? Can I output only the links, or do something along those lines? (See the meta-robots sketch after this list.)
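
For question 1, one common way to let crawlers discover the first listing page (when nothing on the site links to it) is to reference it from an XML sitemap, submitted through the search engines' webmaster tools or advertised with a Sitemap: line in robots.txt. A minimal sketch, where www.example.com and the seo.php name are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap.xml: list the listing pages here (or the record
     URLs themselves, split across several sitemap files if the
     50,000-URL-per-file limit is exceeded). -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/seo.php?page=1</loc></url>
  <url><loc>http://www.example.com/seo.php?page=2</loc></url>
</urlset>

A plain HTML link to /seo.php?page=1 from a page that is already crawled (a footer link, say) also works.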
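
For question 2, there is no per-word control, but the standard way to keep the listing pages' own text ("Record 1", "next", and so on) out of the index, while still letting their links be followed, is a robots meta tag (or the equivalent X-Robots-Tag HTTP header) on each listing page. A minimal sketch:

<!-- In the <head> of each seo.php page: keep the page itself out of the
     index, but still let its links to the record pages be followed. -->
<meta name="robots" content="noindex, follow">

The record pages themselves carry no such tag, so they are the only URLs that end up indexed.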
