Top techniques to avoid 'data scraping' from a website database

Posted by Addsy on Stack Overflow, 2010-01-14

Filed under: database | web-development

I am setting up a site using PHP and MySQL that is essentially just a web front-end to an existing database. Understandably, my client is very keen to prevent anyone from being able to make a copy of the data in the database, yet at the same time wants everything publicly available and even a "view all" link to display every record in the db.
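
For concreteness, here is a minimal sketch of the kind of "view all" page described; the connection details and the `records` table and column names are hypothetical stand-ins for the real schema:

```php
<?php
// Minimal sketch of a "view all" page, assuming a hypothetical `records` table.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Fetch every record and render it as plain HTML table rows.
$stmt = $pdo->query('SELECT id, name, details FROM records ORDER BY id');

echo '<table>';
foreach ($stmt as $row) {
    echo '<tr>'
       . '<td>' . htmlspecialchars($row['id']) . '</td>'
       . '<td>' . htmlspecialchars($row['name']) . '</td>'
       . '<td>' . htmlspecialchars($row['details']) . '</td>'
       . '</tr>';
}
echo '</table>';
```

Anything rendered this way is trivially parseable, which is exactly the concern below.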

Whilst I have put everything in place to prevent attacks such as SQL injection, there is nothing to prevent anyone from viewing all the records as HTML and running some sort of script to parse this data back into another database. Even if I were to remove the "view all" link, someone could still, in theory, use an automated process to go through each record one by one and compile them into a new database, essentially pinching all the information.
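
To illustrate the distinction: the per-record lookup can be hardened against SQL injection with a prepared statement (sketch below, again assuming the hypothetical `records` table and an `id` query parameter), but that does nothing to stop a scraper simply iterating over valid ids:

```php
<?php
// Sketch of a per-record page hardened against SQL injection via a bound parameter.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// The bound parameter keeps user input out of the SQL text entirely,
// but a scraper can still request id=1, id=2, id=3, ... and harvest everything.
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$stmt = $pdo->prepare('SELECT id, name, details FROM records WHERE id = :id');
$stmt->execute(array(':id' => $id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row) {
    echo htmlspecialchars($row['name']);
}
```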

Does anyone have any good tactics for preventing, or even just deterring, this that they could share?

Thanks

