Importing Wikipedia database dump kills Navicat - anyone got any ideas?

Posted by Ali on Stack Overflow
Published on 2009-05-14T10:26:38Z

Filed under: wikipedia | mysql

OK guys, I've downloaded the Wikipedia XML dump and it's a whopping 12 GB of data :\ for one table. I wanted to import it into a MySQL database on my localhost - however, it's a humongous 12 GB file, and Navicat is obviously taking its sweet time importing it, or more likely it has hung :(.
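For reference, the first thing I'm going to try is the mysql command-line client instead of Navicat - a minimal sketch, assuming the file is one of the per-table .sql dumps (the names wikidb and pagelinks.sql are placeholders; the pages XML dump would first need converting to SQL, e.g. with a tool like mwdumper):

    # Stream the dump straight into MySQL, bypassing Navicat entirely;
    # the client reads the file incrementally, so all 12 GB never sit in memory.
    mysql -u root -p wikidb < pagelinks.sql

    # Same import, but with a progress bar via pv (if installed):
    pv pagelinks.sql | mysql -u root -p wikidb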

Is there a way to import this dump, or at least do it partially - you know, bit by bit?
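One bit-by-bit approach I'm considering, sketched below with placeholder names - note that splitting on line count is only safe if every INSERT statement sits on a single line (the per-table dumps generally use one long extended INSERT per line, but check yours first, and keep the CREATE TABLE header in the first chunk):

    # Cut the dump into 1,000-line pieces named chunk_aa, chunk_ab, ...
    split -l 1000 pagelinks.sql chunk_

    # Import the pieces one at a time; if one fails, it can be retried alone.
    for f in chunk_*; do
        mysql -u root -p'secret' wikidb < "$f" || break
    done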


Edit: let me correct that - it's 21 GB of data, not that it helps :\. Does anyone have any idea how to import humongous files like this into a MySQL database?
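In case it helps anyone suggest a fix, this is the sort of session-level tuning I've seen recommended for bulk loads - a sketch only, with placeholder path and database name:

    # Relax per-row checks for the duration of the import, then restore them.
    # These SETs are per-session, so nothing changes for other connections.
    mysql -u root -p wikidb <<'EOF'
    SET autocommit = 0;
    SET unique_checks = 0;
    SET foreign_key_checks = 0;
    SOURCE /path/to/pagelinks.sql;
    COMMIT;
    SET unique_checks = 1;
    SET foreign_key_checks = 1;
    EOF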
