Extract data from PostgreSQL DB without using pg_dump

Posted by John Horton on Stack Overflow, 2010-05-05.

There is a PostgreSQL database on which I have only limited access (e.g., I can't use pg_dump). I am trying to create a local "mirror" by exporting certain tables from the database. I do not have the permissions needed to dump a table as SQL from within psql. Right now, I have a Python script that iterates through my table names, selects all fields, and exports each table as a CSV:

import os  # needed for os.system

for table_name, file_name in zip(table_names, file_names):
    # \copy streams the table from the server as CSV; gzip compresses it locally.
    cmd = ('echo "\\\\copy (SELECT * FROM %s) TO STDOUT WITH CSV HEADER" '
           '| psql -d remote_db | gzip > ./%s/%s.gz' % (table_name, dir_name, file_name))
    os.system(cmd)

I would prefer not to use CSV if possible, as I lose the field types and the encoding can get garbled. The best option would be some way of getting the generating SQL for the table via \copy. Next best would be XML, ideally with some way of preserving the field types. If neither works, I think the final option might be two queries: one to get the field data types, the other to get the actual data.
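A minimal sketch of that last "two queries" option, assuming the same psql pipeline as above: the column types can be read from the standard information_schema.columns view, and the data pulled with a plain SELECT. The function names and the example table name here are illustrative, not from the original post.

```python
def type_query(table_name, schema="public"):
    """SQL returning each column's name and data type for a table,
    in declaration order, from information_schema."""
    return ("SELECT column_name, data_type "
            "FROM information_schema.columns "
            "WHERE table_schema = '%s' AND table_name = '%s' "
            "ORDER BY ordinal_position" % (schema, table_name))

def data_query(table_name):
    """SQL returning every row of the table."""
    return "SELECT * FROM %s" % table_name

# Both strings can be fed through the same echo | psql | gzip pipeline,
# e.g. wrapped in \copy (...) TO STDOUT WITH CSV HEADER.
print(type_query("my_table"))
print(data_query("my_table"))
```

The type metadata only needs to be exported once per table, so it could be saved alongside each CSV and used to rebuild the schema locally.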

Any thoughts or advice would be greatly appreciated - thanks!
