By Jon Saints - 15 Oct 2014
I’ve spent some time using the method outlined below and found some problems with it. I now have a new preferred way of querying Postgres from Python and getting results back as dictionaries. You should use the new way in most cases.
Here are the problems with the previous method:
My new suggestion is to use psycopg2 together with the `with` statement, which ensures that transactions and cursors are cleaned up properly when they are no longer in use.
See also the psycopg2 documentation’s list of best practices.
There is also a `DictCursor`, which will give you results as Python dictionaries:
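A minimal sketch of this approach (the connection parameters and the `users` table are assumptions, borrowed from the SQLAlchemy example below — substitute your own):

```python
import psycopg2
import psycopg2.extras

# Connection parameters are illustrative -- substitute your own.
conn = psycopg2.connect("dbname=mydatabase user=scott password=tiger host=localhost")

try:
    # "with conn" commits the transaction on success (or rolls back on error);
    # "with ... as cur" closes the cursor when the block exits.
    # Note: "with conn" does NOT close the connection itself.
    with conn:
        with conn.cursor(cursor_factory=psycopg2.extras.DictCursor) as cur:
            cur.execute("select * from users where fname = %s", ('Jon',))
            for row in cur:
                print "username:", row['username']
finally:
    conn.close()
```

Because `with conn` only ends the transaction, you still need to close the connection yourself (or reuse it) when you are done.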
Deep in the SQLAlchemy docs, I found this gem: a simple way of querying a database from Python that returns query results as Python dictionaries.
First, set up a virtualenv for your project and install SQLAlchemy and psycopg2:
```shell
cd my_app
virtualenv venv
. venv/bin/activate
pip install SQLAlchemy
pip install psycopg2
```
Now, with just a few lines, you can run queries from your Python scripts and get the results back as Python dictionaries:
```python
from sqlalchemy import create_engine

engine = create_engine('postgresql://scott:tiger@localhost/mydatabase')

result = engine.execute("select * from users where fname=%s", ('Jon',))
for row in result:
    print "username:", row['username']
    print "email:", row['email']
    print "First Name:", row['fname']
```
There are a few advantages to doing things this way:
Note: To install psycopg2 on my Mac I had to first install Postgres.app and set my PATH in .bash_profile to
Note: If you are on Ubuntu, you will need to run `sudo apt-get install libpq-dev python-dev` before running `pip install psycopg2`.