Should PostgreSQL connections be pooled in a Python web app, or should a new connection be created per request?
PgBouncer is pretty neat and transparent to the application and server.
We have been using PgBouncer in production for 2 years without a single issue. It's a pretty awesome PostgreSQL connection pooler.
http://wiki.postgresql.org/wiki/PgBouncer
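To illustrate how transparent PgBouncer is to the application: the only app-side change is pointing the connection string at PgBouncer's listen port (6432 by default) instead of PostgreSQL's 5432. A minimal sketch — the `make_dsn` helper and the host/credentials below are placeholders, not part of PgBouncer itself:

```python
# Hypothetical helper that builds a libpq-style DSN string. Moving an app
# from a direct PostgreSQL connection to PgBouncer is just a port change.
def make_dsn(host, port, dbname, user, password):
    return (f"dbname='{dbname}' user='{user}' "
            f"host='{host}' port={port} password='{password}'")

# Direct to PostgreSQL:
direct_dsn = make_dsn("localhost", 5432, "dbname", "postgres", "postgres")

# Through PgBouncer (default listen port 6432) -- every other line of the
# app, including the psycopg2.connect(...) call sites, stays untouched:
pooled_dsn = make_dsn("localhost", 6432, "dbname", "postgres", "postgres")
```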
Yes, connection pooling will help, but you still have to work out the right pool size (the number of live connections) for the load your database will actually see. Here's a small pooling example with psycopg2:
from contextlib import contextmanager

from psycopg2.pool import SimpleConnectionPool

db_dsn = "dbname='dbname' user='postgres' host='localhost' password='postgres'"

# Pool with a minimum of 1 and a maximum of 10 live connections
connection_pool = SimpleConnectionPool(1, 10, dsn=db_dsn)

@contextmanager
def get_cursor():
    con = connection_pool.getconn()
    try:
        yield con.cursor()
    finally:
        connection_pool.putconn(con)

def main_work():
    try:
        # The context manager returns the connection to the pool when done
        with get_cursor() as cur:
            cur.execute('select * from "TableName"')
            result_set = cur.fetchall()
    except Exception as e:
        print("error in executing query:", e)
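As for finding the "right numbers" for the pool size: a common starting point is the rule of thumb from the PostgreSQL wiki, connections ≈ (core count × 2) + effective spindle count. A tiny sketch — treat it as a heuristic to be tuned against real load, not a fixed answer:

```python
# Rough pool-sizing heuristic (cores * 2 + effective spindle count),
# as suggested on the PostgreSQL wiki; tune against measured load.
def suggested_pool_size(core_count, spindle_count=1):
    return core_count * 2 + spindle_count

print(suggested_pool_size(4))  # 4-core box with a single disk -> 9
```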
The answer depends on how many such requests your web app will handle, and how many of them concurrently. Connection pooling is usually the better choice if you expect hundreds or even thousands of users logged in at the same time. If this is only a side project with fewer than a few hundred users, you can probably get away without pooling.